A List of Computing Predictions
I, like many technophiles, maintain an interest in a wide range of hardware and software developments. Recently, however, I've noted a number of technological developments which I'm convinced will turn out to be dead ends. With that in mind, I'd like to present a list of my own computing predictions.
- There will be such a drastic increase in computing power over the next five years that the software industry will be unable to keep up.
With the advent of multi-core processors, personal computing no longer strives to do one thing faster than before, but many things at once. Symmetric multiprocessing and multi-threaded applications have existed for a long time in technical computing and supercomputing, but these developments have only slowly filtered down to personal computers and their operating systems. With quad-core processors on the commercial market, and octa-core processors on the way, PC hardware finally has the chance to offer useful multiprocessing capabilities.
That is, if programmers can apply their skills to multi-threaded applications. We're seeing problems already: computer games have increasingly protracted development periods and operating systems are blooming out of control with bloat, yet the multi-core processor remains largely unexploited by consumer applications. Writing correct multi-threaded software is difficult, and it will likely stretch games developers' already protracted schedules even further.
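To give a feel for what's involved, here's a minimal sketch in Python of the easy case: independent chunks of work fanned out across every core with a process pool. The chunk function and sizes are made up purely for illustration; real consumer software, and games in particular, rarely decomposes this cleanly, because its tasks share mutable state that has to be locked and synchronised, and that coordination is exactly where the extra development time goes.

```python
# A minimal sketch of fanning independent work out across CPU cores with
# Python's standard-library multiprocessing module. The work function and
# chunk counts are placeholders; real applications are harder because their
# tasks share mutable state and must be synchronised.
from multiprocessing import Pool, cpu_count

def simulate_chunk(chunk_id):
    # Stand-in for an independent slice of work (physics, AI, asset decoding).
    total = 0
    for i in range(1_000_000):
        total += (i * chunk_id) % 7
    return total

if __name__ == "__main__":
    cores = cpu_count()
    with Pool(processes=cores) as pool:
        # Each chunk runs in its own process, so all cores can be busy at once.
        results = pool.map(simulate_chunk, range(cores * 4))
    print(f"{cores} cores, {len(results)} chunks, checksum {sum(results)}")
```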
Another area which presents potential problems is increasing hardware miniaturisation. As technology gets smaller, people will logically try to apply it to smaller devices, including portable music players, mobile phones and handheld consoles. While I'm highly optimistic about having more power in the palm of my hand, and greatly enjoy using my current smartphone, increasing complexity in these devices has rarely been taken well. We already have massive problems with the Luddites going, "Why can't a phone just be a phone?" (I strongly object to that opinion, and disagree vehemently on principle, BTW), and with people finding it difficult to navigate interfaces on portable devices. While this is improving, with clearer interfaces and bigger screens than before, including those on hybrid slider and touch-screen phones, there's a long way to go before these devices are appreciated in the same way as a PC. The problem is on the software side, not the hardware side, and that's what is going to have to improve.
- Cloud computing will not catch on within the next ten years, and will remain a niche application.
Ah, cloud computing. I've heard so much about this, with applications like Google Docs presenting office software over the internet. It's completely overrated. You know, it reminds me of something I've read about, something that was beginning to die out about the time I was born: a concept called time-sharing.
You see, back in the 1950s and 1960s, when electronic computers really started to come into their own, computers were hugely expensive devices, only within the financial reach of scientists, universities, businesses, governments and military organisations. They were crude, often accepting their input through Hollerith punched cards and front-panel switches, and later through mechanical teletypes, which were loud, clattering machines vaguely resembling a typewriter. The problem was that most of these control mechanisms only allowed one person to use the computer at a time, and so the idea of time-sharing was devised. With a time-sharing operating system, several terminals could be connected to a single computer at once, and the system would divide processing time between the users, switching between their jobs so rapidly that each appeared to have the machine to themselves. This persisted throughout the 1960s and 1970s on increasingly powerful mainframes and minicomputers, to the point where some of these machines could support hundreds of people at a time.
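The scheduling trick at the heart of it is simple enough to sketch. Here's a toy round-robin simulation in Python, purely illustrative (real time-sharing systems were interrupt-driven and vastly more sophisticated), in which each user's job gets a fixed quantum of processor time in turn until it finishes:

```python
# Toy round-robin time-sharing simulation: each "user job" needs some amount
# of CPU time, and the scheduler hands out fixed quanta in turn, so every
# terminal appears to make steady progress on the one shared machine.
from collections import deque

QUANTUM = 2  # arbitrary time units per turn

def run_round_robin(jobs):
    """jobs: dict of user name -> total time units required."""
    queue = deque(jobs.items())
    clock = 0
    while queue:
        user, remaining = queue.popleft()
        slice_used = min(QUANTUM, remaining)
        clock += slice_used
        remaining -= slice_used
        print(f"t={clock:>3}: ran {user} for {slice_used} unit(s), {remaining} left")
        if remaining > 0:
            queue.append((user, remaining))  # back of the queue for another turn

if __name__ == "__main__":
    run_round_robin({"alice": 5, "bob": 3, "carol": 7})
```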
Then, during the late 1970s and 1980s, there was a development which drastically changed the face of computing. The personal computer meant that people no longer had to rely on a massive centralised minicomputer or mainframe for many of their applications, and time-sharing began to die out as the PC grew more powerful. The whole point of the personal computer was to get people away from the concept of a massive centralised computing facility.
And therein lie my objections to cloud computing. Right now, the computer I'm typing on has more power than the fastest 1980s supercomputer. My computer at home would have held a place on the TOP500 list well into the mid-1990s. Why then, when we have computers that can do so much, would we willingly move ourselves metaphorically back in time to the idea of a centralised application server? It's not as if most of the consumer-targeted programs are any faster than the ones we have on our home computers. Indeed, because many of them run as interpreted code inside the browser, and because our internet connections are generally so woefully inadequate for a fully-featured office suite, these applications tend to be slower!
Now, I can certainly understand the idea of online archiving, although I still think it would be more logical to carry around a USB stick. But I do not understand why you'd want to move most of your workload to a slow, inadequate office suite or imaging program, and therefore I must conclude that cloud computing will not find success outside of niche applications, and that it won't even catch on for those within ten years. People are making too much of a technology which was effectively rendered obsolete more than twenty years ago.
- There will be no input technology within the next ten years that will displace the keyboard and mouse.
We've all seen the success of the Wii, with its technically inferior but still groundbreaking motion controls, and we've seen massive success for touch-screen phones, despite the notable inadequacies of the iPhone and many of its competitors. With all this buzz around new input methods, it would be easy to presume that we'll soon have input devices which displace the ones we're all using right now.
I'm not so convinced.
You see, people have been predicting virtual reality and new input methods for years, and yet devices developed several decades ago are still going strong. The mouse was first developed in the 1960s, and the keyboard can trace its lineage back to the mid-1800s, via visual display terminals and mechanical teleprinters. The fact remains that there is no mainstream technology faster for entering text than the keyboard. Specialist stenographic keyboards are quicker, but they still operate on many of the same principles as a typewriter's keyboard. The mouse, too, has advantages which are hard to ignore: it has the sort of sensitivity, accuracy and precision which motion controls and touch screens would kill for, were they personified.
I have mobile devices with both a QWERTY-style keypad and an on-screen touch-sensitive keyboard. When it comes to entering data quickly, the keypad completely destroys the touch screen in a speed test, and that's against a more precise stylus-based resistive touch screen as well. I'd absolutely loathe to type an entire review on the iPhone, though I've actually done exactly that on my Nokia E71's keypad.
There are other reasons why I feel that touch screens aren't going to displace the keyboard. Using a touch screen with your finger or thumb feels even worse than using a chiclet-style keyboard, of the type so derided when it appeared on the IBM PCjr and ZX Spectrum. There are reasons why people still buy typewriters for hard-copy writing, and why people spend over $100 on twenty-year-old IBM Model M keyboards: the superior tactile feel of these devices, and the audible response the keys make when they're successfully depressed. That tactile feedback is almost completely eliminated when you try to engage a touch screen with your finger.
I think, once again, that people are missing the big picture, and that's why I'm predicting that I'll still be considered normal for using my keyboard on my computer in 2020.
- Social networking is a fad and its use shall have sharply declined in three years' time.
And finally, we move on to my most controversial prediction. I don't like social networking. Not one bit - I've even considered writing an essay entitled "Social Networking Considered Harmful". You see, I reckon it's a fad, just like all the other fads I've grown up with. When I speak to people at college, I don't find many who actually want to engage with technology on any level beyond the internet and office software. I don't find many who even want to use Photoshop, or who admit to using it to aid their fiction writing. Perhaps I'm talking to the wrong people, but these don't seem like people who are particularly interested in computers, and that makes me inclined to believe that they're not going to continue using computers in the way they do now.
The problem is that internet communication is often devoid of any real meaning. The limitations of text in e-mails and electronic messages strip out most of a message's emotion, which is perfect for business e-mails and acceptable in personal ones, but not so adequate when it comes to expressing yourself on a social level. Perhaps that's why, when you look at a Bebo or MySpace page, it generally looks like a GeoCities page circa 1998. Clashing colours, poorly chosen backgrounds and hideous spelling and grammar give the impression of something that's been utterly hacked together. There's a reason why the web pages of sites like Google, and even the W3C, maintainers of the World Wide Web standards, stick to minimalist designs. Well-designed corporate websites stick to clean designs. Social networking pages, however, do not.
And that's even before we get into the actual content of the pages. It strikes me as quite frightening that this is the impression that some people actually want to create of themselves. That these pages are checked by companies for background information is even more frightening. When your pages don't exactly give the impression that you are even fully literate, let alone a potentially intelligent and creative employee, you are well and truly fornicated up the rectal cavity.
As somebody who's never signed up to a social networking service, I find it very difficult to make sense of the snippets of information I hear from the news, often distorted as the newscasters fail to understand the true nature of the technology. Still, I find it impossible to understand why people spend their time virtually befriending others they intend to have no contact with at all, even though I spend my own time writing reviews, articles and technological rants for people I've never met. Maybe there's some sort of leap of faith that has to be made, but I'm still convinced that social networking will be a phenomenon on life support within three years, much as the dot-com darlings were after the bubble burst.
"When conversing about oligarchy, you must consider this point: Many children are taught from an early age that monarchies are primarily benevolent, and are given very little to discourage this opinion until at least adolesence. Even more prevalent are stories about the supposed benevolence of heirs to the throne, symbolising a cycle which they are taught is good."
In response to your first point, I just wanted to mention another prediction: 3D technology will improve in the next 10 years and become slightly less expensive, possibly coming within reach of the average consumer.
With that done, video games will become 3D, much like the hologram of Princess Leia in Star Wars that Luke (?) found in R2-D2.
Re: A List of Computing Predictions
RAK wrote: - There will be such a drastic increase in computing power over the next five years that the software industry will be unable to keep up.
Ignoring of course large scale database applications, in which case hardware may eventually cease being a ridiculous bottleneck.
Or better yet STFU noob and go sage something attainable.
I miss the good ol' USSA.
I predict that in the next seven years, computers will be able to outwit house-cats.
I have returned! (again)
FIGHT ME!
How about this:
- The constant counterfeiting of paper bills will eventually lead to all money being handled as electronic transactions. Given enough time, hackers will be able to modify the amount credited to their own accounts, as well as to other accounts. This widespread breakdown in security will lead to chaos, and possibly crucifixion of the most blatant offenders.
MySpace, Facebook, etc. remind me of the old "scroll" websites of the early/mid '90s.
It's not about the informational quality.
At least in the intellectual sense.
People threw shit up back then because it was new and fun.
Most got bored.
It's different now: consumers have grown up with the technology and have internalized it in a way that conforms to their view of what technology should do for them.
It's about coming full circle back to the old tribal ways of constant social (or, in this case, virtual-social) interaction.
Technology counteracting the isolation created by technology.
The people who use this stuff are not concerned about the literary merit of their personal online pastiche.
They are repurposing the technology of geeks to replicate the interpersonal reality that they experience without technology in their ordinary lives.
It's technology for them.
The Great Unwashed.
PiP wrote: is this some copypasta? Anyhow I got headache so tl;dr
I've sort of adopted the whole tl;dr approach a lot recently. I just end up infodumping about everything. Count yourselves lucky that I haven't posted a topic on space warfare. Now, that would be atrociously long.
"When conversing about oligarchy, you must consider this point: Many children are taught from an early age that monarchies are primarily benevolent, and are given very little to discourage this opinion until at least adolesence. Even more prevalent are stories about the supposed benevolence of heirs to the throne, symbolising a cycle which they are taught is good."
RAK wrote: Count yourselves lucky that I haven't posted a topic on space warfare. Now, that would be atrociously long.
http://www.projectrho.com/rocket/index.html
Pretty much covers most aspects.
MadBill wrote: RAK wrote: Count yourselves lucky that I haven't posted a topic on space warfare. Now, that would be atrociously long.
http://www.projectrho.com/rocket/index.html
Pretty much covers most aspects.
Indeed. Which is why I cited it when I wrote a gargantuan essay on the whole "why big fuck-off space dreadnoughts won't work, and why nukes don't work properly either" topic. They always seem to be shooting shiny red lasers from a shorter distance than WWI warships engaged at, and crewed with enough people to rival most real-world armies.
"When conversing about oligarchy, you must consider this point: Many children are taught from an early age that monarchies are primarily benevolent, and are given very little to discourage this opinion until at least adolesence. Even more prevalent are stories about the supposed benevolence of heirs to the throne, symbolising a cycle which they are taught is good."
In the future there will be a cloud-computing Alternet run by nodes of graphics cards (used as multicore processor arrays instead of for graphics) bundled with miniservers/flash-stick aggregates.
The connections in the network would be any wireless devices in range, plus the regular internet and phone system. (if naughty, add Ham radio, satlinks, etc.)
It could operate as a shadow internet, routing past censorship, power outages, broken cables, etc.
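Just to make the idea concrete, here's a minimal sketch of that "first link that works" fallback, with hypothetical link names and a coin-flip stand-in for checking whether a link is up; a real mesh would need genuine routing rather than this naive ordering:

```python
# Naive "shadow internet" link fallback: try each available transport in order
# of preference and settle for the first one that can carry the message.
# The link names and availability check here are hypothetical placeholders.
import random

PREFERRED_LINKS = ["local_wifi_mesh", "regular_internet", "phone_network", "ham_radio"]

def link_available(link):
    # Placeholder: in reality you'd probe the interface; here it's a coin flip.
    return random.random() > 0.5

def send_message(payload):
    for link in PREFERRED_LINKS:
        if link_available(link):
            print(f"sending {len(payload)} bytes via {link}")
            return link
    raise RuntimeError("no usable link in range")

if __name__ == "__main__":
    send_message(b"hello from the Alternet")
```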