I've been using computers (started with a Kaypro II) for 26 years now, and
something I've almost never seen discussed is just what the point is of
further refining them and the programs that run on them. Once
a program/computer gets to the point that you can do a task almost
instantaneously, why "improve" it any further, seeing as how learning a "new"
program/computer almost always involves a steep learning curve that quickly
erases any minor improvement you may see in your final product? I guess you
could call me a modern day Luddite, but it gets to be a major league pain in
the arse keeping up with the "improvements" when the only thing improved
substantially that I can see is the bottom line of the software developer or
computer manufacturer.