Computer evolution: And we continue to wait.
Let's take a short trip down technology lane and look back at "what was" back then.
For me, it all started with a C64. My dad worked for a shipping and forwarding company that was moving from punched-card (IBM card) systems to a new form of computing. The company offered its employees the chance to purchase a Commodore 64 and use it at home, and so he brought home the first computer I ever worked with.
One of the most notable things was that once you got it to display all of the files on one of the floppies, you needed a certain amount of patience to get it to actually start the program. There was no such thing as switching on your computer and just starting to work. The following (or something similar) was typed more than once in the life of a C64 user:
LOAD "$",8
LIST
LOAD "TOM AND JERRY",8,1
<get coffee>
<drink coffee>
<flip over the floppy>
<wait some more>
RUN
And then you would be ready to start playing, or perhaps even start working. This is just one example that I am quite familiar with. Waiting was probably not what Alan Turing had in mind when he described his Turing machine, but it somehow started with the ENIAC, survived the invention of the transistor and its packing into integrated circuits, and has followed us from the Intel 4004 past more modern systems like the Intel 80486, which integrated the FPU, all the way to today's AMD Athlon and Intel Xeon processors.
Now, one would say that we have an immense amount of computational power, and I can't do anything but agree with you. If you take a closer look at Moore's law and plot it out, you can see that our computational power has made a great leap. And that's absolutely great!
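To make the leap concrete, here is a minimal back-of-the-envelope sketch of Moore's law: transistor counts doubling roughly every two years. The starting figures are assumptions for illustration only (the Intel 4004 is commonly cited as having about 2,300 transistors in 1971).

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count for a given year, assuming a clean
    doubling every `doubling_years` years from an assumed baseline."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

if __name__ == "__main__":
    # Print rough projections for a few milestone years.
    for year in (1971, 1985, 2000, 2009):
        print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Even this toy projection lands in the billions by 2009, which is roughly where real high-end chips were; the point is the shape of the curve, not the exact numbers.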
So by now we have so much computational power that we don't have to wait anymore when we want to do something, right? That must have changed since the days of the C64?
That's what you would probably expect, right? It's true, we don't have to insert floppies anymore; instead we use different media with a much higher data density. And we can run programs that are far bigger and more complex than anything we imagined in the days of the Intel 4004.
But we continue to wait. How many people do you know at the office who switch on their computer and then get a cup of coffee while the operating system loads or the programs they work with start? I know loads of them. Take the company I work for: it sells a huge piece of software that you install on your servers, yet we still spend quite a while waiting whenever we work with it, and it's usually not the system that is waiting for user input.
The fact is that we continue to find new things that we can actually compute. The more computational power we have, the more we try to compute. New technologies like nanocomputers won't change anything there. We will have a short-lived feeling of "wow, this is really fast!", and then compute more and lose the (feeling of) speed.
In short: the computer continues to evolve, and we continue to wait.
We want to compute more and more data, so the market brings out hardware that can do it faster. Look at some of the newer games and software that won't run smoothly on ANY of today's reasonable hardware.
If we need to compute some giant amount of data, humanity creates faster hardware.
Once the faster hardware is there, people work out new things to compute that take just as long or even longer, starting the cycle all over again.
It just depends on where in the cycle you look.
Living with computers was much more relaxed than it is nowadays.
Now everything has to go ultrafast, and stress levels are higher than ever...
The 4004 was indeed the first Intel chip, but it was never used in consumer products. The first Intel processor found in a consumer computer was the 8086 (which followed on from the 8085, 8080 and 8008). The 8080 was basically the first real microprocessor.
Also, you forgot (I think) that there were a few more CPUs around before the first Intel one ever made it into a "computer": the Motorola 6800 (1974), Microchip PIC16X (1975), MOS Technology 6502 (1975), Zilog Z80 (1976), and Motorola 6809 (1977), to give a few examples (a 6502 derivative was even in your C64).
The world is bigger than Intel. For the rest, the blog is fun, and you were lucky with your floppy drive. I remember starting up my Tandy TRS-80 CoCo 2 and loading from tape. Try loading a decent game that way; it takes forever.
I picked the 4004 because it started the "revolution" we now see in our systems, and it is basically one of the true grandfathers of the CPUs we use nowadays.
I know about the other processors, but they were (imo) variations on a certain theme. The 4004 offered the first "spark" toward our current generation of CPUs, though it probably would not have existed without the other procs. I just tried to give a very small timeline; a bigger one might be fun to dive into for a future blog post.
And thanks for the compliment.