Partitioning of processing power

I’ve been wondering a lot about the practical implications of the partitioning of processing power on the PC.

I’ve been dutifully buying machines with faster and faster Intel processors (I even bought a dual-processor machine), and I have to say the results have been underwhelming. All that processing power just seems to get sopped up by Windows, Office, and all the taskbar cruft that programs and OEMs install. As I have doubled and tripled the power available to me, my life hasn’t gotten much better.

On the other hand, I’ve also been adding processing power even faster to my video subsystem as I’ve kept up with the NVIDIA and ATI product leapfrogs. And I have to say, my computing experience has gotten visibly, materially better over the last 24 months; games are looking better by the month.

And so I have started to wonder about the “right” model for partitioning processing power on the PC platform. I am not wondering in a computer science sense; I am wondering at a pragmatic, social level. I wonder if creating a pool of dedicated processing power, not accessible to general-purpose programs, is exactly what has let games move ahead so fast. Keeping that power walled off from the general-use pool, away from all the general-purpose cruft on my machine, may be the real reason games have been able to take off. If all that game processing power had been intermingled with the general-purpose CPU, would we have seen the same rate of improvement?
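To make the intuition concrete, here’s a toy model of the argument. The numbers are entirely made up and this is not a benchmark or a real scheduler; it just shows the shape of the claim: when background cruft shares the game’s pool, it steals cycles every frame, and when the game’s pool is dedicated (think GPU), the cruft can’t touch it.

```c
/* Toy model of the partitioning argument. Illustrative numbers only;
 * nothing here corresponds to a real CPU, GPU, or scheduler. */
#include <stdio.h>

#define POOL_CYCLES 1000 /* cycles available per frame in one pool */
#define CRUFT_TASKS 8    /* background taskbar/OEM processes */
#define CRUFT_COST  60   /* cycles each cruft task steals per frame */

int main(void) {
    /* Shared model: the game competes with the cruft for one CPU pool. */
    int shared_left = POOL_CYCLES - CRUFT_TASKS * CRUFT_COST;
    if (shared_left < 0)
        shared_left = 0;

    /* Partitioned model: the cruft stays on the CPU pool; the game gets
     * a dedicated pool that general-purpose code cannot reach. */
    int dedicated_left = POOL_CYCLES;

    printf("shared pool:    game gets %d of %d cycles per frame\n",
           shared_left, POOL_CYCLES);
    printf("dedicated pool: game gets %d of %d cycles per frame\n",
           dedicated_left, POOL_CYCLES);
    return 0;
}
```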

I wonder if we wouldn’t be served by more special-purpose co-processors. A network/security processor where all the code for firewalling, virus checking, intrusion detection, compression, and the like could run. A reliability coprocessor where dedicated code could be backing up data, examining the system for failures, predicting problems, and so on.
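If such a network/security coprocessor existed, the host-side interface might look something like the sketch below. To be clear, every name here is hypothetical and the “coprocessor” is stubbed out in plain C; the point is only that packets could be vetted by dedicated hardware before the general-purpose side ever sees them.

```c
/* Hypothetical host-side interface to a network/security coprocessor.
 * No real hardware or driver API is being described; the names and the
 * stub implementation are invented purely for illustration. */
#include <stdio.h>
#include <stddef.h>
#include <string.h>

typedef enum { SEC_OK, SEC_BLOCKED } sec_verdict;

/* Ask the (imaginary) coprocessor to scan a packet before the host
 * handles it. Stubbed here: block anything containing a known-bad
 * byte signature. */
static sec_verdict secproc_scan_packet(const unsigned char *pkt, size_t len) {
    static const unsigned char bad_sig[] = { 0xDE, 0xAD, 0xBE, 0xEF };
    for (size_t i = 0; i + sizeof bad_sig <= len; i++) {
        if (memcmp(pkt + i, bad_sig, sizeof bad_sig) == 0)
            return SEC_BLOCKED;
    }
    return SEC_OK;
}

int main(void) {
    unsigned char clean[] = { 0x01, 0x02, 0x03, 0x04 };
    unsigned char dirty[] = { 0x00, 0xDE, 0xAD, 0xBE, 0xEF, 0x00 };

    printf("clean packet: %s\n",
           secproc_scan_packet(clean, sizeof clean) == SEC_OK ? "pass" : "blocked");
    printf("dirty packet: %s\n",
           secproc_scan_packet(dirty, sizeof dirty) == SEC_OK ? "pass" : "blocked");
    return 0;
}
```

The appeal of the partitioned version is the same as with graphics: the scanning code gets its own cycles no matter how much cruft is running on the main CPU, and general-purpose programs can’t tamper with it.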

Certainly not a traditional view of how to evolve the PC, but I wonder.