We all had some laughs when the two-bladed razor was
improved with the
addition of a third blade. Late-night comedians joked about
razors coming with four or even five blades.
Then, we all had a few more chuckles when the four-bladed Schick
Quattro actually made it to market, soon followed by the five-bladed
Gillette Fusion. This led to speculation about the natural
end-point:
[Image: extrapolation of razor-blade counts over time, from http://agrumer.livejournal.com/414194.html]
The same thing is happening inside our computers. For the
first couple of decades, they had one video card. Then two,
then three, and of course four, at least on the bench.
Need I say that this is getting to be ridiculous?
NVIDIA
reports that for a two-card system with their leading product
(two 8800GTX cards), you need a lot of power:
For the NVIDIA GeForce 8800 series, the power
requirements to run in single card mode aren't all that much different
than running any other high-end card. NVIDIA recommends a 450W power
supply for GTX-based cards or a 400W power supply for GTS-based cards.
For the 8800GTX card you'll require 2 available PCI-E power connectors,
the 8800GTS only requires 1. If you're a hearty soul who likes to live
life to the fullest and can afford to super-size it by running dual
8800 cards, you're going to need to find a power supply in the 850-1000
watt range. This is roughly the equivalent of running an extra
refrigerator in your house.
So a four-card system will be like running an extra
refrigerator AND
having an extra teenager in your house.
What is far more interesting, not to mention practical, is the
experimental system they concocted at Tom's Hardware recently
(http://www.tomshardware.com/2007/09/13/hardware_components/index.html).
This is a system designed to run off power from solar panels.
The cost was about $1000, for the computer itself.
(Solar panels are extra.)
If that seems like a high price, consider that a single NVIDIA 8800GTX
will set you back $550 or more.
The power required for this thing is a mere 61 watts
at peak. It has an AMD Athlon 64 X2 BE-2350 CPU, 2GB RAM, and
uses onboard graphics from the Gigabyte GA-MA69GM-S2H motherboard.
They are all off-the-shelf components, although the power supply
(http://www.tomshardware.com/2007/09/13/hardware_components/page3.html)
is a specialty item, since it runs on DC.
That 61 watt
power requirement includes the monitor, by the way.
The monitor accounts for 23 watts.
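To put the article's numbers side by side, here's a back-of-the-envelope comparison. The 61 W and 23 W figures are from the article; the dual-8800 wattage (midpoint of the 850-1000 W recommendation), the hours of daily use, and the electricity rate are my own assumptions for illustration:

```python
# Back-of-the-envelope power comparison.
# From the article: 61 W total at peak, of which the monitor is 23 W.
# Assumed (not from the article): dual-8800 rig at 925 W (midpoint of
# the 850-1000 W recommendation), 4 hours/day of use, $0.10 per kWh.

solar_pc_total_w = 61
monitor_w = 23
pc_only_w = solar_pc_total_w - monitor_w  # 38 W for the box itself

dual_8800_w = (850 + 1000) / 2  # midpoint of the recommended PSU range

hours_per_day = 4
usd_per_kwh = 0.10

def annual_cost(watts):
    """Rough annual electricity cost in dollars."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(f"Solar PC (with monitor): {solar_pc_total_w} W, "
      f"~${annual_cost(solar_pc_total_w):.2f}/yr")
print(f"Dual-8800 rig (assumed): {dual_8800_w:.0f} W, "
      f"~${annual_cost(dual_8800_w):.2f}/yr")
```

Even with generous assumptions, the gaming rig costs roughly fifteen times as much to feed.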
It runs Vista acceptably well. As a bonus, it has
HDMI and DVI with HDCP output capability.
The authors cheated a little bit: they started by saying they wanted a
system that could play high-definition video. This system
will, but they did not include speakers. It can run 5.1
audio, but obviously you'd need speakers for that.
Reminds me of the oddly prescient Onion article in 2004 (language warning):
http://www.theonion.com/content/node/33930
Hardcore gamers will tell you two video cards are necessary, and to an extent they have a point. The more video processing you can take away from the CPU and leave to the video cards, the better your frame rates and the lower your lag. In online play, that means when you see the other person on the screen and shoot, they are dead. Whereas when you lag, there may be a millisecond difference between the time you saw the person in your sights and the time your shot registered.
And besides, when it comes to technology you will never hear me say, "Oh that's ridiculous! When will anyone ever need 'x'?" We all know Bill Gates's famous 'last words'... ;)
I understand that serious gamers need the highest frame rates they can get. But consider the power requirements. PC Power & Cooling's biggest PSU provides 1 kW. A few companies make 1.2 kW PSU's. It sounds as though even the 1.2 kW PSUs would not be enough to power four 8800-class GPUs along with the rest of the system. Plus, even the best motherboards can't run four GPUs at 16x; two will have to run at 8x.
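As a rough sanity check on that claim, here's a quick sketch. The per-card draw and the rest-of-system figure are assumptions on my part, not numbers from the article, and since real PSUs shouldn't be run flat out I've also assumed an 80% sustained-load ceiling:

```python
# Rough check: can a 1.2 kW PSU feed four 8800-class GPUs?
# All figures below are assumptions, not measurements:
#   ~175 W per 8800GTX under load,
#   ~250 W for CPU, drives, board, and fans,
#   sustained load kept to ~80% of the PSU rating.

cards = 4
watts_per_card = 175
rest_of_system_w = 250
psu_rating_w = 1200
safe_fraction = 0.80

demand_w = cards * watts_per_card + rest_of_system_w
usable_w = psu_rating_w * safe_fraction
margin_w = usable_w - demand_w

print(f"Estimated demand: {demand_w} W")
print(f"Usable sustained output: {usable_w:.0f} W")
print(f"Margin: {margin_w:.0f} W")
```

Under these assumptions the margin is a mere 10 W, which is the point: even the biggest PSU on the market is right at the edge.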
Beyond two GPUs you are going to have to deal with some serious practical limitations, with decreasing incremental benefits.
Hm, a serious irony for Ray Kurzweil's singularity curve, and so, so true. Infinite blades was way too precious.
Video processor manufacturing is behind CPU manufacturing in some ways; e.g. feature size, which correlates to power use. The competition between video card manufacturers is so fierce that they cannot afford the possibility of the sort of delays which might occur by translating their chips to the latest fab techniques. Intel and AMD are both shipping CPUs manufactured at 65 nm, but GPUs are still at 90 nm. It took Intel quite a while to ditch their Pentium 4 toaster and introduce a more efficient CPU architecture; I hope the GPU manufacturers will get with the program some day.
There are other ways to increase video throughput than using multiple cards. PCI Express 2.0 will be out soon, with twice the bandwidth, and PCI Express 3.0 is already being planned.