Video Card Singularity

We all had some laughs when the two-bladed razor was
improved with the
addition of a third blade.  Late-night comedians joked about
razors coming with four or even five blades.



Then, we all had a few more chuckles when the four-bladed Schick
Quattro actually made it to market, soon followed by the five-bladed
Gillette Fusion.  This led to speculation about the natural
end-point:




[Image: razor cartoon with an absurd number of blades]

[Image: razors.gif, via http://agrumer.livejournal.com/414194.html]



The same thing is happening inside our computers.  For the
first couple of decades, they had one video card.  Then two:



[Image: NVIDIA SLI dual video card setup]



Then three:

[Image: three video cards installed in one system]



And of course, four, at least on the bench:

[Image: four-way SLI on a test bench]



Need I say that this is getting to be ridiculous?



NVIDIA reports that a two-card system built around their leading product
(two 8800GTX cards) needs a lot of power:


For the NVIDIA GeForce 8800 series, the power
requirements to run in single card mode aren't all that much different
than running any other high end card. NVIDIA recommends a 450W power
supply for GTX-based cards or a 400W power supply for GTS-based cards.
For the 8800GTX card you'll require 2 available PCI-E power connectors;
the 8800GTS only requires 1. If you're a hearty soul who likes to live
life to the fullest and can afford to super-size it by running dual
8800 cards, you're going to need to find a power supply in the 850-1000
watt range. This is roughly the equivalent of running an extra
refrigerator in your house.



So a four-card system will be like running an extra
refrigerator AND
having an extra teenager in your house.  
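To put that refrigerator comparison in dollars, here is a rough sketch of what a rig drawing at the top of NVIDIA's recommended range would cost to run. The duty cycle and electricity rate are my own illustrative assumptions, not figures from NVIDIA or the article:

```python
# Back-of-the-envelope running cost for a ~900 W dual-8800 rig.
# 850-1000 W is NVIDIA's quoted PSU range; everything else below
# is an assumption for illustration.
draw_watts = 900            # midpoint of the 850-1000 W recommendation
hours_per_day = 4           # assumed daily gaming time
rate_per_kwh = 0.12         # assumed electricity rate, $/kWh

kwh_per_year = draw_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr, roughly ${cost_per_year:.0f}/yr")
```

At four hours a day, that works out to well over a thousand kilowatt-hours a year, before you even double it for a four-card system.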



What is far more interesting, not to mention practical, is the
experimental system
(http://www.tomshardware.com/2007/09/13/hardware_components/index.html)
they concocted at Tom's Hardware recently.
This is a system designed to run off power from solar panels.
The cost was about $1000 for the computer itself.
(Solar panels are extra.)



If that seems like a high price, consider that a single NVIDIA 8800GTX
will set you back at least $550.


[Image: the solar-powered PC from Tom's Hardware]



The power required for this thing is a mere 61 watts,
at peak.  It has an AMD Athlon 64 X2 BE-2350 CPU, 2GB of RAM, and
uses the onboard graphics of the Gigabyte GA-MA69GM-S2H motherboard.
They are all off-the-shelf components, although the power supply
(http://www.tomshardware.com/2007/09/13/hardware_components/page3.html)
is a specialty item, since it runs on DC.
That 61-watt power requirement includes the monitor, by the way.
The monitor accounts for 23 watts.
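For the curious, here is a rough sketch of what those numbers imply for panel sizing. The 61 W total and 23 W monitor figures come from the article; the usage hours, peak-sun hours, and system losses are my own assumptions:

```python
# Rough solar-panel sizing for the 61 W Tom's Hardware build.
total_watts = 61            # whole system, including monitor (from the article)
monitor_watts = 23          # monitor alone (from the article)
computer_watts = total_watts - monitor_watts   # 38 W for the box itself

hours_per_day = 8           # assumed daily use
sun_hours = 5               # assumed peak-sun hours at the install site
system_losses = 0.75        # assumed derating for wiring, controller, battery

daily_wh = total_watts * hours_per_day                 # 488 Wh/day
panel_watts = daily_wh / (sun_hours * system_losses)   # about 130 W of panel
print(f"computer alone: {computer_watts} W; need ~{panel_watts:.0f} W of panel")
```

Under those assumptions, a modest 130-watt panel covers a full day's use, which is what makes the 61-watt figure so striking next to an 850-1000 watt gaming rig.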



It runs Vista acceptably well.  As a bonus, it has 
HDMI and DVI with HDCP output capability.  



The authors cheated a little bit: they started by saying they wanted a
system that could play high-definition video.  This system
will, but they did not include speakers.  It can output 5.1
audio, but obviously you'd need speakers for that.



Hardcore gamers will tell you 2 video cards is necessary, and to an extent they have a point. If you want the best frame rates and the lowest lag, the more video processing you can take away from the CPU and leave to the video cards, the less lag you get. Meaning in online play, when you see the other person on the screen and shoot, they are dead. Whereas when you lag, there may be a millisecond difference between the time you saw the person in your sights and the time your shot registered.

And besides, when it comes to technology you will never hear me say, "Oh that's ridiculous! When will someone ever need 'x'?" We all know Bill Gates's famous 'last words'... ;)

I understand that serious gamers need the highest frame rates they can get. But consider the power requirements. PC Power & Cooling's biggest PSU provides 1 kW. A few companies make 1.2 kW PSU's. It sounds as though even the 1.2 kW PSUs would not be enough to power four 8800-class GPUs along with the rest of the system. Plus, even the best motherboards can't run four GPUs at 16x; two will have to run at 8x.
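A quick sanity check on that claim. The per-component wattages below are my own illustrative estimates, not measured figures, but they show why even a 1.2 kW PSU is borderline for a four-card build:

```python
# Estimated DC load for a hypothetical four-card 8800-class system.
# All per-component draws are assumptions for illustration.
gpu_watts = 180             # assumed peak draw per 8800GTX-class card
num_gpus = 4
cpu_watts = 125             # assumed high-end dual-core CPU
rest_watts = 100            # assumed motherboard, RAM, drives, fans

dc_load = gpu_watts * num_gpus + cpu_watts + rest_watts   # 945 W at the rails
headroom = 0.8              # PSUs shouldn't run near 100% of their rating
psu_needed = dc_load / headroom                           # about 1180 W
print(f"DC load ~ {dc_load} W; want a PSU rated at least {psu_needed:.0f} W")
```

Under these assumptions the sustained load alone approaches 1 kW, and once you leave sensible headroom, a 1.2 kW supply has essentially nothing to spare.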

Beyond two GPUs you are going to have to deal with some serious practical limitations, with decreasing incremental benefits.

Hm, a serious irony for Ray Kurzweil's singularity curve, and so, so true. Infinite blades was way too precious.

Video processor manufacturing is behind CPU manufacturing in some ways, e.g. feature size, which correlates with power use. The competition between video card manufacturers is so fierce that they cannot afford the delays that might come with moving their chips to the latest fab techniques. Intel and AMD are both shipping CPUs manufactured at 65 nm, but GPUs are still at 90 nm. It took Intel quite a while to ditch their Pentium 4 toaster and introduce a more efficient CPU architecture; I hope the GPU manufacturers will get with the program some day.

By Tegumai Bopsul… (not verified) on 04 Oct 2007 #permalink

There are other possibilities for increasing video throughput than using multiple cards. PCI-express 2.0 will be out soon, with twice the bandwidth, and PCI 3.0 is already being planned.

By Tegumai Bopsul… (not verified) on 04 Oct 2007 #permalink