One thing is for sure, though: if you’re planning to get Windows 7, make sure it’s running on a decent video card. Onboard video won’t be enough if you plan on watching TV in Windows Media Center or playing computer games. But of course, most people already know that.
You’re telling people on a website dedicated to the remaking of a video game to make sure they get a real graphics card before upgrading their computers?
Right…
My HTPC has an HD3400 in it with 256MB of VRAM, and that handles NFS: MW perfectly. Plus it’s a good cheap card; mine is a fanless Asus model, so no heat or noise issues here. The case is the Antec P180 (the soundproofed one). The mobo (Asus M4A78 PRO) has an onboard HD3200 built into the chipset, although that’s disabled. When I had to rescue my HD4850X2’s BIOS, it handled everything absolutely fine with Aero enabled. I guess what I’m trying to say is that the low-end requirements are just that: low end.
Mark.
I was writing to the people visiting the site who don’t know. I know everyone here on the forums knows to get a PCIe video card.
The 8800GT isn’t old! I’m still running an ATI Rage 128 Pro (16MB VRAM) and I’m fine with it, along with my good ol’ PIII and 128MB of RAM.
With proper tweaking, even low-end systems like mine work flawlessly, as long as you don’t install a ton of programs and you closely monitor all registry entries, processes, and services.
That’s for general computing. As for gaming, I rely on my laptop with a 9800M GS, a 2.13GHz Core 2 Duo, 4GB of RAM, and a 15.4" 1366x768 display. Suits me fine; it runs Crysis maxed out at ~25fps with no AA enabled.
Now, to answer the OP’s question: a year ago I would have said nVidia, due to the piss-poor drivers and overheating issues of the high-end ATI cards, but now I’d say both vendors are pretty much equal. (While I think nVidia’s current GTX275 is great bang for the buck, it’s also getting outdated, and with new cards on the way we can only expect the price to drop.)
My two cents.
Uhhh… the 8800GT has been out for just about 3 years; that’s pretty old in the computer world.
1GB Radeon 4890 atm, switching to a 5870/5890 as soon as I get mun-neeeh
I use an Nvidia GeForce 8400 GS 512MB, which I overclocked from 300/500 to 550/1032.
It can run GTA IV at 32 FPS and Half-Life 2 at 121 FPS.
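For scale, here’s a quick sketch of just how big that overclock is (plain back-of-the-envelope arithmetic using the clocks quoted above; the helper function is mine, not anything from the thread):

```python
# Back-of-the-envelope: percentage gain of the 300/500 -> 550/1032 overclock.

def overclock_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage increase of an overclocked speed over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"Core:   +{overclock_gain(300, 550):.0f}%")   # ~ +83%
print(f"Memory: +{overclock_gain(500, 1032):.0f}%")  # ~ +106%
```

That’s an enormous jump for a budget card, which is why the later post about barely managing 50MHz sounds so modest in comparison.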
But what’s the point of running a game at 121 FPS? Above 60 you probably won’t notice a difference.
Most monitors don’t refresh much faster than that anyway (sure, you can get 120Hz, but who has one of those?), and the eye tops out close to 60 anyway (even though eyes don’t work in FPS, it works out to be around that).
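To put rough numbers on the diminishing returns (this is just illustrative frame-time arithmetic, not anything specific to these cards):

```python
# Illustrative: frame time shrinks with FPS, but the absolute savings taper off,
# and a 60Hz monitor only shows a new frame every ~16.7 ms regardless.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 121):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 121 FPS -> 8.3 ms
```

Going from 30 to 60 FPS saves about 16.7ms per frame; going from 60 to 121 saves only about 8.4ms, most of which a 60Hz display can’t show you anyway.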
Because the same card that runs HL2 at 121fps only runs GTA4 at 30, as he said.
He could clock it back down so HL2 runs at 60, but then GTA4 would be unplayable.
Because it’s another way to look “cool” (*cough*) on the internet.
I wish my 8400GS could clock that high… I only get about 12fps in GTA IV at 350MHz.
I understand that laptops aren’t designed for gaming, but a 50MHz overclock is pretty pathetic.
I notice a difference between 60 and 75, after changing the refresh rate to 75Hz, of course.
Yea, it’s been around for 3 or so years now… but they still do a great job of playing just about every game out there (minus Crysis). I have two 8800GTS 640MB cards sitting on my desk which I just replaced with a single GTX285 1GB, and I have to say they were probably the best video card purchase I’ve ever made. Usually after 3 years a video card is so outdated that you can hardly play any new games, but those 8800GTSes still play the vast majority of games very well at 1920x1200.
As for the topic: if I were currently in the market for a new card, I’d hold off until nVidia’s new GT300 comes out. While ATi’s new 5970 is a super powerful card, it’s almost impossible to buy, because the company that manufactures the silicon (TSMC) is having lots of trouble producing ATi’s newest chips; those same issues are also why nVidia’s GT300 is delayed. Unfortunately, TSMC is the only company that can make chips this complex on the 40nm process, so ATi released a product it can’t get in adequate quantities, and nVidia, rather than lose money on every card, held off on release until they get the process down.
Lies, LIES!!!
Why do people still believe this?!
Because it makes people who can’t break 60fps feel special.
https://en.wikipedia.org/wiki/Frame_rate#How_many_frames_per_second_can_the_human_eye_see.3F
So somewhere between 60 and 120. The midpoint is 90, so 85 is OK.
Actually, it’s only 2 years old now, but yes, it still does a great job of playing the latest games. That’s mainly because there hasn’t been any real advancement since DirectX 10, and I think this card will only really start to struggle once DirectX 11 games become mainstream (which I think is still a few years away). Right now, most games are still made on DirectX 9c.
I use nVidia purely because it works and it’s widely available, which is the same reason I use Windows.
I’m not saying it’s better or anything, just that it’s convenient.