Introducing the GeForce4: nVidia strikes back
It's that time again: the six-month life cycle of high-end graphics cards has come to an end. The GeForce3 is dead, long live the GeForce4! True to the motto "The king is dead, long live the king," even if ATI's Radeon 8500 has usurped the crown in many areas.
For a few weeks now, adventurous rumors have been circulating on the Internet about what nVidia's next miracle chip would look like. For the most part, there was agreement that, as with the nV2a in the Xbox, a second vertex shader would virtually double geometry performance. There was also speculation about six rendering pipelines, probably prompted by an nVidia PDF in which chief scientist David Kirk spoke of a "massively programmable, massively parallel and pipelined graphics monster." 3dfx technology was said to have found its way into the new chip, enabling new heights in full-screen anti-aliasing. DirectX 9 capabilities were also suspected, although that specification is still under strict NDA (Non-Disclosure Agreement) and is not expected to be released until the end of this year.
Today the time has come: the Californian graphics chip developer nVidia has unleashed its next generation of graphics chips at a presentation in Brussels. This means that the mandatory NDA, which until now obliged us to remain silent, no longer applies. In the following, we want to take a closer look at the innovations, the changes, and also the features carried over from previous chips. So even if no final test samples are available yet, the end of the speculation is in sight, and one thing can be said: as so often, many of the speculations contained a grain of truth.
But first a word of warning: this preview is largely based on the not exactly exhaustive press information that nVidia published in advance under NDA. Reliable comparison material is unfortunately not available, so much of the information given here should be treated with some caution.
On the next page: Models and Specifications