Introducing the GeForce4: nVidia strikes back
Finally we come to nView. Many will probably still be familiar with the well-known TwinView feature of the old GeForceMX series. Following the example of Matrox, it lets you connect two output devices, usually monitors, televisions or TFT displays, to a single graphics card. The possible applications are diverse: outputting a business presentation on a laptop and a projector at the same time, simply gaining more desktop space to work with, or enjoying games or DVDs on the living-room television. In the past, however, nVidia's TwinView solution was not really competitive, as only a single RAMDAC was integrated into the GeForceMX chip and a second, external RAMDAC had to be soldered onto the board for dual-screen operation.
This is no longer necessary with the new chips based on the nV17 (MX) and the nV25 (Ti). Both have two independent RAMDACs running at 350MHz each, more than enough headroom to drive even high-end monitors at their full capabilities. Furthermore, the complete hardware for TV-out functionality at resolutions up to 1024x768 pixels is integrated into the chips. It is thus finally possible, for example, to keep running the monitor at a flicker-free 85Hz while a DVD plays on the television; a major shortcoming of earlier days has now been resolved. The accompanying nView driver software is also supposed to be able to create up to 32 individual desktops and switch between them via hotkey, or tie them to a specific logged-in user.
A feature that, according to the information available to us, only the nV17, i.e. the chip on the GeForce4MX cards, possesses is the Video Processing Engine. This VPE is responsible, among other things, for DVD acceleration, allowing nVidia to catch up with ATi in this area as well, since they now also offer an iDCT-capable graphics card, which further relieves the CPU during DVD playback. Of course, that does not mean that high-quality DVD playback is impossible with the GeForce4 Ti, but there the CPU has to step in and do a larger share of the work. Here, and with the integrated LVDS transmitters, a GeForce4 Ti is actually inferior to an MX: unlike the MX, it does not have two of them.
Even if, in view of the one-sided information available to us, it may still be too early for a final verdict, let us nevertheless dare a few closing words.
It seems as if nVidia woke up just in time, after both the performance crown and the sympathies of the users had come increasingly under threat. With its excellent Radeon8500, ATi fired a clear shot across nVidia's bow and even moved slightly ahead in the area of 3D and multimedia features. With the GeForce4, nVidia has closed that gap somewhat in terms of features, and in view of the specified clock rates, the performance crown should migrate from Canada back to California. Although we have been provided with some reference results for both the MX and the Ti version of the GeForce4, we would like to postpone an evaluation of the performance until a later date, when we have actual cards available to us.
As always, our forum is of course available for any questions.