Company: BFG Technologies
Author: Mark 'Ratchet' Thorne
Date: October 29th, 2005
OK, I lied: I will do some IQ comparisons here, but not in the normal style. Instead I will look at NVIDIA's SLI AA as a sort of addendum to the results in my previous reviews. First, a quick once-over of SLI AA for the uninitiated.
SLI AA, as the name implies, is a special AA mode unique to NVIDIA's SLI technology. In reality SLI AA is just another SLI rendering mode, and it sits in the same options panel (once enabled via Coolbits) as the other two SLI modes: SFR and AFR. SFR (Split Frame Rendering) splits the frame horizontally and has each card render one half. AFR (Alternate Frame Rendering), on the other hand, has each card render alternating frames; one card takes even-numbered frames and the other takes odd-numbered frames. Where SFR and AFR increase performance by dividing the rendering workload, SLI AA increases image quality by dividing the anti-aliasing workload.
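To make the division of labor concrete, here's a toy sketch of how the two performance modes hand out work between a pair of GPUs. This is purely my own illustration (the function names and the whole scheme here are invented for clarity), not anything from NVIDIA's drivers:

```python
# Toy illustration of how SLI's two performance modes divide work
# between two GPUs. Names and structure are illustrative only.

def sfr_split(frame_height, num_gpus=2):
    """Split Frame Rendering: each GPU renders one horizontal band."""
    band = frame_height // num_gpus
    return [(gpu * band, min((gpu + 1) * band, frame_height))
            for gpu in range(num_gpus)]

def afr_assign(frame_number, num_gpus=2):
    """Alternate Frame Rendering: GPUs take turns on whole frames."""
    return frame_number % num_gpus

print(sfr_split(1200))                     # [(0, 600), (600, 1200)]
print([afr_assign(n) for n in range(4)])   # [0, 1, 0, 1]
```

In practice the driver balances the SFR split point dynamically based on scene load rather than cutting the frame exactly in half, but the even/odd frame hand-off of AFR really is that simple in concept.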
Truthfully, NVIDIA's nemesis ATI originally came up with this concept when they announced CrossFire, their answer to NVIDIA's SLI multi-GPU technology, back in May. NVIDIA, however, managed to beat them to the punch, adding the feature with a simple driver update well ahead of ATI having CrossFire hardware available.
With NVIDIA's SLI AA there are actually two modes to choose from: 8x SLI and 16x SLI. Getting them to work requires some simple registry modification (simple depending on your level of experience, I suppose; the steps are outlined here). I'll take the 8x SLI mode as an example and attempt to explain how it works.
When using 8x SLI, each card renders the same scene with 4x AA, but with slightly offset (jittered) sample patterns so that, when the two scenes are combined, the result is effectively 8x AA. Instead of 4 AA samples and 1 texture sample, the combined result uses 8 AA samples and 2 texture samples; in essence, a sort of super-sampled mixed AA mode very similar to NVIDIA's 8xS mode. Here's a nice chart which should help:
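If a picture doesn't do it for you, here's a hypothetical sketch of the compositing step in code. I'm assuming (this is my own simplification, not NVIDIA's actual implementation) that each card resolves its 4x AA frame to a plain image and the two images are then averaged pixel by pixel, so each output pixel effectively reflects all 8 jittered samples:

```python
# Hypothetical sketch of SLI AA compositing: each card resolves the same
# frame with 4x AA using a slightly offset (jittered) sample pattern, and
# the combined frame is the pixel-by-pixel average of the two resolves.

def combine_sli_aa(card_a, card_b):
    """Average two resolved framebuffers (lists of rows of (r, g, b))."""
    return [[tuple((a + b) / 2 for a, b in zip(pa, pb))
             for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(card_a, card_b)]

# Two tiny 1x2 'framebuffers' whose resolved values differ slightly
# because of the jittered sample positions:
fb_a = [[(0.25, 0.25, 0.25), (1.0, 1.0, 1.0)]]
fb_b = [[(0.75, 0.75, 0.75), (0.5, 0.5, 0.5)]]
print(combine_sli_aa(fb_a, fb_b))
# [[(0.5, 0.5, 0.5), (0.75, 0.75, 0.75)]]
```

The real hardware does this blend over the SLI bridge during scan-out, of course, not in software after the fact.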
With the knowledge of how the 8x SLI mode works, we can make a pretty good guess at how 16x SLI works as well. The difference between the two is that instead of each card using 4x AA, as with 8x SLI, 16x SLI uses 8xS on both cards, resulting in 16 AA samples and 4 texture samples being used to render the final image, basically giving us true full-scene super-sampling.
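The bookkeeping for both modes boils down to doubling the per-card sample counts, which is simple enough to write out (again, just my own back-of-the-envelope arithmetic, not anything official):

```python
# Back-of-the-envelope sample counts for the SLI AA modes: the combined
# frame gets the per-card AA samples and texture samples from each card.

def sli_aa_samples(per_card_aa, per_card_tex, num_cards=2):
    """Return (total AA samples, total texture samples)."""
    return per_card_aa * num_cards, per_card_tex * num_cards

print(sli_aa_samples(4, 1))  # 8x SLI:  4x AA per card -> (8, 2)
print(sli_aa_samples(8, 2))  # 16x SLI: 8xS per card  -> (16, 4)
```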
Image Quality and Performance
With that little SLI-AA primer out of the way, let's take a look at image quality and the level of performance you can expect.
I'll outline my thoughts on the image quality aspect at the end of this page, but first let's take a look at performance. I tested both SLI 8x and SLI 16x on Half-Life 2 and Doom 3 over my standard timedemo tests using all the resolutions we normally run. Here are the results:
SLI AA Summary
After checking out SLI AA for a bit, I'm honestly not sure what the point of it is supposed to be. Having read the documents, I assumed the whole point was to improve image quality, but as you can see from the IQ tool above, it doesn't do as good a job as you would initially expect. SLI 8x doesn't seem to be anywhere near as good at cleaning up the image as NVIDIA's 8xS mode, and SLI 16x looks to be only about on par with 8xS (though maybe a little better in some cases). Meanwhile, as you can see, the performance impact of SLI AA is quite high, and it certainly wouldn't be usable in most newer games.
One area where I think SLI AA would be of use, and likely remain very usable, is with older games that don't demand as much from the graphics system (or maybe even newer, slower-paced games where a high framerate isn't an absolute necessity). In those cases, where a modern CPU would be the limiting factor rather than graphics power, SLI AA has a place. For the newer twitch games, though, you would need a very, very high-spec machine to make use of it.
Ultimately, I think SLI AA was more a response to the similar technology ATI announced with their CrossFire multi-GPU technology than anything else, but I'm not going to take points off for adding a feature, whatever the reason. Even if it's not entirely practical in most situations, it's there and it works. NVIDIA just needs to figure out a way to make it easier to access.
SLI Improved - Mixed Vendors
Along with SLI AA I also wanted to check out some of the other improvements that NVIDIA has included in the release 80 drivers.
Previously, using SLI required two identical cards from the same manufacturer. While I don't think this restriction would have been a problem for anyone planning on SLI from the outset of an upgrade, it could have caused problems for someone looking to move to SLI in stages. For example, if you bought a single SLI-capable card intending to add a second somewhere down the road, maybe a few months or a year or more later, finding another identical card could have been a problem if the manufacturer had stopped making them or changed them in the meantime (even different BIOS versions on otherwise identical cards could break the old SLI).
Thankfully with the release 80 drivers NVIDIA has fixed things up so that you can now mix and match vendor cards at will, making upgrading to SLI much easier.
To test the claim, I simply installed the BFG 7800 GT OC and the reference 7800 GT in my system and connected the SLI link. Upon boot the system recognized the pair as SLI capable and popped up one of those "click here to enable SLI" notifications. Once enabled, SLI worked just as well as it did with two identical BFG cards installed. Not only does mixed-vendor SLI work, but from what I can tell the reference 7800 GT actually overclocks itself automatically to match the BFG 7800 GT OC's clock rates. You still can't mix a GT with a GTX (I tried), and apparently if the clock rates of the two cards differ by too much then mixing won't work, but mixed-vendor support goes a long way toward the improved flexibility SLI needed.
Other SLI Improvements
Other improvements to SLI include the ability to enable and disable it dynamically without rebooting, TV/HDTV support, and further additions to the list of games supported in the profiles.
As far as SLI as a technology goes, the biggest issue for many people was the requirement for identical cards. Thankfully, that issue no longer exists, and the update seems to work very well (I haven't run into any problems in this regard, though NVIDIA does mention that it sometimes won't work and a reboot might be required). It's still not as flexible as ATI's CrossFire (where you can mix and match vendors without big restrictions on the GPU), but it's a lot better than it was before.
Another issue some people have had with SLI is the need for game profiles in the drivers. The problem is that any game without a profile won't work with SLI's performance-enhancing modes (SLI AA should always work). This could certainly be a big deal if you play a game NVIDIA hasn't included a profile for (you can add profiles yourself, but that requires registry editing and some knowledge of SLI; certainly not a straightforward task for most people). The good news is that NVIDIA has been working with developers to get SLI support built right into games, negating the need for profiles in the drivers. F.E.A.R. and Call of Duty 2 are two such games that include SLI support out of the box, with undoubtedly more to follow.
Even ATI saw the success NVIDIA was having and changed their attitude toward multi-GPU graphics pretty quickly (though not quickly enough, some would say). Over the last year and a half, NVIDIA has taken SLI from debatable impracticality to unquestionably practical and highly desirable.
Next we'll take a look at general SLI performance in a bunch of different games and compare it to a host of other graphics cards across a range of resolutions and settings. Read on...