Geforce GTX 280 & Radeon HD 4870 X2 AA Scaling with XP & Vista

VGA Reviews by jmke @ 2008-09-04

In this in-depth article we take a look at the performance of the NVIDIA Geforce GTX 280 and ATI Radeon HD 4870 X2 when anti-aliasing is enabled. We test 8 different games at several different AA levels under Windows XP as well as under Windows Vista. How does performance scale when you go from XP to Vista, and how much impact does enabling AA have? Read on to find out!

Introduction & Test Setup

Introduction

In this article we’ll take a closer look at the performance of ATI’s latest high end card, compared to NVIDIA’s top card. Both cards offer plenty of headroom when using the latest games. In our first review of the ATI HD 4870 X2 vs NVIDIA Geforce GTX 280 we found that you do not want to invest in these products if you don’t own a high end CPU and have a high resolution monitor.

If your game setup is up to the challenge you’ll find this review interesting, as we’ll be using a multitude of anti-aliasing settings to see how each card handles the extra rendering load. The HD 4870 X2 comes with 2GB of GDDR5 onboard, and this should give it an edge once the resolution and AA levels are increased. By how much, you’ll find out on the following pages.

The second effect on performance we wanted to investigate was the OS. Our previous review was done with Windows XP SP3. While the majority of users out there are still using XP, those into gaming and multi-GPU high end configurations are more likely to use Vista, and, to be able to use more than 3GB of system memory, 64-bit Vista.

So we’ll investigate AA performance in XP SP3 (32-bit) and Vista SP1 (64-bit).

Madshrimps (c)


Which OS will offer the best gaming performance?

Test Setup & Benchmarks

We built our test setup with the help of Tones.be (Belgium’s largest hardware shop!), who helped us with the hard drives, CPUs and monitors; MSI for the motherboards; OCZ for the memory; Coolermaster for the cases and power supplies; and last but not least, Scythe for the silent CPU coolers.

We would like to thank Sapphire for providing the HD 4870 X2 for testing and Leadtek for their Winfast GTX 280. Without their support this article would not have been possible.

Madshrimps (c)


Intel Test Setup
CPU Intel Core 2 E8200 @ 3.52GHz
Cooling Scythe Ninja 2
Mainboard MSI P45 Platinum
Memory 2 * OCZ 1GB PC2-6400
& 2 * OCZ 2GB PC2-8500
Other
  • Coolermaster CM690 Enclosure (3*120mm case fans)
  • Coolermaster UCP 900W Power Supply
  • Western Digital 80GB HDD (system)
  • Samsung 640GB HDD (data)


  • At the time of writing, the system we built would cost you approximately €1200 without the VGA card. While it’s not a budget system, it’s also far from high end as we’re using a DDR2 motherboard and a mid-range Wolfdale CPU. Combining it with a €300+ VGA card does place it in the more expensive bracket when it comes down to building a game machine.

    The monitor is certainly one of the major costs of a system; the price mentioned above includes this screen, a Samsung SyncMaster 2493HM 24-inch with a native resolution of 1920x1200 and a quite low 5ms response time. Again, this screen is mid-range as more expensive models are available, but the resolution of most 26”~27” screens remains the same at 1920x1200. You need to invest in a 30” screen to go higher, to 2560x1600, at which point you will be spending a pretty hefty sum.

    Software config:

  • OS: Windows XP SP3 and Windows Vista SP1 64-bit
  • NVIDIA Drivers: Forceware 177.41
  • ATI Drivers: Catalyst 8.8 (8.52.2)


    These are the games we tested:

  • Devil May Cry 4
  • Unreal Tournament 3
  • The Elder Scrolls IV: Oblivion
  • S.T.A.L.K.E.R.
  • Trackmania Nations
  • Tomb Raider Legend
  • Mass Effect
  • Crysis


  • All tests were done at 1920x1200; the test setup had 2GB of RAM unless otherwise noted.

    Devil May Cry 4

    Devil May Cry 4 is the fourth installment of the Devil May Cry series. It was announced in March 2007 that the game would be released simultaneously for the PlayStation 3, Xbox 360, and PC. In the game, the player controls both Nero and Dante, the game's protagonist and the series' title character respectively, and fights enemies in close combat using firearms, swords, and other weapons. The characters Lady and Trish from previous games in the series make appearances, along with new characters Kyrie, Credo, Gloria, and Agnus.


    The game has both DX9 and DX10 render modes. We’re using the freely released performance benchmark of Devil May Cry 4.

    Madshrimps (c)


    Going from XP to Vista using the DX9 render path doesn’t show much difference. Choosing the DX10 render path did give a strange outcome with the HD 4870 X2: we repeated the test several times, but performance without AA was lower than with 4xAA. This did not happen when using DX9.

    AA scaling is in favor of the Geforce GTX 280 when going from 0xAA -> 4xAA using the DX9 render path. In Vista with DX10 the GTX 280 doesn’t handle the extra rendering load very well.

    The HD 4870 X2 never goes below 200fps, even when using 8xAA, which is quite nice to see. Going from 0xAA to 8xAA drops performance by ~20%, but with such high FPS numbers you won’t notice this.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    Unreal Tournament 3


    Epic Games has had quite a bit of success with their Unreal Engine; at first competing with the id Tech engine for licensing, they now seem to have pulled ahead with the latest incarnation: Unreal Engine 3. The list of games using this engine is huge, with blockbuster titles like Bioshock, Mass Effect, Gears of War, Rainbow Six: Vegas (1 & 2) and of course the next iteration of the UT series, UT3.

    While many games share the same Unreal 3 engine, the developers can decide how high the system requirements will be by increasing the level of detail. Rainbow Six: Vegas, for example, is known for being more demanding than Bioshock on the same hardware. Unreal Tournament 3 by Epic Games provides an amazing balance between image quality and performance, rendering beautiful scenes even on lower end hardware; on high end graphics cards you can really turn up the detail, which makes it picture perfect.

    We used HardwareOC’s benchmark tool, which does a fly-by of the chosen level; do note that the performance numbers reported are higher compared to in-game. The map used was “Corruption”.

    Madshrimps (c)


    We added the QAA settings for the Geforce GTX 280 as these should offer higher IQ, but to be honest, we have to look extremely hard to notice a difference between 8xAA and 16xQAA. So while these numbers are a lot lower, in a fast paced game like UT3 you might as well stick with 4x or 8xAA.

    Comparing performance between XP and Vista we see that both cards are slower in Vista when no AA is used; the X2 drops by ~25%, the GTX only by ~9%. When AA is enabled however the situation is different: now the GTX 280 performs pretty much the same under XP as under Vista, except for the 16xAA setting, which is noticeably slower under Vista. The HD 4870 X2 fares better from the OS move; under Vista performance is consistently higher when AA is enabled.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    The Elder Scrolls IV: Oblivion


    The Elder Scrolls IV: Oblivion, or simply Oblivion, is a single player fantasy-themed action-oriented computer role-playing game developed by Bethesda Game Studios and published by Bethesda Softworks/ZeniMax Media and the Take-Two Interactive subsidiary 2K Games.
    Oblivion's story focuses on a former prisoner drawn into a Daedric Lord's plan to invade the mortal realm of Tamriel. Gates to the hellish realm of Oblivion are opened, through which many daedra flow. The game continues the open-ended tradition of previous Elder Scrolls games, allowing the player to travel anywhere in the game world at any time, including the option to ignore or postpone the main storyline indefinitely.


    When Oblivion was released in spring 2006 we found a new system benchmark: the expansive game world of Oblivion required a hefty configuration if you wanted to play it at the highest detail level. Not only did it tax the video card, the CPU was quite important too.

    Madshrimps (c)


    Oblivion was once able to bring any system to its knees; in our tests neither graphics card goes below 60 FPS, even with 16xAA enabled. The HD 4870 X2 has a comfortable lead over the GTX.

    The performance of Oblivion under XP is pretty much on par with the numbers under Vista; NVIDIA gains ~4% overall, ATI loses less than half a percent.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    S.T.A.L.K.E.R.: Shadow of Chernobyl


    S.T.A.L.K.E.R.: Shadow of Chernobyl, previously known as S.T.A.L.K.E.R.: Oblivion Lost, is a first-person shooter computer game by Ukrainian developer GSC Game World, published in 2007. It features an alternate reality theme, where a second nuclear disaster occurs at the Chernobyl Nuclear Power Plant in the near future and causes strange changes in the area around it. The game has a non-linear storyline and features gameplay elements such as trading and two-way communication with NPCs. The game includes elements of role-playing and business simulation games.

    In S.T.A.L.K.E.R., the player assumes the identity of an amnesiac "Stalker", an illegal explorer/artifact scavenger in "The Zone", named 'The Marked One'. "The Zone" is the location of an alternate reality version of the Chernobyl Power Plant after its second (fictitious) explosion, which contaminated the surrounding area with radiation and caused strange otherworldly changes in local fauna, flora and even the laws of physics. "Stalker" in its original (film) context roughly meant "explorer" or "guide", as the stalker's goal was to bring (guide) people into the Zone. On July 11th, 2007, GSC Game World announced a prequel, S.T.A.L.K.E.R.: Clear Sky, to be released on September 5th, 2008.


    STALKER is a special case when it comes down to anti-aliasing… as you’ll see.

    Madshrimps (c)


    STALKER's game engine uses a form of rendering called Deferred Shading. This effectively means it cannot support proper antialiasing, which would remove all the jagged outlines in the game world, regardless of which type of graphics card you have. Thus the antialiasing slider has no real impact on performance or image quality. Do note however that with the most recent Nvidia Forceware drivers you can force 2xAA antialiasing in STALKER through the graphics card control panel. This will reduce jaggedness, primarily on terrain and buildings rather than the foliage; however it can come at a very high cost in performance. (src: Tweakguides.com)


    Looking at the performance numbers under Vista reveals that there is no performance difference between the different ES (Edge Smoothing) settings. With the GTX 280 we can enable 2xAA through the driver, but performance drops sharply; the game is still playable with an avg FPS of 45, but there’s not much headroom.

    The performance difference between XP and Vista for the GTX 280 is negligible (+2.5%); the HD 4870 X2 under XP is quite another story. Without ES enabled performance is fine at ~93fps, but as soon as the ES slider is moved it seems CrossFire is no longer working, as performance drops sharply. Most likely a driver issue.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    Trackmania Nations


    TrackMania is a series of arcade racing games for Windows, which include stunting with cars, track-building, elements from puzzle games, as well as elements testing the players' dexterity. It was developed by the French team Nadeo for the PC. Instead of following the usual trend of choosing a set car and track for playing in the game, the TrackMania games allow the player to create their own tracks using a "Block" process in the spirit of the 1985 game Racing Destruction Set and the 1990 Brøderbund release, Stunts.

    In contrast with most other racing games, the TrackMania series lets the player race a track as many times as he/she wants, until time runs out. The player can choose to respawn at any time, whether due to landing upside down, going off the track, or even just because the start did not go optimally. Although in multiplayer games multiple cars race on the same track, they cannot actually collide or otherwise influence each other.


    The ugly duckling in our games line-up: this is a free game (go download it now!) based on stunt racing with fast cars. Nothing realistic about this game, just pure arcade fun.

    Madshrimps (c)


    The outcome of these tests is quite interesting: the GTX 280 drops ~10% going from XP to Vista, while the HD 4870 X2 gains ~6%, bringing these cards closer together. Performance with AA enabled is noticeably better under Vista for the X2.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    Tomb Raider Legend


    Lara Croft Tomb Raider: Legend is the seventh game in the Tomb Raider series. Published by Eidos Interactive, this is the first game in the series not to be handled by British-based Core Design, developed instead by British-owned U.S. studio Crystal Dynamics. The PS2, Windows, Xbox, and Xbox 360 versions were released in Europe on April 7, 2006 and in North America on April 11, 2006.


    When the game was released, only the highest end systems could run it with “next-gen visuals” options enabled. Since then more cards are able to run this game properly at the highest quality settings.

    Madshrimps (c)


    The game poses no problems for the GTX 280 which pushes 100+ fps avg even with 16xQAA enabled, going from XP to Vista boosts performance by ~2%.

    The Radeon HD 4870 X2 has a more difficult time with this “TWIMTBP” NVIDIA title; under XP CrossFire fails completely and the game runs at 56-58fps no matter what the AA level. When going to Vista, CF works better this time around as the X2 catches up with the GTX; only at the 16xAA level does performance drop off sharply again…

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    Mass Effect


    Mass Effect is set in the year 2183 AD. Thirty-five years prior, humanity discovered a cache of technology built by a technologically advanced but long-extinct race called the Protheans. Studying and adapting this technology, humankind has managed to break free of the solar system and has established numerous colonies and encountered various extraterrestrial species within the Milky Way galaxy. Utilizing alien artifacts known as "Mass Relays", the various space-faring species are able to travel instantly across vast stretches of the galaxy.


    This game is based on the Unreal 3 engine but has noticeably higher system requirements. BioWare created very detailed environments, making this title quite demanding; you’ll need a high end system to run this game at high resolution with AA enabled:

    Madshrimps (c)


    AA scaling between the two cards is quite similar with this title; they both handle Mass Effect nicely at max detail without AA enabled. Going to Vista drops performance by 5% for the X2, while the GTX 280 gets a 9% increase. When enabling 4xAA performance drops 30~40% on both cards under XP; under Vista the hit is slightly smaller but still very noticeable.

    Using AA levels higher than 8xAA with the X2 is not recommended, and the QAA modes of NVIDIA should not be enabled, as performance drops below 30fps and gameplay becomes very choppy.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table

    Crysis


    Crytek became famous with their Far Cry first person shooter game, not only for the open-ended gameplay but also because of the stellar system requirements to be able to play the game at high detail. Crysis is their second game and doesn’t disappoint in either gameplay or system requirements.

    Crysis offers several methods to test performance; the game includes two batch files, one geared toward CPU testing, the other toward GPU testing. These two methods provide very repeatable results but unfortunately don’t reflect real gameplay performance, and only give you an indication of how the game will run.

    We used this Crysis benchmark tool, which enables you to define a custom time demo; using the built-in “Assault” run-through we measured performance of both cards. We briefly tested the 64-bit executable included with Crysis when installed on a 64-bit OS, but performance was actually lower with no IQ benefit, so we stuck to the 32-bit .exe for all our tests.

    First with High Quality preset:

    Madshrimps (c)


    As the first reviews of the HD 4870 X2 hit the web we read through most of them and found the performance numbers of Crysis to differ quite a bit between reviews. We did a small report on this here. Now we have tested it for ourselves, and as you can see from the results the choice of OS and Render path is crucial to the outcome of the benchmark.

    With the Geforce GTX 280, going from XP to Vista doesn’t drop performance one bit, but enabling the DX10 render path is fatal as you can see; especially with AA enabled, the FPS drops below playable levels.

    The Radeon HD 4870 X2 fares better, going from XP to Vista with DX9 actually boosts performance by ~30%! Switching to the DX10 render path has minimal impact without AA, and no impact with AA enabled.

    Next up, Very High Quality and 2GB vs 4GB testing:

    Madshrimps (c)


    Neither of these cards is fast enough to run Crysis with the Very High preset at 1920x1200; it’s below-30fps numbers everywhere. The HD 4870 X2 is almost twice as fast when AA is enabled, but still too slow. Adding more system memory (2GB -> 4GB) does nothing for the GTX 280, but with the Radeon we did see a small fps increase.

    For a detailed view of the results, with AA scaling and XP -> Vista scaling, see this table




    You must also have noticed the stellar AA scaling with Crysis; according to the results you could enable 4xAA or 8xAA and not notice a drop in performance… To see whether the benchmark was telling us the truth, we recorded our own manual FRAPS run-through of the same Assault map; do note that a slightly different path was taken. These tests were done with the HD 4870 X2 and 2GB of system RAM:

    Madshrimps (c)


    At the High quality settings the findings are pretty much on par; the difference between 4xAA and 8xAA is indeed negligible. But unlike what the Very High quality results would have you believe, there is a larger performance drop going from 4xAA to 8xAA here. We’ve read some claims of people playing Crysis at 1920x1200 Very High detail with 16xAA enabled at 30fps+ with the Radeon card… but after several weeks of testing we just can’t seem to duplicate this: as soon as AA is enabled, performance drops below 30fps when the Very High preset is used.

    Conclusive Thoughts


    After testing 8 games using a multitude of Anti-Aliasing settings under both XP and Vista 64-bit we can finally make an informed conclusion on gaming with these high end cards.

    Anti Aliasing Performance Scaling

    Putting together all our results for all games tested under both operating systems, we can extract the following findings:

  • Going from 0xAA to 4xAA

    - Geforce GTX 280 under XP: -19.1%
    - Radeon HD 4870 X2 under XP: -23.1%

    - Geforce GTX 280 under Vista: -21%
    - Radeon HD 4870 X2 under Vista: -10.6%

    For both cards under XP the drop is pretty much on par; under Vista the HD 4870 X2 shows a remarkable difference, as performance only drops by ~10% on average!

  • Going from 0xAA to 8xAA

    - Geforce GTX 280 under XP: -26%
    - Radeon HD 4870 X2 under XP: -25.8%

    - Geforce GTX 280 under Vista: -28.7%
    - Radeon HD 4870 X2 under Vista: -15.4%

    When going from 0xAA to 8xAA the HD 4870 X2 is now better in both XP and Vista; under Vista the performance drop is still quite small, only ~15%! The GTX 280 drops by nearly double that.
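    The scaling percentages above are plain averages of the per-game FPS drops. As a minimal sketch of that calculation (the FPS figures below are hypothetical placeholders, not our measured results):

```python
# Average percentage FPS drop when enabling AA, averaged across games.
# The FPS values used below are hypothetical placeholders, not measured data.

def avg_aa_drop(fps_no_aa, fps_with_aa):
    """Mean percentage drop across games, pairing each game's 0xAA
    result with its AA-enabled result."""
    drops = [(no_aa - with_aa) / no_aa * 100.0
             for no_aa, with_aa in zip(fps_no_aa, fps_with_aa)]
    return sum(drops) / len(drops)

# Hypothetical 0xAA and 4xAA averages for three games:
no_aa = [120.0, 90.0, 60.0]
with_4xaa = [100.0, 70.0, 50.0]
print(f"Average 0xAA -> 4xAA drop: {avg_aa_drop(no_aa, with_4xaa):.1f}%")
```

    Note that this averages the per-game percentage drops rather than averaging raw FPS first, so a low-FPS game weighs as much as a high-FPS one.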


  • For those wondering whether their next gaming machine should be running Windows XP or Windows Vista, the following numbers will be most interesting:

    Performance Scaling XP to Vista

    On average with the Geforce GTX 280 you lose 1.6% by switching to Vista 64-bit, so in short: practically the same performance, no real loss!

    The Radeon HD 4870 X2 does even better: if we leave out the numbers of Stalker and Tomb Raider (as CrossFire failed to run properly under XP) we see a +5.8% boost in average FPS going from XP to Vista. If we add the two game titles back in and let them enjoy CF scaling, the number jumps up to +27.7%!
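    The XP -> Vista averages, with and without the CrossFire-broken titles, come down to the same kind of mean. A small sketch (again with made-up per-game numbers, purely for illustration) of how excluding outlier titles shifts the average:

```python
# Mean XP -> Vista FPS change per game, optionally excluding titles
# where CrossFire failed under XP. All numbers here are hypothetical.

def xp_to_vista_gain(results, exclude=()):
    """results maps game name -> (xp_fps, vista_fps); returns the mean
    percentage change over games not listed in `exclude`."""
    gains = [(vista - xp) / xp * 100.0
             for game, (xp, vista) in results.items()
             if game not in exclude]
    return sum(gains) / len(gains)

results = {                      # hypothetical per-game average FPS
    "Game A":  (100.0, 105.0),
    "Game B":  (80.0, 84.0),
    "Outlier": (50.0, 95.0),     # CF broken under XP -> huge Vista gain
}
print(f"All games:      {xp_to_vista_gain(results):+.1f}%")
print(f"Excl. outlier:  {xp_to_vista_gain(results, exclude={'Outlier'}):+.1f}%")
```

    As with our Stalker and Tomb Raider numbers, a single title where CrossFire only works under one OS can dominate the overall average, which is why we report both figures.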


    This is by no means a definitive performance overview, only a snapshot in time using these video cards under XP and Vista. Many of the performance differences can be attributed to driver issues, as both these cards are relatively new on the market.

    From a gamer’s perspective there should be no reason why you wouldn’t want to use Windows Vista 64-bit. While our test setup with 2GB was sufficient to run all current generation games, there’s no doubt that having access to more system memory will prove useful in the future, and with a 64-bit OS you can profit from it.

    We hope this article will be helpful for those looking to upgrade their gaming system with a more capable 3D card! Until next time.


    Update 07-09-2008: Some readers pointed out that NVIDIA uses different AA rendering modes compared to ATI, and for fair comparison’s sake the following needs to be taken into account:

    - NVIDIA 8xAA = 4xMSAA, 16xAA = 4xMSAA (CSAA mode), 16xQAA = 8xMSAA (CSAA mode)
    - ATI 16xAA is a SuperAA mode where each core renders the same frame with a different AA pattern, resulting in superior image quality.
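    For those lining up the numbers themselves, a tiny lookup can normalize NVIDIA’s driver labels to the underlying MSAA sample count before comparing against ATI’s modes. This is only a sketch; the sample counts simply encode the mode list above:

```python
# Map NVIDIA driver AA labels to the effective MSAA sample count,
# following the mode equivalences listed in the update above.
NVIDIA_EFFECTIVE_MSAA = {
    "8xAA": 4,     # 4xMSAA plus extra coverage samples (CSAA)
    "16xAA": 4,    # 4xMSAA in CSAA mode
    "16xQAA": 8,   # 8xMSAA in CSAA mode
}

def comparable(nvidia_label, ati_msaa_samples):
    """True when an NVIDIA driver setting matches an ATI MSAA level
    in underlying sample count."""
    return NVIDIA_EFFECTIVE_MSAA.get(nvidia_label) == ati_msaa_samples

# NVIDIA's 16xQAA lines up with ATI's 8x mode, not its 16x SuperAA mode:
print(comparable("16xQAA", 8))
```

    The point of the lookup is simply that chart labels are marketing names, not sample counts, so an “apples to apples” comparison needs a translation step like this first.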


    This does put the performance of the Geforce GTX 280 in perspective when the high quality AA levels are forced to match ATI's levels, as the GTX 280 is really trailing then. Of course you have to consider how much AA you need for the game to look smooth to you; this changes from person to person.

    We would like to thank those who detailed the NVIDIA & ATI rendering modes, much appreciated!