ATI HD4870X2 vs NVIDIA GTX 280 - High-End VGA Comparison

VGA Reviews by jmke @ 2008-08-12

ATI launches their new high-end VGA card today. In this review the HD4870X2 is compared to NVIDIA's top offering in 8 different games to see which one comes out on top. Read on to find out!

Introduction


Anybody not living under a rock has heard of the ATI HD4xxx series by now. Officially launched less than two months ago, it is already topping the sales charts and is recommended by enthusiasts and newcomers alike when building or upgrading a PC.

ATI kept pricing amazingly low for their latest video cards, which gave them a significant price/performance edge over the competition. The HD4850 can now be found for €130~150, while the HD4870 goes for slightly more than €200. The HD4850 matches NVIDIA's previous high-end card, the 9800 GTX, in performance, while the HD4870 challenges the new NVIDIA GTX 260. This leaves the NVIDIA GTX 280 alone at the top as the fastest single GPU video card solution… until today.

We already got a preview of things to come when we saw the HD4xxx cards in Crossfire action, delivering performance on par with or higher than the GTX 280. Last generation, ATI produced a dual-GPU solution on one card, the HD3870X2, which provided quite a good price/performance balance in the high-end GPU bracket. Today they release its successor: the ATI HD4870X2.

[photo]


What we are looking at is basically two HD4870s on one PCB, with a beefy cooler to keep the two 55nm GPUs from overheating. Specification-wise it's quite easy: take the HD4870 specs and double them:

  • Transistor Count: 956 million * 2
  • Core Clock: 750MHz
  • Memory Clock: 900MHz GDDR5
  • Stream Processors: 800 * 2
  • Texture Units: 40 * 2
  • ROPs: 16 * 2
  • Memory Bus: 256-bit * 2
  • Memory Size: 1GB * 2

The card is PCIe 2.0 compliant and requires extra power from the power supply through one 6-pin and one 8-pin connector. There will be several launch partners providing end-users with an HD4870X2 product. As with the HD3870X2, the differences between manufacturers will be small, as the cooling design and PCB are quite complex and providing custom cooling is far from straightforward.

AMD sent us a Sapphire HD4870X2 2GB GDDR5 sample to evaluate, and that is what we'll do. Of course Sapphire is not the only company that will sell HD4870X2-based products; below you'll find box shots of several other launch partners, but not all:

[box shots]


Let's take a closer look at the card.

    Sapphire HD4870X2

Sapphire is one of ATI's launch partners; their product follows the reference design, but they add a few extra software titles to increase the total value. The HD4870X2 is as multimedia-capable as a single HD4870, and to help you on your way you get a full version of Cyberlink PowerDVD 7 and their DVD creation suite.

Inside the package we also found 3DMark Vantage with a Professional License. Certainly a fine technology demo to show off the power of your VGA card, but not very interactive.


[photos]


FYI: inside this media sample box we didn't find an HDTV cable or DVI-to-D-Sub converter, but these items are listed in the official box contents.

[photos]


Last-minute addition: a nice exploded shot of the MSI HD4870X2:

[photo]


    Leadtek Winfast Geforce GTX 280

ATI releases the HD4870X2 to take on NVIDIA's fastest GPU to date, the GTX 280. Leadtek was kind enough to send in a sample of their high-end product, the Winfast GTX 280.

Leadtek follows the reference NVIDIA design with their GTX 280 unit; the card is as long as the HD4870X2 and also takes up two slots. The heatsink shroud does increase the total dimensions of the card, so make sure your case has room.

The Geforce GTX 280 we're testing is built on the 65nm fabrication process; there are rumors floating around that a revised 55nm-based product will be released in the coming months, but for now we'll have to settle for this one.

  • Transistor Count: 1400 million
  • Core Clock: 602MHz
  • Shader Clock: 1296MHz
  • Memory Clock: 1107MHz GDDR3
  • Stream Processors: 240
  • Texture Units: 80
  • ROPs: 32
  • Memory Bus: 512-bit
  • Memory Size: 1GB

The specifications of the GTX 280 put it on par with the HD4870X2: the same total number of texture units and ROPs, a lower core clock and GDDR3 instead of GDDR5, but a dedicated 512-bit memory bus. All in all a worthy successor to the popular G80-based 8800 GTX high-end card.
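For some perspective on those memory subsystems, here is a back-of-the-envelope sketch of the peak theoretical bandwidth each configuration works out to (the per-pin transfer rates, two per clock for GDDR3 and four per reference clock for GDDR5, are general knowledge rather than figures from the spec sheets above):

```python
# Peak theoretical memory bandwidth from the spec lists above.
# GDDR3 transfers 2 bits per pin per clock; GDDR5 transfers 4 bits
# per pin per cycle of its reference clock.

def bandwidth_gb_s(mem_clock_mhz: float, transfers_per_clock: int,
                   bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * transfers_per_clock * bus_width_bits / 8 / 1e9

print(bandwidth_gb_s(1107, 2, 512))  # Geforce GTX 280: ~141.7 GB/s
print(bandwidth_gb_s(900, 4, 256))   # HD4870 (one GPU): ~115.2 GB/s
```

So despite the narrower 256-bit bus per GPU, GDDR5 puts each RV770 in the same bandwidth ballpark as the GTX 280's 512-bit GDDR3.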

[photos]


Inside the package Leadtek provides a driver CD and… there's also an HDTV-out cable and…

Your power supply does not need to be a 1000W unit if you only plan to run one of these cards, but you do need a spare 6-pin and 8-pin connector to power it up.

[photos]


    Test Setup & Benchmarks


We built our test setup with the help of Tones.be (Belgium's Largest Hardware Shop!), who helped us with the hard drives, CPUs and monitors; MSI for the motherboards; OCZ for the memory; Coolermaster for the cases and power supplies; and last but not least Scythe for the silent CPU coolers.

We aim to build a VGA test setup we can keep using for some time to come. To have some performance numbers for our readers today, we quickly configured the system with the parts we had at hand.

[photo]


Intel Test Setup
CPU: Intel Core 2 E8200 @ 2.66GHz
Cooling: Scythe Ninja 2
Mainboard: MSI P45 Platinum
Memory: 2 * OCZ 1GB PC2-6400
Other:
  • Coolermaster CM690 enclosure (3 * 120mm case fans)
  • Coolermaster UCP 900W power supply
  • Western Digital 80GB HDD (system)
  • Samsung 640GB HDD (data)


At the time of writing, the system we built would cost you approximately €1200 without the VGA card. While it's not a budget system, it's also far from high-end, as we're using a DDR2 motherboard and a mid-range Wolfdale CPU. Combining it with a €300+ VGA card does place it in the more expensive bracket when it comes down to building a gaming machine.

The monitor is certainly one of the bigger costs of a system; the price mentioned above includes this screen, a Samsung SyncMaster 2493HM 24". It has a native resolution of 1920x1200 and a quite low 5ms response time. Again, this screen is mid-range, as more expensive models are available, but the resolution of most 26~27" screens remains the same 1920x1200. You need to invest in a 30" screen to go higher, to 2560x1600, at which point you will be spending a pretty hefty sum.
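To quantify what those resolutions mean for the video card, a quick pixel-count comparison (simple arithmetic, nothing vendor-specific):

```python
# Pixels the GPU has to render per frame at each tested resolution.
for w, h in [(1600, 1200), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: {w * h / 1e6:.2f} million pixels")

# The step from a 24" to a 30" screen nearly doubles the per-frame workload:
print(f"{2560 * 1600 / (1920 * 1200):.2f}x")  # ~1.78x the pixels of 1920x1200
```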

The latest official Forceware drivers were used. There are PhysX-enabled beta drivers available for NVIDIA cards which enable PhysX acceleration in games using this technology. Upcoming games will show a noticeable difference; for now you need special maps (UT3) or special games (Cellshock) to see the PhysX effects.

The ATI Catalyst driver used is 8.52.2, which will later be released as Catalyst 8.8.

We used a clean install of Windows XP with SP3. Vista benchmark results will be added at a later date.

  • OS: Windows XP SP3
  • NVIDIA Drivers: Forceware 177.41
  • ATI Drivers: Catalyst 8.8 (8.52.2)


In our tests today we aim to find out how the ATI HD4870X2 and Geforce GTX 280 perform at 1920x1200 for those with 24~27" screens, as well as at 1600x1200 for owners of the now very affordable 20~22" monitors.

If you are looking to buy a top-end graphics card, chances are you don't want to settle for anything less than the highest detail in-game. All the games we tested had the highest quality settings enabled in-game for textures and world detail. Anti-aliasing and anisotropic filtering were changed depending on the game; for these settings please see the detailed results charts on the following pages.

    These are the games we tested:

  • Crysis (HardwareOC – Custom Benchmark)
  • Quake 4 (Manual FRAPS)
  • Unreal Tournament 3 (Manual FRAPS)
  • World in Conflict (Built-in Benchmark)
  • Race Driver: GRID (Manual FRAPS)
  • Supreme Commander: Forged Alliance (Built-in Benchmark)
  • The Elder Scrolls IV: Oblivion (Manual FRAPS)
  • Trackmania Nations (Built-in Benchmark)

In the future we'll add more game tests; for now this performance test with the games listed above should give you an idea of what to expect.

    Crysis

Crytek became famous with their Far Cry first-person shooter, not only for its open-ended gameplay but also for the steep system requirements needed to play the game at high detail. Crysis is their second game and doesn't disappoint in either gameplay or system requirements.

Crysis offers several methods to test performance. The game includes two batch files, one geared toward CPU testing, the other toward GPU testing. These two methods provide very repeatable results but unfortunately don't reflect real gameplay performance; they only give you an indication of how the game will run.

[screenshot]


Then there's the custom timedemo option: after installing the Crysis Editor on your system you can load up any of the maps, jump into the map, open the console, type "record demoname" and record your own gameplay demo. There are tools like HardwareOC's which allow you to measure min/avg/max FPS while running through your custom demo.

The third option is using FRAPS to record the FPS while you play through a section of the game repeatedly, to get an idea of how the game performs.
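Whichever FRAPS-based method is used, reducing the recorded samples to the min/avg figures shown in our charts is plain number-crunching; a minimal sketch (the sample values below are made up for illustration, not real measurements):

```python
# Reduce per-second FPS samples from several manual FRAPS runs to the
# min/avg figures used in the charts.
from statistics import mean

def summarize(runs):
    """runs: one list of per-second FPS samples per playthrough."""
    # Average the per-run minima and means so a single outlier run
    # doesn't dominate the reported numbers.
    return mean(min(r) for r in runs), mean(mean(r) for r in runs)

run1 = [58, 61, 47, 55, 63]  # illustrative samples only
run2 = [60, 59, 49, 54, 66]
lo, avg = summarize([run1, run2])
print(f"min: {lo:.1f} fps, avg: {avg:.1f} fps")
```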

We chose the second method for its repeatability. We tried a few custom demos on one of the island levels, but the most taxing level seemed to be the one where you're inside the alien spacecraft. This demo was used to test the VGA cards today. The graphics were set to "High" and the benchmark started:

[chart]


At 1600x1200 both cards offer excellent frame rates; you can definitely enjoy Crysis at these quality settings and resolution.

[chart]


At 1920x1200 we see a ~5fps drop for both cards, but the numbers are still acceptably high. We did notice a lower min FPS on the HD4870X2; suspecting this might be due to micro-stuttering, we manually measured the FPS with FRAPS in-game in the same alien spaceship level, but we didn't notice any stuttering. The drop to 23 fps occurred early in the level.

When we finished these tests we repeated them with anti-aliasing (4xAA), but ran into some performance and settings issues which we'll have to spend more time on before we can share our findings; these issues may be driver-related, as Catalyst 8.8 is still beta.

    Quake 4


id Software is known for their excellent 3D engines; the current id Tech 4 is used in DOOM 3, Quake 4, Prey and Enemy Territory: QUAKE Wars. These games include high-resolution textures and still run fluently at the highest resolutions on recent video cards.

    With the extra power of the HD4870X2 and GTX 280 we can enable high AA/AF levels and still push enough frames.

By default the id Tech 4 engine caps the maximum framerate at 60 FPS. Both cards hit this limit immediately. By entering com_fixedtic "1" in the console we removed the FPS cap and ran through a section of the game, using FRAPS to record min/avg frame rates.

[charts]


As you can see from the results above, even with 4xAA/16xAF enabled and Quake 4's highest in-game quality settings we were not limited by GPU power. The small performance drop going from 1600x1200 to 1920x1200 points to a CPU bottleneck rather than a GPU limit.

    Unreal Tournament 3


Epic Games had quite a bit of success with their Unreal Engine; at first competing with the id Tech engine for licensing, they now seem to have pulled ahead with the latest incarnation: Unreal Engine 3. The list of games using this engine is huge, with blockbuster titles like Bioshock, Mass Effect, Gears of War, Rainbow Six Vegas (1 & 2) and of course the next iteration of the UT series, UT3.

While many games share the same Unreal Engine 3, developers can decide how high the system requirements will be by increasing the level of detail. Rainbow Six Vegas, for example, is known for being more demanding than Bioshock on the same hardware. Unreal Tournament 3 by Epic Games provides an amazing balance between image quality and performance, rendering beautiful scenes even on lower-end hardware; on high-end graphics cards you can really turn up the detail, which makes it picture-perfect.

We used HardwareOC's UT3 benchmark tool to do a fly-by of the Deimos level, but it reported very high frame rates which did not match in-game performance at all. So we switched to a manual FRAPS session of a 5-minute deathmatch on Deimos, averaging the results of several sessions per setting.

[screenshot]


The Unreal Engine 3 also has an FPS cap; to unlock it you need to edit "BaseEngine.ini", located under \Engine\Config. Find "bSmoothFrameRate" and change the value to "False".
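If you would rather not edit the file by hand, the tweak is easy to script; a sketch in Python, where the install path is an assumption you should adjust to your own setup:

```python
# Flip the UE3 frame-rate smoothing flag in BaseEngine.ini.
import re
from pathlib import Path

# Hypothetical install location; point this at your UT3 directory.
ini = Path(r"C:\Program Files\Unreal Tournament 3\Engine\Config\BaseEngine.ini")
text = ini.read_text()
# Case-insensitive match, since UE3 ini files mix True/TRUE spellings.
text = re.sub(r"(?im)^(bSmoothFrameRate=)true", r"\1False", text)
ini.write_text(text)
print("frame-rate smoothing disabled")
```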

[charts]


At 1600x1200 we get very high frame rates, with the minimum never dropping below 60 FPS.

[charts]


At 1920x1200 the HD4870X2 has a very small lead; the min FPS is now slightly lower, but you'll still get very fluid frame rates. It seems we are CPU-limited again here, though. Is there another game out there besides Crysis which can stress these cards?

    World in Conflict


World in Conflict (also known as WiC or WIC) is a real-time tactical video game developed by the Swedish video game company Massive Entertainment and published by Sierra Entertainment for Windows PC. The game is set in 1989 during the social, political, and economic collapse of the Soviet Union. However, the title postulates an alternate-history scenario where the Soviet Union pursued a course of war to remain in power. Generally considered a real-time strategy (RTS) game, World in Conflict includes gameplay typical of real-time tactical (RTT) games.


The game engine of WiC is quite detailed for the scale it provides; while you can reduce the graphics settings to allow playable FPS on mid-range video cards, you can also set everything to maximum and see all the splendor of warfare rendered by high-end GPUs. While this game does require a nice graphics card to get the most out of it, the CPU is also a very deciding factor when it comes down to performance, as you'll see.

[screenshot]


We used the game's built-in performance test for these tests:

[charts]


Even with 8xAA and 16xAF the game is not taxing the graphics cards much; we are yet again CPU-limited.

    Race Driver: GRID


This game is made by Codemasters, the creators of the TOCA/Race Driver series. GRID is a hybrid between arcade racer and simulator, mainly on tarmac, featuring 43 cars. There are several types of competitions for different cars: GT races, open-wheel races, demolition derbies, etc. There are also several tracks from different countries, including Japanese, United States and European tracks such as Le Mans and Spa-Francorchamps. A track through the streets of Milan is also available.


The game runs on a modified in-house engine previously used for Colin McRae: DiRT, and the visual similarities between the two games are noticeable. GRID adds a much more advanced damage model and several other improvements, which makes it one of the best-looking racing games on PC.

[screenshot]


Its heavy use of shaders makes it very demanding for today's video cards; at the lowest graphics settings it's definitely far from good-looking, but it does run on a Geforce 8500 GT, albeit not at high resolutions. With the HD4870X2 and GTX 280 we had no such problems: detail was cranked up to the maximum and the AA/AF settings tweaked.

We raced through the Okutama track several times with a Toyota Supra and recorded min/avg frame rates. The averaged results from several runs are included in the charts below.

[charts]


In the performance previews released last month we saw some promising numbers from GRID, and in our tests today we can confirm them. At 1600x1200 without AA both cards perform excellently; when adding 4xAA to the mix the GTX 280 takes a ~17% performance hit, the HD4870X2 only ~6%.

[charts]


At 1920x1200 the results are quite similar to those at 1600x1200. Whereas consoles like the Xbox 360 and PS3 have a hard time keeping the game at a steady 30fps at 1280x720, these cards run at 60fps in the worst-case scenario at 1920x1200 with AA/AF enabled!

    Supreme Commander: Forged Alliance


Supreme Commander: Forged Alliance is a standalone real-time strategy expansion to Supreme Commander, released in November 2007, developed by Gas Powered Games and published by THQ; it is the second title in the franchise. Forged Alliance adds new gameplay features, several new units for the three preexisting factions, and is further optimized for increased performance.


This large-scale RTS can really tax even the highest-end systems when you push the unit count into the thousands; while limited mostly by CPU speed (and showing love to quad cores), it can definitely use a nice graphics card to render everything at high detail, at high resolution, and even across two screens, as the game supports dual monitors!

[screenshot]


We ran our performance test using the built-in benchmark, which can be accessed by adding "/map Perftest" to the game's shortcut.

[charts]


Supreme Commander is one of the only games which can truly benefit from a quad-core CPU, and it shows in this benchmark. Both VGA cards are fast enough to render the graphics, but the CPU has a lot more work to do here. Adding a second screen to the mix will definitely up the ante for the system requirements, but these cards did not even blink going from 1600x1200 to 1920x1200 with 4xAA/16xAF enabled.

    The Elder Scrolls IV: Oblivion


    The Elder Scrolls IV: Oblivion, or simply Oblivion, is a single player fantasy-themed action-oriented computer role-playing game developed by Bethesda Game Studios and published by Bethesda Softworks/ZeniMax Media and the Take-Two Interactive subsidiary 2K Games.
    Oblivion's story focuses on a former prisoner drawn into a Daedric Lord's plan to invade the mortal realm of Tamriel. Gates to the hellish realm of Oblivion are opened, through which many daedra flow. The game continues the open-ended tradition of previous Elder Scrolls games, allowing the player to travel anywhere in the game world at any time, including the option to ignore or postpone the main storyline indefinitely.


When Oblivion was released in spring 2006 we found a new system benchmark; the expansive game world of Oblivion required a hefty configuration if you wanted to play at the highest detail level. Not only did it tax the video card, the CPU was quite important too.

[screenshot]


We had to wait until the end of 2006 for a single-VGA solution, the Geforce 8800 GTX, that could run the game at 1600x1200 with AA enabled at acceptable frame rates. We are now two and a half years later; let's see how today's high-end products handle this demanding game. Using FRAPS we chose an outdoor scene and walked a path several times to get repeatable results.

[charts]


The HD4870X2 has a very small lead in Oblivion; both cards provide excellent frame rates, with min FPS at the 60 mark. It can't get much better.

    Trackmania Nations


    TrackMania is a series of arcade racing games for Windows, which include stunting with cars, track-building, elements from puzzle games, as well as elements testing the players' dexterity. It was developed by the French team Nadeo for the PC. Instead of following the usual trend of choosing a set car and track for playing in the game, the TrackMania games allow the player to create their own tracks using a "Block" process in the spirit of the 1985 game Racing Destruction Set and the 1990 Brøderbund release, Stunts.

    In contrast with most other racing games, the TrackMania series lets the player race a track as many times as he/she wants, until time runs out. He/she can choose to respawn at any time possible, due to landing upside down, going off the track, or even just because the start did not go optimally. Although in multiplayer games multiple cars race on the same track, they cannot actually collide or otherwise influence each other.


The ugly duckling in our games line-up: this is a free game (go download it now!) based on stunt racing with fast cars. Nothing realistic about this game, just pure arcade fun.

[screenshot]


The game engine is very scalable: you can play it with entry-level cards or even an onboard IGP, but if you turn up all the detail it will provide a challenge for higher-end cards. So will Crossfire scale well in this game with the HD4870X2, or will the single GPU of the GTX 280 prevail? Let's find out:

[charts]


Let's get this out of the way first: Trackmania Nations is sponsored by NVIDIA. Looking at the benchmark results it's hard to ignore the fact that this game really likes Geforce cards. The HD4870X2 still performs admirably, min FPS dropping to ~37 at 1920x1200 with 4xAA/16xAF but never causing any serious issues. It's actually quite surprising to see this freeware game be so demanding.

    CPU Scaling


The benchmarks on the previous pages were run with the Core 2 Duo E8200 at default speed. Clocked at 2.66GHz, this dual-core Intel CPU with 6MB of L2 cache is far from slow, but if you are going to spend €300+ on a video card, you shouldn't skimp on the CPU. By today's standards the E8200 can be considered a mid-range CPU.

Luckily for us the Core 2 Duo series scales excellently and overclocking is almost too easy, but we're not complaining. We bumped the CPU speed up to 3120MHz in step 1 and 3520MHz in step 2 and repeated several of the benchmarks. That's a maximum boost of 32% in CPU clock speed. In the charts below we quickly comment on the % boost going from 2.66GHz -> 3.52GHz.

    The outcome is almost predictable…

[chart]
  • ATI HD4870X2: 22.5%
  • NVIDIA GTX 280: 14.5%





[chart]
  • ATI HD4870X2: 11.4%
  • NVIDIA GTX 280: 14.3%





HOCBench fly-by of the Serenity map:
[chart]
  • ATI HD4870X2: 10.2%
  • NVIDIA GTX 280: 2.2%





We kept the most exciting result for last:

[chart]
  • ATI HD4870X2: 2.1%
  • NVIDIA GTX 280: 2.4%


As you can see, extra CPU power increases performance in almost all games, although the improvement is not 1:1 with the CPU clock increase. The ATI HD4870X2 shows promising scaling results in Trackmania and UT3; Supreme Commander favors NVIDIA, while Crysis is definitely not CPU-limited.

For the HD4870X2 to catch up with the GTX 280 in TrackMania you would need a ~10GHz Core 2 Duo… maybe with Intel's Core i7 (Nehalem) in September we might see a different outcome.
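A handy way to read the percentages above is as scaling efficiency: what fraction of the 32% clock boost actually shows up as extra FPS. A quick calculation from the reported numbers:

```python
# FPS gain per unit of CPU-clock gain, from the charts above.
clock_gain = (3520 - 2660) / 2660  # ~32% CPU clock boost

fps_gains = {  # (HD4870X2, GTX 280) FPS gains per chart
    "chart 1":  (0.225, 0.145),
    "chart 2":  (0.114, 0.143),
    "HOCBench": (0.102, 0.022),
    "chart 4":  (0.021, 0.024),
}
for name, (ati, nv) in fps_gains.items():
    print(f"{name}: HD4870X2 {ati / clock_gain:.0%}, "
          f"GTX 280 {nv / clock_gain:.0%} of the clock gain realised")
```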


    Power Usage, Noise, Temperatures

As we come to the end of our evaluation, let's take a look at some non-gaming numbers. The HD4870 is built on a 55nm process and is quite power efficient; however, when you put two RV770 GPUs on one PCB (creating the R700), power efficiency goes down the drain, since Crossfire (or SLI) doesn't offer 100% performance scaling. The GTX 280 is built on a 65nm process and is quite power hungry, even surpassing the power usage of a 9800GX2.

[chart]


With the system at idle both cards are closely matched, but under load the difference is almost 100W! While a single HD4870 might do well against the GTX 280, an HD4870X2 is definitely more power hungry; if you ever plan to run two of them in Crossfire, essentially creating a quad-GPU setup, make sure your power supply is of good quality. A single Geforce GTX 280 is not a light consumer by any means, but in the performance/watt ranking it does beat the HD4870X2 here.
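The performance/watt ranking mentioned above boils down to a simple ratio; a sketch with placeholder numbers, where the only grounded figure is the roughly 100W load difference from the chart above:

```python
# Performance per watt; the FPS and wattage values are illustrative
# assumptions, not measurements from this review.
def fps_per_watt(avg_fps: float, system_load_watts: float) -> float:
    return avg_fps / system_load_watts

print(fps_per_watt(60, 300))  # GTX 280 system (assumed):  0.200 fps/W
print(fps_per_watt(68, 400))  # HD4870X2 system (assumed): 0.170 fps/W
```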

[chart]


Both cards are equipped with temperature-controlled fans. Our room temperature of 24°C is neither exotic nor freezing; when installed inside the Coolermaster CM690 case (which is equipped with 3 case fans) the cards quickly reached their "trigger" temperature, which sets the onboard fan in motion and gradually increases the fan speed as the temperature continues to rise.

Both cards stopped at 82~83°C while the fans ramped up to quite noisy speeds; the ATI HD4870X2 is without a doubt the noisier of the two, and while the NVIDIA GTX 280 does better, it is also clearly audible. If you want a high-end system which runs cool and quiet you'll have to invest in custom cooling. The Geforce GTX 280 will be easier to equip with a 3rd-party cooler thanks to its simpler design.

We also noticed with both cards that power usage increased as the temperature got higher. Colleague reviewer Massman, who has been testing the ATI Radeon HD4850, experienced a similar outcome: as soon as he increased the cooling on the HD4850, the power usage decreased.


    Conclusion

Last week a few e-tailers were a bit too eager to announce the upcoming HD4870X2 and posted prices; the Sapphire HD4870X2 was spotted at €433 and other listings quickly followed, going as low as €395. The Leadtek Winfast GTX 280 is available starting at €338, and cards from other manufacturers like MSI and Asus sell at similar prices.

So we're talking about a ~€100 difference in the worst-case scenario, in favour of the Geforce GTX 280. Is this important? Let's take a step back and consider who these products are meant for.

High-end gamers. How money-conscious are high-end gamers? If you are a gamer with a 20~22" TFT screen you don't need either of the cards tested today, as you'll be better off with a single HD4870 or GTX 260, or even an HD4850 if you want better bang for the buck.

To warrant a Geforce GTX 280 or Radeon HD4870X2 you should make sure you have a 24" or larger monitor and a dual-core CPU running at 3GHz at least (and don't be afraid to overclock). At that point you have already invested quite a bit of money, so €100 more or less doesn't really matter if your aim is to build a high-end gaming rig.

So get the fastest that is available to you. In a single-card configuration that is the ATI Radeon HD4870X2. If you want more still, two Geforce GTX 280s in SLI or two ATI Radeon HD4870X2s in CrossFireX should please you plenty… but when you have such a budget available just for graphics cards, it won't come down to pricing, but to scalability, raw performance and usability. The Geforce GTX 280 uses less power, generates less heat, is less noisy, and offers the benefit of higher single-GPU performance in case multi-GPU scaling fails. The ATI Radeon HD4870X2 definitely offers the performance edge when everything scales well.

    Update: Vendor Comparison Chart! (source)





    Thank you for reading and stay tuned as we have a lot more ATI and NVIDIA product coverage coming up!



We would like to thank Steeve from AMD and Selene from Leadtek for allowing us to test their latest products.