Leadtek 9600 GT SLI reviewed and compared with 9800 GTX

VGA Reviews by geoffrey @ 2008-04-20

With the Geforce 9600 GT pricing reaching new lows, it becomes interesting to see if pairing up two of them can get you more bang for the buck compared to the more expensive Geforce 9800 GTX. In this review we use two Leadtek 9600 GT Extreme cards and run our battery of game tests to find out what performance you can expect.

Introduction

Introduction

Now that the 9600GT video cards have been out for a while, prices are starting to drop, and we are slowly getting to the point where two 9600GTs come close to the 9800GTX price-wise. The 9600GT is known for packing quite some power, so we wondered whether, at this moment, two 9600GTs stacked in SLI are an attractive solution when it comes down to rendering 3D graphics. Leadtek supported our point of view by providing two of their PX9600GT Extreme samples. Being factory overclocked, they even give you a few percent of extra performance, but is it enough to take on the 9800GTX?




Madshrimps (c)


Leadtek Research Inc. was founded in October 1986 and almost from the outset embarked on pioneering designs for computer graphics products and motherboards. The fledgling company quickly claimed its place in a fiercely competitive market and began delivering copyrighted EGA BIOS and AT motherboards to a loyal clientele, so that by 1990 Leadtek's progress with quality graphics cards had earned global recognition for its popular brand name 'WinFast'. With the PX9600GT Extreme we're looking at the latest product of more than 15 years of experience; let's hope this card can make Leadtek shine once again.

Madshrimps (c)
Leadtek Winfast PX9600GT 512MB Extreme | core: 720MHz | shader: 1750MHz | memory: 900MHz





The WinFast PX9600GT Extreme is a Leadtek product based on NVIDIA's reference design for the 9600GT. The card comes with the NVIDIA reference PCB, but Leadtek factory overclocked it to 720/1750/900MHz, which amounts to a roughly 11% higher GPU core clock and nearly 8% extra shader clock. The 9600GT comes with 64 stream processors, so using two GTs in SLI makes the combo as powerful as a standard 9800GTX, in theory at least:

Madshrimps (c)
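The factory-overclock percentages quoted above are easy to verify; a quick sanity check, assuming NVIDIA's reference 9600GT clocks of 650MHz core, 1625MHz shader and 900MHz memory:

```python
# Verify the "around 11% core, nearly 8% shader" claim against
# assumed NVIDIA reference 9600GT clocks (650/1625/900MHz).
reference = {"core": 650, "shader": 1625, "memory": 900}
leadtek = {"core": 720, "shader": 1750, "memory": 900}

for domain in reference:
    gain = (leadtek[domain] / reference[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
# core: +10.8%
# shader: +7.7%
# memory: +0.0%
```

Note that the memory is left at the reference 900MHz; only core and shader get the Extreme treatment.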


Both solutions have their pros and cons; let's quickly pop open the package and have a look inside the box ->

Inside the box & close-up pictures

Inside the box

Madshrimps (c)


  • Leadtek Winfast PX9600GT 512MB Extreme
  • DVI -> Sub-D VGA adapter
  • DVI -> HDMI adapter
  • Video out -> S-Video & Component adapter cable
  • 6-pin PCIe to molex power cable converter
  • SPDIF connection cable
  • Driver CD and quick install guide
  • PC Game: Overlord

    Specifications

    Madshrimps (c)


  • NVIDIA 65nm G94-A1 core
    - core clock: 720MHz
    - shader clock: 1750MHz
  • Samsung 512MB (8x64MB) GDDR3 with 256-bit interface
    - Memory clock: 900MHz
  • active heatpipe and fan cooled, single slot aluminium and copper solution
  • 2x Dual Link DVI
  • native HDMI support
  • TV-out
  • Integrated HDTV decoder
  • 2-way SLI ready
  • PCI-Express 2.0 16x
  • Dimensions: 23cm x 11cm

Close-up pictures

Leadtek impressed me with their 9600GT: not only did they add extra performance via a reprogrammed, overclocked BIOS, they also made sure their product is fully HDCP and HDMI ready, and as a bonus they even added a full PC game! The card itself looks a bit like the 8800GT, with nearly the same board size and single-slot cooler. A fast and compact video card, it seems; the next best thing in HTPC land?

    Madshrimps (c)


The Leadtek PX9600GT Extreme comes with the basic features found on most of today's NVIDIA video cards; just like Galaxy's card from a previous article, it has dual DVI-I video connectors combined with a 9-pin TV-out connector. The card uses a PCI-Express 2.0 x16 bus interface, and via the SLI connector you can use up to two PX9600GTs in your gaming setup.

    Madshrimps (c)


Of course, the 6-pin power connector is mandatory here too; it is nicely integrated into the full-cover heatsink, which somehow gives this card a retro look.

    Madshrimps (c)


The G94 NVIDIA GPUs are HDMI ready, but most cards only come with dual DVI connectors, so to enable HDMI Leadtek added a free HDMI dongle. The board itself has a small header where the included SPDIF cable can be plugged in; this way digital audio is also available over the HDMI cable, making the Leadtek PX9600GT Extreme fully HDMI compatible.

The back of the board reveals nothing spectacular; the Leadtek 9600GT seems to be based on NVIDIA's reference board design and is powered by a three-phase synchronous DC-DC converter. This VRM, the ADP3208A, can be found at the left side of the picture above and is in fact a 7-bit programmable CPU MOSFET driver, although the A might indicate a version specially designed for video cards. At this moment we're not sure whether the GPU core voltage can be reprogrammed via this advanced VRM, but in theory the VGA BIOS and the ADP3208A could share a serial interface which allows the two to communicate.

    Madshrimps (c)


People swapping heatsinks will have their work cut out for them: the card counts no fewer than 12 screws, and taking those off results in something like this:

    Madshrimps (c)


    Madshrimps (c)
    The memory modules: Leadtek uses the NVIDIA reference 9600GT boards which come with Samsung K4J52324QE-BJ1A GDDR3 chips, also found on Galaxy's card. Maximum rated at 1000MHz at 1,9V, this RAM should yield quite well in our overclocking experiments as NVIDIA actually feeds it with 2V. We found 8 modules each containing 64MB memory space, this results in 512MB for the GPU to store data on, GPU <-> Memory interface is 256 bits wide which is just what this card needs.
Test setup & Benchmark methodology

    Test setup

Geoffrey's Test Setup
Madshrimps (c)
CPU: Intel E6600 @ 3.6GHz
Cooling: Zalman 9700 LED
Mainboard:
  • ASUS P5E
  • MSI P7N SLI Platinum
Memory: 2x1GB TEAMGROUP Xtreem 800MHz 4-4-4-10-35-4-10-10-10-2T
Graphics:
  • Leadtek WinFast PX9600GT Extreme 512MB
  • Gainward Bliss 9800 GTX PCX 512MB
  • Sparkle 8800GTS 512MB
  • Leadtek WinFast PX8800 GT ZL 512MB
  • HIS HD3870 512MB IceQ3 Turbo
  • Sparkle 8800GTS 320MB
  • Galaxy 9600GT 512MB OverClocked Xtreme Tuner
Other:
  • FSP Epsilon 900 PSU
  • Maxtor 80GB PATA HDD
  • Seagate 200GB SATA HDD
  • Antec Nine Hundred housing
  • 20" Dell UltraSharp 2007FP TFT monitor


  • The Intel E6600 was overclocked to 3.6GHz by raising the FSB from 266MHz to 400MHz while keeping the multiplier at its default of 9. This way the CPU scores more or less on par with the highest-clocked Wolfdale CPUs.
  • Teamgroup's pair of 1GB DDR2 sticks were run at default settings on the Intel X38 chipset.
  • ForceWare 169.21 drivers for the GeForce 8800GTS 320MB video card
  • ForceWare 174.12 drivers for the GeForce 8800GT and GeForce 9600GT video cards
  • ForceWare 174.40 drivers for the GeForce 9800GTX video card
  • Catalyst 8.2 drivers for the ATI HD3870 video card
  • While Windows Vista has now officially launched, we decided to test with a mature Windows OS (XP SP2).
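The CPU overclock in the list above is plain FSB-times-multiplier arithmetic, which a one-liner confirms:

```python
# E6600 overclock from the test setup: final clock = FSB x multiplier.
default_fsb, overclocked_fsb, multiplier = 266, 400, 9

stock_ghz = default_fsb * multiplier / 1000        # ~2.4 GHz stock
overclocked_ghz = overclocked_fsb * multiplier / 1000
print(stock_ghz, overclocked_ghz)  # 2.394 3.6
```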

    Benchmark methodology

    The game benchmarks were completed with the Leadtek PX9600GT 512MB, Gainward 9800GTX, Sparkle GeForce 8800 GTS 320MB and Sparkle 8800GTS 512MB at stock speeds, while we clocked the HIS HD3870 512MB IceQ3 Turbo, Galaxy 9600GT 512MB OverClocked Xtreme Tuner and the Leadtek PX8800GT ZL 512MB down to NVIDIA's reference levels.

We tested the Leadtek 9600GT cards on both NVIDIA and Intel chipsets, and it seems the cards come out just a tad faster on the Intel X38 board. To make sure we were comparing the cards at the same performance level, we made sure that ALL 9600GT tests were done on the same platform, and since the MSI P7N SLI Platinum is the only SLI-capable board we have around, it naturally became our preferred board for benchmarking. Nonetheless, we added the results we obtained with the other, non-9600GT video cards.

<!-- -->
All tests were done on a 20" LCD monitor with a maximum resolution of 1600x1200. While 4xAA and 16xAF offered the best in-game graphics, we found that disabling AA didn't decrease the eye candy that much, while the frame rates became far more acceptable. The in-game settings were always maxed out, and we made sure we were using the full games with the latest patches applied in order to get the best up-to-date performance and graphics.

The performance charts below always state which image quality settings were used. FRAPS was used to measure the FPS during repeated manual run-throughs of a given part of each game tested; the minimum, maximum and average values were recorded.
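Reducing such a run-through to the three reported numbers is straightforward; a minimal sketch with made-up per-second FPS samples (not our measurements):

```python
# Hypothetical per-second FPS samples from one manual run-through;
# FRAPS records values like these, from which min/max/avg are taken.
fps_log = [42, 38, 55, 61, 47, 33, 58]

minimum = min(fps_log)
maximum = max(fps_log)
average = sum(fps_log) / len(fps_log)
print(f"min {minimum}, max {maximum}, avg {average:.1f}")
# min 33, max 61, avg 47.7
```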

  • Colin McRae DIRT
  • Crysis
  • The Elder Scrolls Oblivion
  • Rainbow Six: Vegas
  • Bioshock
  • Futuremark 3DMark Series
Synthetic benchmarks: Futuremark 3D Mark

    Futuremark 3D Mark

Madshrimps (c)
Madshrimps (c)


The 3D Mark benchmark applications are among the first stress tests for any incoming review sample. While Futuremark's software may not be representative of gaming performance, it does give an idea of what to expect from a new set of video cards. Besides, a 3D Mark test session requires only two clicks, and after approximately 10 minutes you get a score which indicates the speed of your PC configuration. Do keep in mind that 3D Mark tests the entire hardware setup; the results do not always reflect the power of the graphics card alone.

    Madshrimps (c)


The 9800GTX is the fastest single-GPU solution as we speak, but in 3D Mark 2003 it falls quite a bit behind the dual-GPU solutions; compared to the 9800GTX, the dual 9600GTs score 43% better!

    Madshrimps (c)


In 3D Mark 2006 the Leadtek GeForce 9600GTs stacked in SLI score around 13% higher than the 9800GTX. Do take into consideration that with benchmark scores this high, the CPU can make a noticeable difference in the test runs, since 3D Mark 2006 is very CPU demanding. And where 3D Mark 2003 shows some extreme differences between these two similarly priced solutions, the in-game scores might not be as extreme; we all remember how the 9800GTX has little trouble beating the sub-€200 product segment.

For now you can easily see that SLI helps quite a lot when aiming for top-spot overclocking world records; let's hope the in-game results are as good as their synthetic counterparts. Head on ->

    TES: Oblivion

    TES: Oblivion
    Official website

The Oblivion graphics engine can be very taxing for even the fastest hardware once you increase the level of detail. There is also a difference in performance between running outside, in a city area or inside a dungeon: the outside world is the most taxing for the graphics card, the town areas are noticeably easier on the GPU, and the dungeons are usually the least taxing of all. We tested the heaviest environment with HDR enabled and in-game settings maxed out.

    Madshrimps (c)


In Oblivion the difference between the mainstream 9600GT and the high-end 9800GTX is rather small. Yes, there is an 18% difference in speed, but once you start looking at the price, the 9600GT is a lot more interesting. The same can be said about the SLI setup: the second card did help improve the frame rate, and over the complete test our doubled GTs scored just a tad better than the 9800GTX, but compared to a single 9600GT the SLI setup is far from compelling.

    Madshrimps (c)
    Click to enlarge

    Colin McRae D.I.R.T.

    Colin McRae D.I.R.T.
    Official website

With D.I.R.T., Codemasters has once again updated their popular rally car racing game. This time they've pushed the game past classic point-to-point rallying by adding extreme off-road action on dirty and muddy tracks, plus some fun on gravel circuits. The brand-new engine brings even the latest video cards to their knees; can our round-up competitors offer anything worth talking about?

We moved on from testing the demo to the full game, as the final release seems to have received a minor performance fix for current-generation video cards. We again ran the Avelsbachring racetrack in Germany; this time we could run the game at an acceptable 30 frames per second at 1600x1200. Enjoyable!

    Madshrimps (c)


During previous benchmarking sessions I noticed that, at my preferred image quality settings, the game began to play more fluently on the high-end cards. On paper the 9600GT SLI setup came out fastest, but while actually playing the game I noticed the same issue I had when testing the HD3870X2: micro-stuttering. To me, moving to the doubled GTs felt like I still had only one card installed; the 9800GTX is a much better choice here if you want really fluent gameplay at 1600x1200.

    Madshrimps (c)
    Click to enlarge

    Rainbow Six: Vegas

    Rainbow Six: Vegas
    Official website

This first-person tactical shooter is based on the Unreal 3 game engine; it looks splendid and takes a very high-end video card to run at high resolutions. It makes heavy use of shaders and excels at delivering a very realistic Las Vegas setting, with a series of casinos on the strip that you need to "clean". The Unreal 3 engine pushes graphical effects to the next level but requires a hefty VGA card to deliver playable frame rates at higher detail and resolution.

We went hunting down terrorists; the Calypso Casino level looked to be one of the heavier levels to run through:

    Madshrimps (c)


Vegas is one of the less good-looking games in this round-up, yet it puts enough load on today's mainstream video cards. The 9600GT on its own already offers very fluent gameplay at our selected resolution and image quality settings, but going dual-card makes the setup ready for even higher resolutions and anti-aliasing settings. In our test the dual GTs performed on par with the 9800GTX; it isn't really worth considering the GTs over the GTX here, but if you have already bought one GT, the extra card does give you the possibility to scale up to much better-looking graphics.

    Madshrimps (c)
    click to enlarge

    Bioshock

    Bioshock
    Official website

This latest game based on the Unreal 3 engine has been getting rave reviews for delivering an immersive first-person shooter experience. You crash-land your plane near an underwater city and explore its depths for the next 20 or so gameplay hours. The art style and details really make this "shooter" stand out from the rest, and the gripping story will have you playing it for hours on end.

    Madshrimps (c)


Even though the ATI HD3870 came out surprisingly high here last time, it seems Bioshock likes to treat the green cards well too: going dual-GPU made our 9600GT SLI setup the fastest, leaving the high-end 9800GTX far behind and outperforming it by roughly 25%. Looking at the minimum frame rate the two are much closer, but even then the GTs claim the victory with around 12% higher performance.

    Madshrimps (c)
    click to enlarge

    Crysis

    Crysis
    Official website

It's the year 2019, and a group of researchers suddenly disappears just after discovering something very frightening. Your team is sent to get them back, but instead of a quick run through the jungle, the island turns out to be stacked full of Korean soldiers and alien scumbags. Things get worse when an unknown force seems to be altering the weather system...

From the makers of Far Cry, Crytek now presents Crysis, a first-person shooter based on the second-generation CryEngine. It comes with near-photorealistic graphics, definitely a challenge for even the fastest video cards on today's market. We used the Crysis Benchmark Tool to measure in-game performance; the "Sphere" level was our test of choice:

    Madshrimps (c)


I've spent quite some time exploring the frozen Korean jungles these last few months, and all I can say is that Crysis is one big resource hog; Bioshock plays a lot better on current-generation hardware while still looking good. Nonetheless, we all know Crysis's graphics are sublime, and there are a few graphical improvements which could make it worth an upgrade, but even the blazingly fast 9800GTX struggles to play Crysis fluently at high image quality settings. To my surprise, going for two Leadtek 9600GT cards let me play Crysis at some pretty amazing frame rates which I couldn't reach with a single 9600GT. The dual GPUs even top the 9800GTX in every respect, although there isn't enough headroom left to push the engine settings to a higher resolution. In a pure FPS game the high frame rates certainly come in handy anyway, and with a roughly 50% higher minimum frame rate the SLI setup produces kick-ass results compared with a single 9600GT card.

    Madshrimps (c)

    Overclocking: 3D Mark & in-game

    Overclocking: 3D Mark 2006

Overclocking a video card can be risky, but from my experience it is still a pretty safe way to increase the gaming performance of your brand-new card, unless you start doing volt modifications of course. The overclocking capability of a given video card mostly comes down to GPU design and manufacturing quality; we have seen graphics cards gain less than 10% performance while others boost frame rates by up to 30%! Pushing hardware to the limit to see how fast it can go seems to interest a larger than expected crowd; I reckon some of you are looking at this page just to know how much you can get for free. Here is what I found out about today's tested product.

    Madshrimps (c)


With both our Leadtek cards installed in SLI, we managed to raise the clocks to 770MHz on the core, 1920MHz on the shaders and 1100MHz on the memory. That's still quite a lot, since the Leadtek factory clocks already push the card roughly 10% above the NVIDIA reference core clock.

In the end, both Leadtek 9600GTs are now running at roughly 20% higher clocks than a standard 9600GT, but even then they don't come out as the fastest solution; the HD3870X2 remains a few percent ahead, though the difference is negligible at best. Overclocking the single Leadtek 9600GT brought a much bigger boost: in 3D Mark 2006 we obtained 8% extra performance, and compared with the results of the NVIDIA reference board this means a gain of around 13%.

    Overclocking: Colin McRae D.I.R.T.

The cards had no stability problems over longer periods of intensive gaming; we ran DIRT for hours and hours without a single blue screen or other OC-related system error. Let's have a look at our results:

    Madshrimps (c)


With SLI not scaling that well in most games, you always gain less from overclocking a dual-card system than a single-card one. In-game, overclocking our dual-GPU solution didn't offer any extra rendering performance worth mentioning. With a single card there isn't much to win either when your 9600GT has already been clocked higher by default; nonetheless, compared with the NVIDIA reference-clocked card we gained 5FPS, which is still enough to make the game play more fluently, or look better instead.

    Power consumption & Noise

    Power Consumption

    Madshrimps (c)


The system power consumption was measured using a Brennenstuhl PM 230 electrical energy meter. Strictly speaking the power consumption is not measured but calculated: the AC voltage at the back of the PC's power supply is multiplied by the measured current flowing through the mains cable and by the power factor that arises when capacitive or inductive loads are placed on alternating current (AC). In any case, our device is no professional equipment and our results can be off by 5%, if not more, but at least we are not left guessing the power consumption. Do understand that this is the total system power measured at the back of the PC's power supply; it is not an indication of what to expect from a single computer component. Here is what we got for our tested products:
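The calculation the meter performs boils down to one multiplication; a minimal sketch with made-up illustrative values, not our measurements:

```python
# How an energy meter like the PM 230 arrives at a wattage reading:
# real power = RMS voltage x RMS current x power factor.
# All three values below are hypothetical illustrations.
voltage_rms = 230.0    # V, European mains
current_rms = 1.2      # A, drawn through the mains cable
power_factor = 0.65    # typical for a PSU without active PFC

real_power_w = voltage_rms * current_rms * power_factor
print(f"{real_power_w:.0f} W")  # 179 W
```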

    Madshrimps (c)


The 9600GT is one of the least power-hungry mid-range products as we speak. The 65nm G94 GPU comes with only 64 shaders by default, which certainly plays a role here: the reduced transistor count lowers the power consumption dramatically. However, once we stacked two GTs together we saw the power consumption rise considerably; you get double the pleasure but also double the transistor count, and thus higher power requirements. The two GTs still demand less than the HD3870X2 at full load, but since ATI equipped its cards with PowerPlay technology, the X2 scores a lot better when not being stressed. Do note that for the 9600GT SLI power measurement we had to switch mainboards: for the other cards we used the ASUS P5E (X38), while for the SLI setup we used the MSI P7N SLI Platinum.

    Temperature

    Madshrimps (c)


RivaTuner is the tool every video card geek should have in house: it not only allows raising clock rates, it also gives us the opportunity to log the GPU temperature over longer periods of time. We used RivaTuner to log the GPU temperature while 3D Mark 2005 ran in a loop to stress the video card; here are our results:

    Madshrimps (c)


The good side of going SLI is that you have two separate video cards, so the heat is much more spread out; cooling-wise the 9600GT makes do with only a single-slot cooler. At 64°C the GPU temperature is not to be considered any problem, and even if you do not have the best airflow inside your PC case, the fan will automatically adjust its speed to prevent the card from overheating. We did not notice either card running warmer than the other; probably this is thanks to the MSI mainboard, which leaves quite some space for feeding cool air to each card, and of course the excellent cooling abilities of the Antec 900 housing.
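Extracting the peak from such a temperature log is trivial; a sketch with made-up samples that happen to end at the 64°C quoted above:

```python
# Hypothetical RivaTuner-style log of (seconds elapsed, GPU core temp in °C);
# the values are illustrative, not our actual measurements.
temperature_log = [(0, 41), (60, 55), (120, 62), (180, 64), (240, 63)]

peak = max(temp for _, temp in temperature_log)
print(f"peak GPU temperature: {peak} °C")  # peak GPU temperature: 64 °C
```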

    Noise

    Madshrimps (c)


Noise created by fans and spinning hard disks is not something everyone cares about, though some people are really keen on dead-silent PCs, so we added a noise chart. Noise is measured with a Smart Sensor AR824 digital sound meter 50cm away from our Antec PC housing. We stopped all other fans in order to give an exact representation of the noise created solely by the tested video cards.

    Madshrimps (c)


Out of the box the cards were never noisy; they were noticeable under heavy 3D load, but with high-end CPUs, mainboards and other high-power components around, the noise created by the video card(s) alone might not even be noticed. In the end we're talking about high-end hardware, where performance comes before noise and power consumption for most people. If you are interested in using only one Leadtek 9600GT, I'd say there are probably better solutions if you are serious about low-noise computing; we may come back to this in future articles/round-ups.

    Pricing & Conclusive thoughts

    Pricing

In retail stores we found the 9600GT priced between €120 and €160, while the reference-clocked 9800GTX can be found somewhere between €240 and €280. Since the Leadtek cards are Extreme versions with increased performance by default, expect to spend another €10~€20 over the reference-clocked samples; you can find Leadtek at most local (e)retailers.
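Using the price ranges just quoted, a quick calculation shows how close doubling up on 9600GTs lands to 9800GTX money:

```python
# Price ranges from the text (EUR): does doubling the 9600GT land near 9800GTX money?
gt_price = (120, 160)        # reference-clocked 9600GT range
extreme_premium = (10, 20)   # Leadtek Extreme surcharge over reference
gtx_price = (240, 280)       # reference-clocked 9800GTX range

sli_range = (2 * (gt_price[0] + extreme_premium[0]),
             2 * (gt_price[1] + extreme_premium[1]))
print(sli_range, gtx_price)  # (260, 360) (240, 280)
```

The two ranges overlap at the low end, which is exactly why the SLI-versus-GTX question is worth asking at all.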

    Conclusive thoughts

Before we let you go, let's have a look at what we at Madshrimps have been testing lately. Here is a performance overview chart:

    Madshrimps (c)


After pushing the video cards through multiple tests, we can conclude that the Leadtek 9600GT is always a tad faster than the reference 9600GT. The Extreme clock speeds put the Leadtek card roughly 7% in front of the reference board; unfortunately this doesn't bring the extra power needed for improved image quality settings. At best you may get a one-step-higher AA or AF filtering level.

We found the card not too noisy for our gaming purposes; the fan does speed up when you load the video card over longer periods, but since it auto-adjusts, it will only get noisy when there is insufficient case cooling. The GPU core temperature remained low; with temperatures topping out at 65°C there is nothing to worry about, not even when using two cards stacked in SLI.

Talking about SLI, we tested two Leadtek 9600GT Extreme samples and compared the results with the 9800GTX reference board. Performance-wise, SLI can clearly give you quite some extra performance, up to roughly 70% more FPS! Here and there you'll stumble upon games with poorer support for NVIDIA's multi-GPU technology, where you'll have to make do with around 30% extra performance. On the other hand, even though the 9800GTX has only one GPU and thus probably fewer configuration problems, it never took the lead in average frame rate over the Leadtek-branded dual 9600GTs. Then again, in CM DIRT for example, the GTs had to deal with the phenomenon called micro-stuttering, which does not occur with single-GPU configurations.

As for games we didn't test, I do not believe NVIDIA has fixed all of its multi-GPU issues, so the doubled-GT setup might not yield any higher performance in unsupported games. Spending double the money does not yield double the performance, but with high-end hardware did we ever expect any different? Look at the 9800GTX: it doesn't offer double the frame rates compared to the 9600GT either. When you go for the 9800GTX, though, you are almost guaranteed your performance boost, while with SLI the system performance depends far more on SLI actually functioning properly.

    Madshrimps (c)


Overclocking-wise we saw that Leadtek left quite some margin for the end user to play with; with one manually overclocked 9600GT, expect to gain around 7% extra, altogether making your card fast enough for higher image quality settings compared to the reference design. Then again, if you are thinking about overclocking your video card, you could just as well go for the reference-clocked cards, since these usually come cheaper. In SLI the margins are much smaller; it is not worth your precious time because there is hardly anything to gain.

Two cards stacked together do demand some extra electrical current; if you're limited by your PSU, the 9800GTX might be a better choice. Heat is no issue: the Leadtek 9600GT heatsink has no problem dealing with the output of the G94 core, not even in SLI. With two cards installed you might expect one card to run slightly warmer than the other, but in my testing the mainboard provided a lot of breathing room for the cards, and together with the cooling abilities of my Antec 900 housing I did not notice either card running warmer than the other. The cards are not too noisy either; they don't offer a superbly silent computing experience, but the auto-adjusting fan keeps them rather quiet in most classic PC cases.

In the end, price-wise you get slightly higher performance when purchasing a Leadtek 9600GT Extreme. You get the full game Overlord as an extra, which adds to the value of the product, if you do not already own the game that is. Whether you should go SLI is another question: the two Leadtek Extreme samples are factory overclocked, but since overclocking doesn't help much in SLI, it's pretty safe to say that a 9600GT SLI setup will most likely perform on par with the 9800GTX, sometimes even outperforming it. Doubling up on 9600GTs puts you around the 9800GTX's price; look around in your favorite online store or retailer to see for yourself. The SLI setup doesn't guarantee the extra performance though, where the 9800GTX should have no issues (in theory), so spend your money wisely.

    Leadtek PX9600GT 512MB Extreme:

    + Slightly higher performance
    + Acceptable noise level when being stressed
    + Overclocking headroom
    + HDMI dongle and SP/DIF audio cable
    + PC-game Overlord
+ Low power consumption
+ Average price but strong price/performance ratio

    - Does not add much value for money compared to reference 9600 GT

    Leadtek PX9600GT 512MB Extreme in SLI:

+ High performance, mostly better than 9800GTX
    + Acceptable noise level when being stressed
    + Overclocking headroom
    + HDMI dongle and SP/DIF audio cable
    + PC-game Overlord
+ Acceptably priced compared with 9800GTX

    - SLI functionality not guaranteed in all available games
    - Power consumption
    - Negligible performance gain when overclocked
    - Requires SLI compatible mainboard

    Madshrimps (c)


I would like to thank Angela from Leadtek for giving us the opportunity to test the Leadtek PX9600GT 512MB Extreme. Until next time, cheers!

    Madshrimps (c)