Sparkle 9800 GTX+ With Custom PCB and Cooling Review

VGA Reviews by jmke @ 2008-12-03

In time for the holiday shopping spree, Sparkle launches a custom Geforce 9800 GTX+ which features onboard HDMI and silent GPU cooling. Can it run the latest games fluently? We test eleven of them to find out.

Introduction & Specs

Introduction

With the winter holiday season approaching, people are looking for new products to put under the Christmas tree. With the economic recession in full force, a video card which costs less than €200 might be a more feasible buy this year. In this review we take a closer look at a custom-designed Geforce 9800 GTX+ from Sparkle.

Madshrimps (c)


Specifications

We have tested a Geforce 9800 GTX from Gainward in the past here; the revised GTX+ increases the shader and GPU clocks to give the card a fighting chance below the €200 price point.

The specs of the original Geforce 9800 GTX and the older 8800 GTX are compared below:

Madshrimps (c)


And the new kid on the block: the Sparkle 9800 GTX+

Madshrimps (c)


The GPU clock got a 9.3% boost, the shader clock a more modest 8.6%, and the memory clock stayed the same. The original 9800 GTX didn’t quite live up to its product name, seeing as the older 8800 GTX was able to keep up with it nicely, thanks to its 384-bit memory bus and larger memory size.
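Those gains are simple percentage math; below is a minimal sketch using the commonly quoted reference clocks for both cards (675→738MHz core, 1688→1836MHz shader, 1100MHz memory). These clock figures are our assumption rather than values lifted from the charts above, which is why the shader gain lands at 8.8% instead of the 8.6% quoted.

```python
# Clock gains of the 9800 GTX+ over the 9800 GTX, assuming the
# commonly quoted reference clocks (an approximation, not taken
# from the spec charts above).
clocks = {
    "core (MHz)":   (675, 738),
    "shader (MHz)": (1688, 1836),
    "memory (MHz)": (1100, 1100),
}

for name, (old, new) in clocks.items():
    gain = (new - old) / old * 100
    print(f"{name:13} {old:5} -> {new:5}  (+{gain:.1f}%)")

# core (MHz)      675 ->   738  (+9.3%)
# shader (MHz)   1688 ->  1836  (+8.8%)
# memory (MHz)   1100 ->  1100  (+0.0%)
```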

Let’s find out if the Sparkle 9800 GTX+ can make for a worthy Christmas present ->

Up Close

A Closer Look

Sparkle uses a custom-designed PCB and GPU cooler, and they also integrated native HDMI onboard, as you’ll see in the photos below.

The packaging is a standard affair; the front reveals a nice Sparkle logo with clear specifications at the top.

Madshrimps (c)


The rear goes more in-depth regarding the 9800 series’ features.

Madshrimps (c)


Inside you’ll find the usual suspects:

Madshrimps (c)


  • Installation Manual & Driver CD
  • S/PDIF audio cable
  • two 4-pin to 6-pin power converters

There are no DVI>HDMI or DVI>D-Sub dongles included, because the Sparkle 9800 GTX+ has everything onboard:

Madshrimps (c)


The cooling heatsink is made from aluminum, with a shroud featuring the Sparkle logo. A single fan in the middle pushes cool air over the fins. This is a dual-slot solution, so make sure you have room for it; also open up a PCI slot at the back of your case to allow the hot air to escape. Sparkle kept the SLI connectors on the PCB, so 3-way SLI is possible with these cards.

Madshrimps (c)


The rear doesn’t reveal much; a proper mounting bracket for the GPU cooler makes sure it won’t fall off by accident.

Madshrimps (c)


The S/PDIF cable is hooked up at the top here:

Madshrimps (c)


The 2x 6-pin power connections are in the same place as on the 8800/9800 GTX:

Madshrimps (c)


Time to plug and play ->
Test Setup

Test Setup & Benchmarks

We built our test setups with the help of Tones.be (Belgium’s largest hardware shop!), who helped us with the hard drives, CPUs and monitors; MSI provided the motherboards, OCZ the memory, Coolermaster the cases and power supplies and, last but not least, Scythe the silent CPU coolers.

Madshrimps VGA Test Stations

CPU: 2x Intel Core 2 E8200 @ 3.375GHz
Cooling: 2x Scythe Ninja 2
Mainboard: 2x MSI P45 Platinum
Memory: 2 kits of 2x 2GB OCZ PC2-8500 Reaper
Other:
  • 2x Coolermaster CM690 enclosure (3x 120mm case fans)
  • 2x Coolermaster UCP 900W power supply
  • 2x Western Digital 80GB HDD (system)
  • 2x Samsung 640GB HDD (data)


At the time of purchase each system we built cost us approximately €1200 without the VGA card. While it’s not a budget system, it’s also far from high end, as we’re using a DDR2 motherboard and a mid-range Core 2 Duo Wolfdale CPU. Combining it with a €300+ VGA card does place it in the more expensive bracket when it comes down to building a game machine.

One of the biggest costs for a system is certainly the monitor; the system price mentioned above includes this screen, a Samsung SyncMaster 2493HM 24-inch with a native resolution of 1920x1200 and a quite low 5ms latency. Again, this screen is mid-range, as more expensive models are available, but the resolution of most 26”~27” screens remains the same at 1920x1200. You need to invest in a 30” screen to go higher, to 2560x1600, at which point you will be spending a pretty hefty sum.
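To put those resolutions into perspective, a quick pixel count shows why the step up to a 30" screen demands so much more from a graphics card:

```python
# Pixels pushed per frame at the resolutions discussed above.
res_24 = 1920 * 1200     # the Samsung 2493HM's native resolution
res_30 = 2560 * 1600     # typical 30" native resolution

print(f"1920x1200: {res_24:,} pixels")
print(f"2560x1600: {res_30:,} pixels ({(res_30 / res_24 - 1) * 100:.0f}% more)")
# 2560x1600 works out to roughly 78% more pixels per frame.
```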

Software config:

  • OS: Windows Vista SP1 64-bit
  • NVIDIA Drivers: Forceware 180.48


We tried to expand our list with the latest games and different genres to give you an idea of how the 9800 GTX+ performs. These are the games we tested:

  • Trackmania Nations
  • Crysis
  • Crysis Warhead
  • World In Conflict
  • Unreal Tournament 3
  • The Elder Scrolls IV: Oblivion
  • Race Driver: GRID
  • Call of Duty 4
  • Left 4 Dead
  • Devil May Cry 4
  • Far Cry 2


Benchmark Methodology

We tested the Sparkle 9800 GTX+ 512MB against a reference-clocked XFX 8800 GTX 768MB card in the benchmarks mentioned above. Each card was installed in an identical hardware setup and we ran each benchmark side by side; especially in games which do not have any built-in benchmark mode this came in very handy, as we could reproduce the same game path on both systems, as illustrated in the small movie below:



We used a VIP-844-BC 4-port PS/2 Keyboard Multicaster, which allows you to control up to 4 PCs simultaneously with one keyboard. For the mouse we used 2 Logitech MX1000 receivers and synchronized one MX1000 with both receivers; with this setup we were able to control both PCs 99% simultaneously. It proved a challenge to set up at first, as every window and field in Windows has to match up 100%, otherwise you would have to focus on one system at a time; using a HDD image copy we got a complete setup match and benchmarking could start.

The venerable Geforce 8800 GTX, which can be had for €130~150 second hand, is not much cheaper than the Sparkle 9800 GTX+, which costs €160~170 new; which one is faster? And by how much?
Futuremark Synthetic 3D Benchmarks

Futuremark tests

These synthetic 3D benchmarks from Futuremark allow you to evaluate the expected performance of a system with different generations of games. As each 3DMark uses different features and quality settings, it gives you an idea of how your system will perform.

Madshrimps (c) Madshrimps (c)


Let’s start with the oldest batch of Futuremark benchmarks, 3DMark2001SE and 3DMark03:

Madshrimps (c)


3DMark2001SE shows us that the system’s bottleneck is not the VGA card, as the scores are practically the same. 3DMark03 is more GPU dependent, and it shows: the 9800 GTX+ has a 13% lead.

Madshrimps (c)


3DMark05 is less GPU dependent than 3DMark03, but there is still a 10% lead for the GTX+.

Madshrimps (c)


Since we’re using Vista as our host OS, it’s time to test the latest Futuremark addition; 3DMark Vantage was run with the Performance preset (1280x1024 resolution, no AA):

Madshrimps (c)


3DMark Vantage doesn’t care much for the extra GPU power, at only 3% faster with the GTX+. The 3DMark06 benchmark on the other hand is heavily impacted, as we see an 18% lead for the GTX+.

So on average, going from old to new, the Sparkle 9800 GTX+ is about 10% faster than the 8800 GTX. Will we come to the same conclusion with our game benchmarks?
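That average is simply the individual leads combined; the sketch below reproduces the calculation from the per-benchmark leads quoted above, taking 3DMark2001SE as 0%. The geometric mean is the textbook choice for averaging ratios, though at these magnitudes it barely differs from the plain average.

```python
from math import prod

# Per-benchmark leads quoted above: 3DMark2001SE, 03, 05, Vantage, 06.
leads_pct = [0, 13, 10, 3, 18]
ratios = [1 + p / 100 for p in leads_pct]

arith = sum(ratios) / len(ratios)
geo = prod(ratios) ** (1 / len(ratios))
print(f"arithmetic mean lead: {(arith - 1) * 100:.1f}%")  # ~8.8%
print(f"geometric mean lead:  {(geo - 1) * 100:.1f}%")    # ~8.6%
```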

Trackmania Nations

Trackmania Nations

TrackMania is a series of arcade racing games for Windows which include stunting with cars, track-building, elements from puzzle games, as well as elements testing the player’s dexterity. It was developed by the French team Nadeo for the PC. Instead of following the usual trend of choosing a set car and track for playing in the game, the TrackMania games allow the player to create their own tracks using a "Block" process in the spirit of the 1985 game Racing Destruction Set and the 1990 Brøderbund release, Stunts.

In contrast with most other racing games, the TrackMania series lets the player race a track as many times as he/she wants, until time runs out. He/she can choose to respawn at any time, due to landing upside down, going off the track, or even just because the start did not go optimally. Although in multiplayer games multiple cars race on the same track, they cannot actually collide or otherwise influence each other.


The ugly duckling in our games’ line-up, this is a free game (go download it now!) based on stunt racing with fast cars. Nothing realistic about this game, just pure arcade fun.

Madshrimps (c)


The game engine is very scalable: you can play it with entry-level cards or even an onboard IGP, but if you turn up all the detail it will provide a challenge for the higher-end cards. With the 9800 GTX+ we were able to run at high quality with/without AA enabled; AF was set to 8x.

Madshrimps (c)


Without AA enabled both cards offer very smooth gameplay at 1600x1200 and 1920x1200. The GTX+ is ~12% faster overall.

Madshrimps (c)


With 4xAA enabled the difference is smaller; the 8800 GTX’s average FPS is lower, but still high enough for smooth gameplay. The lead of the GTX+ is now only 7%.

Crysis

Crysis

Crytek became famous with their Far Cry first-person shooter, not only for the open-ended gameplay but also because of the stellar system requirements needed to play the game at high detail. Crysis is their second game and doesn’t disappoint in either gameplay or system requirements.

Crysis offers several methods to test performance: the game includes two batch files, one geared toward CPU testing, the other toward GPU testing. These two methods provide very repeatable results but unfortunately don’t reflect real gameplay performance, and only give you an indication of how the game will run.

We used this Crysis benchmark tool, which enables you to define a custom timedemo; using the built-in “Assault” run-through we measured the performance of both cards. We briefly tested the 64-bit executable included with Crysis when installed on a 64-bit OS, but performance was actually lower with no IQ benefit, so we stuck to the 32-bit .exe for all our tests.

Madshrimps (c)


First up, the HQ setting with/without AA enabled; seeing as even a HD4870X2 can struggle in this game, our hopes weren’t very high:

Madshrimps (c)


We stuck to 1600x1200 with the HQ setting; without AA both cards are able to average 30fps, but that doesn’t mean smooth gameplay at all. With AA enabled it’s worse still, and, as you can expect, the 8800 GTX is slightly faster here, most likely thanks to its superior memory bandwidth and size.

Let’s dial things down a bit, with everything set to Medium Quality:

Madshrimps (c)


Very playable now, even at 1920x1200; the GTX+ has an impressive ~22% lead. Let’s include 4xAA:

Madshrimps (c)


Neither card is quite up to that; again the gap between the GTX+ and the 8800 GTX closes as the resolution/AA increases.

The last test is with a mix of Medium and High Quality settings to deliver close-to-HQ visuals.

Madshrimps (c)


The Sparkle 9800 GTX+ is definitely the better card if you start tweaking Crysis’ image settings, at close to 20% faster than the 8800 GTX with playable frame rates.

Crysis Warhead

Crysis Warhead

The long-awaited successor to Crysis finally left beta stage a few months ago; it features a tweaked and updated CryEngine 2 which promises better visuals and lower system requirements.

Crysis Warhead updates and refines the gameplay of the original game through a side story plot involving Psycho, one of previous protagonist Nomad’s allies. The game is a parallel story that follows Sergeant Michael "Psycho" Sykes, a character from the original Crysis, as he faces his own trials and challenges on the other side of the island during the time period of the first game. It features new fully customizable weapons, vehicles and enemies, along with new multiplayer content.


We used this Crysis Warhead benchmark tool, which enables you to play back a gameplay demo; we used the included Frost, Ambush and Avalanche timedemos.

Madshrimps (c)


Crysis Warhead does away with the Low/Medium/High/Very High quality settings; instead you now have Minimum/Mainstream/Gamer/Enthusiast. Minimum is the Low quality of Crysis, Mainstream is Medium quality, Gamer is High quality, while Enthusiast is the Very High quality setting of Crysis. With these mid-range video cards we stuck to the Mainstream and Gamer quality settings; the Enthusiast setting gave us unplayable frame rates, even at 1600x1200 without AA.

First up, the Frost timedemo under DX10:

Madshrimps (c)


Under DX10 at 1600x1200 the Gamer setting proved unplayable; the Mainstream quality setting fared better. The GTX+ is 13% faster overall.

Better to stick to DX9 though:

Madshrimps (c)


A smaller advantage for the Sparkle here, and the Gamer setting remains unplayable. This doesn’t bode well for the claim of “lower system requirements” for Crysis Warhead…

Madshrimps (c)


The Ambush level is slightly less heavy on our GPUs; the Sparkle takes a noticeable ~25% lead, although again at the Gamer quality setting the min FPS drop below smooth.

Madshrimps (c)


Avalanche is the third and heaviest level; min FPS go lowest here, and at the Gamer quality setting you’ll definitely notice the drops. Overall the GTX+ is only ~10% faster here.

World in Conflict

World in Conflict

World in Conflict (also known as WiC or WIC) is a real-time tactical video game developed by the Swedish video game company Massive Entertainment and published by Sierra Entertainment for Windows PC. The game is set in 1989 during the social, political, and economic collapse of the Soviet Union. However, the title postulates an alternate history scenario where the Soviet Union pursued a course of war to remain in power. Generally considered a real-time strategy (RTS) game, World in Conflict includes gameplay typical of real-time tactical (RTT) games.


The game engine of WiC is quite detailed for the scale it provides; while you can reduce the graphics settings to allow playable FPS with mid-range video cards, you can also set everything to maximum and see all the splendor of warfare rendered with high-end GPUs. While this game does require a nice graphics card to get the most out of it, the CPU is also a deciding factor when it comes down to performance, as you’ll see.

Madshrimps (c)


This game can be quite demanding on the latest hardware; we enabled high quality with/without AA at 1600x1200 and 1920x1200:

Madshrimps (c)

Madshrimps (c)


Without AA enabled the Sparkle holds a small lead, and frame rates are acceptable for a 3rd-person RTS. With AA enabled though, the 8800 GTX catches up.

Madshrimps (c)

Madshrimps (c)


The outcome at 1920x1200 is as expected: as the bandwidth requirements go up, the 8800 GTX comes into its own, and with AA enabled it takes a very small lead. The frame rates are borderline acceptable; better to stick to 1600x1200 though.

Unreal Tournament 3

Unreal Tournament 3

Epic Games had quite a bit of success with their Unreal Engine; at first competing with the id Tech engine for licensing, they now seem to have pulled ahead with the latest incarnation: Unreal Engine 3. The list of games using this engine is huge, with blockbuster titles like Bioshock, Mass Effect, Gears of War, Rainbow Six Vegas (1&2) and of course the next iteration of the UT series, UT3.

While many games share the same Unreal Engine 3, the developers can decide how high the system requirements will be by increasing the level of detail. Rainbow Six Vegas, for example, is known for being more demanding than Bioshock on the same hardware. Unreal Tournament 3 by Epic Games provides an amazing balance between image quality and performance, rendering beautiful scenes even on lower-end hardware; on high-end graphics cards you can really turn up the detail, which makes it picture perfect.

Madshrimps (c)


We used HardwareOC’s benchmark tool, which does a fly-by of the chosen level; do note that the performance numbers reported are higher than in-game. The map used was “Corruption”.

Let’s start off with 0xAA/8xAF:

Madshrimps (c)


At 1280x1024 the performance is limited by our CPU speed; at 1600x1200 and 1920x1200 the Sparkle 9800 GTX+ can take a ~10% lead.

Madshrimps (c)


With 4xAA enabled the performance drop is noticeable overall, though less so at 1280x1024, as expected. At 1600x1200 with 4xAA the 8800 GTX comes quite close (only 6% behind), but at 1920x1200 the Sparkle 9800 GTX+ proves superior (+13%).


The Elder Scrolls IV: Oblivion

The Elder Scrolls IV: Oblivion

The Elder Scrolls IV: Oblivion, or simply Oblivion, is a single-player fantasy-themed action-oriented computer role-playing game developed by Bethesda Game Studios and published by Bethesda Softworks/ZeniMax Media and the Take-Two Interactive subsidiary 2K Games.
Oblivion’s story focuses on a former prisoner drawn into a Daedric Lord’s plan to invade the mortal realm of Tamriel. Gates to the hellish realm of Oblivion are opened, through which many daedra flow. The game continues the open-ended tradition of previous Elder Scrolls games, allowing the player to travel anywhere in the game world at any time, including the option to ignore or postpone the main storyline indefinitely.


When Oblivion was released in the spring of 2006 we had found a new system benchmark: the expansive game world of Oblivion required a hefty configuration if you wanted to play it at the highest detail level. Not only did it tax the video card, the CPU was quite important too.

Madshrimps (c)


We had to wait until the end of 2006 for a single-VGA solution, the Geforce 8800 GTX, that could run the game at 1600x1200 with AA enabled at acceptable frame rates. We are now two and a half years later; let’s see how today’s high-end products handle this demanding game. Using FRAPS we chose an outdoor scene and walked a path several times to get repeatable results.
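For those who want to reproduce such a run, the sketch below shows how min/avg FPS can be pulled out of a FRAPS frametimes log; the file name and exact CSV layout (a header row, then one cumulative timestamp in milliseconds per frame) are assumptions for illustration, not our actual capture.

```python
import csv

def fps_stats(path):
    """Return (min_fps, avg_fps) from a FRAPS-style frametimes CSV."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)                                   # skip the header row
        times = [float(row[1]) for row in rows]      # cumulative ms stamps

    # Instantaneous FPS between consecutive frames; the worst of these
    # approximates the 'min FPS' figure in our charts.
    frame_fps = [1000.0 / (b - a) for a, b in zip(times, times[1:]) if b > a]
    avg_fps = (len(times) - 1) / ((times[-1] - times[0]) / 1000.0)
    return min(frame_fps), avg_fps

low, avg = fps_stats("oblivion_run1_frametimes.csv")  # hypothetical log
print(f"min FPS: {low:.1f}, avg FPS: {avg:.1f}")
```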

So how does our 2-year-old video card hold up against today’s 9800 GTX+?

Madshrimps (c)


The Sparkle 9800 GTX+ is ~33% faster at 1920x1200, offering min FPS close to the average of the 8800 GTX, an impressive showing here.

Madshrimps (c)


With 4xAA thrown into the ring though, the lead is halved; the 8800 GTX catches up but is still ~15% slower. Overall the Sparkle races through Oblivion, even at 1920x1200 with all the bells and whistles enabled.


Race Driver: GRID

Race Driver: GRID

This game is made by Codemasters, the creators of the TOCA/Race Driver series. GRID is a hybrid between arcade and simulator of mainly tarmac racing that consists of 43 cars. There are several types of competitions for different cars: GT races, open wheel races, demolition derbies, etc. There are also several tracks from different countries, including Japan, United States and European tracks such as Le Mans and Spa-Francorchamps. A track through the streets of Milan is also available.


The game runs on a modified in-house engine previously used for Colin McRae: DIRT, and the visual similarities between the two games are noticeable. GRID adds a much more advanced damage model and several other improvements which make it one of the best-looking racing games on the PC.

Madshrimps (c)


Its heavy use of shaders makes it very demanding on today’s video cards; at the lowest graphics settings it’s definitely far from good looking, but it does run on an entry-level card such as the Geforce 8500 GT, albeit not at high resolutions. We raced through the Okutama track several times with a Toyota Supra and recorded min/avg frame rates. The average result from several runs was included in the charts below.

Madshrimps (c)


With all in-game details set to Very High, both cards run GRID fluently; min FPS never dip into the noticeable zone. Overall the GTX+ is ~20% faster.

Madshrimps (c)


With 4xAA enabled the outcome doesn’t change much; min FPS are still high enough, and the lead of the GTX+ remains ~20%.

Call of Duty 4 & Left 4 Dead

Call of Duty 4

The successor to Call of Duty 2 on the PC, as COD3 was never released on our favorite gaming platform. Call of Duty 4 takes us not back to the past, but to the near future, where you’ll find yourself fighting everybody and everyone in all-out guerilla war inside cities.

Call of Duty 4: Modern Warfare runs on a proprietary engine with features that include true world-dynamic lighting, HDR lighting effects, dynamic shadows, and depth of field. "Bullet penetration" is calculated by the engine, taking into account factors such as surface type and entity thickness. Certain objects, such as cars and some buildings, are destructible. This makes distinguishing cover from concealment important, as objects such as wooden fences and thin walls do not completely protect players from harm. Bullet speed and stopping power are decreased after penetrating an object, and the decrease is dependent on the thickness and surface type of the object. The game makes use of a dynamic physics engine, not implemented in previous Call of Duty titles. Death animations are a combination of pre-set animations and ragdoll physics.


Madshrimps (c)


One thing that needs to be said about the Call of Duty 4 graphics engine is that it’s very optimized. It doesn’t offer a completely free-roaming arena to play in, so there is less for your PC to render and calculate. While this limits replayability, it does make for a very cinematic experience and doesn’t require a monster PC to look good.

That said, we tested at 1920x1200 with the very high IQ setting to stress the VGA cards:

Madshrimps (c)


As you can see from the results, even with 4xAA enabled both cards plow through the game with very high average FPS. The Sparkle is about 15% faster than the 8800 GTX.

Left 4 Dead

This co-op game, meant to be played with friends over LAN/Internet, is based on every zombie movie you’ve ever seen; it’s a survival game of you (and friends) against the computer AI. A very tense game with a smart computer opponent which makes sure you get surprised every time you play it.

Left 4 Dead uses the latest version of Valve’s Source engine, with improvements such as multi-core processor support and physics-based animation to more realistically portray hair and clothing, and to improve physics interaction with enemies when shot or shoved in different body parts. Animation was also improved to allow characters to lean realistically when moving in curved paths. Rendering and artificial intelligence were scaled up to allow for a greater number of enemies who can navigate the world in better ways, such as climbing, jumping or breaking obstacles. Lighting was enhanced with new self-shadowing normal mapping and advanced shadow rendering that is important to convey information about the environment and player actions. Wet surfaces and fog are used to create mood. Many kinds of post-processing cinematic visual effects inspired by horror movies have been added to the game. There is dynamic color correction that accentuates details based on importance, contrast and sharpening to focus attention on critical areas, film grain to expose details or imply details in dark areas, and vignetting to evoke tension and a horror-film look.


Madshrimps (c)


The Source engine is known for its scalability and overall good performance no matter what graphics card you have in your PC. The cards tested today can be classified as high end when it comes to Source engine games:

Madshrimps (c)


We had to disable multi-core support on our Vista 64-bit setup to stop the game from crashing, but this game is not bottlenecked by a 3.3GHz CPU, that’s for sure. Both the 8800 GTX and the Sparkle 9800 GTX+ come through with flying colors, although this Source engine game is one of the more taxing ones. With 4xAA we see average FPS over 60, with the GTX+ in the lead by ~11%.

Devil May Cry 4

Devil May Cry 4

Devil May Cry 4 is the fourth installment of the Devil May Cry series. It was announced in March 2007 that the game would be released simultaneously for the PlayStation 3, Xbox 360, and PC. In the game, the player controls both Nero and Dante, the game’s protagonist and the series’ title character respectively, and fights enemies in close combat using firearms, swords, and other weapons. The characters Lady and Trish from previous games in the series make appearances, along with new characters Kyrie, Credo, Gloria, and Agnus.


The game has both DX9 and DX10 render modes. We’re using the freely released Devil May Cry 4 performance benchmark, running in DX10 under Vista.

Madshrimps (c)


The benchmark consists of four different scenes designed to replicate different aspects of the game. We tested with/without 4xAA, using DX10 mode.

Madshrimps (c)


Both cards are fast enough for this game, even at 1920x1080, with the GTX+ ~15% faster.

Madshrimps (c)


Enabling 4xAA drops the 8800 GTX scores below 60fps in some tests; the Sparkle remains above that mark without issue, again about 15% faster.

Far Cry 2

Far Cry 2

Far Cry 2 is not made by the team responsible for Far Cry 1 (those guys made Crysis); a new team hired by Ubisoft developed Far Cry 2 using an in-house engine. The game is very open-ended, with plenty of things for you to do, as you see fit.

Ubisoft has developed a new engine specifically for Far Cry 2, called Dunia. The Dunia engine was built specifically for Far Cry 2 by the Ubisoft Montreal development team. It delivers realistic destructible environments, special effects such as dynamic fire propagation and storm effects, a real-time night-and-day cycle, a dynamic music system and non-scripted enemy A.I.
The engine takes advantage of multi-core processors as well as multiple processors and supports DirectX 9 as well as DirectX 10. Only 2 or 3 percent of the original CryEngine code is re-used. Additionally, the engine is less hardware-demanding than CryEngine 2, the engine used in Crysis.

Madshrimps (c)


Far Cry 2 comes with an excellent benchmark tool which allows you to play back a recorded timedemo in real time, measuring average and minimum frame rates. Excellent!

We chose the Small Ranch level in real-time mode with AI disabled. We set the game engine to render in DX10 at Very High quality; without AA it gives us this result:

Madshrimps (c)


Impressive results for both cards, as they can run Far Cry 2 even at 1920x1200 with Very High quality settings and DX10 enabled. The 9800 GTX+ has a ~9% lead, hardly noticeable.

Let’s add 4xAA:

Madshrimps (c)


Wow, who would’ve expected this? In some of the previous benchmarks, when the resolution went up and AA was added, we saw the 8800 GTX catch up, but here the roles are completely reversed. At 1600x1200 with 4xAA the 9800 GTX+ is ~12% slower, but at 1920x1200 with 4xAA it’s clear that the 768MB and wider memory bus of the 8800 GTX pay off: its frame rates are still playable at ~30fps minimum, while the Sparkle 9800 GTX+ is about 50% slower, with min FPS dipping below 15fps!

One thing is left to address: DX9 vs DX10. Far Cry 2 is one of the first games where DX10 mode outperforms DX9 mode under Vista… impressive to say the least:

Madshrimps (c)


The above was run with the Sparkle 9800 GTX+, with/without 4xAA, in DX9/DX10 mode.

Noise, Load Temperature and Conclusive Thoughts

Noise Levels, GPU Temperature, Power Usage

We placed a dBA meter next to the side panel of the closed case and measured the maximum noise level obtained after the system had been running under heavy load for 30 minutes, as well as when the system was idling in 2D; with the other components inside the case (and the case fans) making noise, we didn’t isolate the VGA card, but we did record its impact on overall system noise.

Without an actively cooled VGA card, the dBA meter recorded 46.7dBA at 5m from the side panel; the ambient noise without the system running was 36.7dBA.
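Since decibel levels are logarithmic, the system’s own contribution can’t be found by simply subtracting readings; the standard energy-based subtraction looks like this when applied to the figures above:

```python
from math import log10

def subtract_ambient(total_dba, ambient_dba):
    # Sound levels combine on an energy basis, so the residual level is
    # 10*log10(10^(Ltotal/10) - 10^(Lambient/10)).
    return 10 * log10(10 ** (total_dba / 10) - 10 ** (ambient_dba / 10))

# System running vs. ambient, from the readings above.
print(f"{subtract_ambient(46.7, 36.7):.1f} dBA")  # ~46.2 dBA
# At 10dB below the total, the ambient noise barely skews the reading.
```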

Madshrimps (c)


The Sparkle 9800 GTX+ cooler hardly surpasses the overall system noise; even under load the fan was not noticeably louder. The 8800 GTX reference cooler is known for quiet operation at idle, but once the GPU temperature starts rising, the fan speed ramps up, making it quite audible.

Madshrimps (c)


The 9800 GTX+ GPU is built on a 55nm process, which gives it the edge when it comes to load temperatures; a 15°C difference is nothing to sneeze at, and it means the cooler on the GTX+ doesn’t have to work as hard, resulting in a quieter VGA card, as seen above.

The other advantage of the smaller manufacturing process is power usage:

Madshrimps (c)


At idle the difference is only 17W, but under load it’s very noticeable; drawing 41W less than the Geforce 8800 GTX paints quite a different picture if you keep your system running for hours on end.
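To make that 41W concrete, here is a rough back-of-the-envelope estimate; the four hours of daily gaming and the €0.20/kWh energy price are assumptions for illustration only:

```python
# Yearly cost of the 41W load-power gap, under assumed usage and pricing.
delta_w = 41               # measured load-power difference from above
hours_per_day = 4          # assumed gaming time
price_per_kwh = 0.20       # assumed energy price in EUR

kwh_per_year = delta_w * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year -> ~EUR {kwh_per_year * price_per_kwh:.0f}/year")
# Roughly 60 kWh, or about EUR 12 per year; modest, but the lower heat
# output matters just as much for noise and cooling.
```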

Conclusive Thoughts

The Sparkle 9800 GTX+ did quite well in the games tested today; overall it proved about 15% faster than the previous-generation high-end 8800 GTX, and luckily it doesn’t come with a very high price tag. We spotted it for €152 in Europe and $180 in the USA, and among other Geforce 9800 GTX+ offerings the Sparkle is very sharply priced. The direct competitor for the GTX+ is an overclocked Radeon HD 4850 512MB card; while a reference-clocked HD 4850 is noticeably cheaper (€130/$150), it’s also slower than the new GTX+. A factory-overclocked HD 4850 starts at around ~€160/$200, making it more expensive than the card tested here. There’s no denying that the current 9800 GTX+ pricing is thanks to ATI’s aggressive pricing.

Until we’ve had an opportunity to put an overclocked HD 4850 up against the GTX+ we can’t draw a final conclusion; what we can say, though, is that the Sparkle 9800 GTX+ is a very capable graphics card, able to run the latest games fluently at relatively high resolutions. Sparkle’s custom version adds capable cooling and native HDMI support, which makes it quite future-proof. If you hunger for more, they also have a Calibre P980X+ version which comes with higher GPU/MEM/Shader clocks (761/1161/1911).

Whether you’ll find an ATI- or NVIDIA-based VGA card under the Christmas tree this year, you can be assured that it will run the best games out there at high detail, without ruining your budget.

Madshrimps (c)


We would like to thank Vivian and Searching at Sparkle for allowing us to test the Sparkle 9800 GTX+. Until next time!