Nvidia and MediaTek partnership could make G-Sync gaming monitors more affordable for everyone
MediaTek to incorporate Nvidia G-Sync tech into its scaler chips
Nvidia is finally addressing a perceived deficiency in implementing G-Sync in the best gaming monitors. This morning, the company announced a partnership with MediaTek that will bring G-Sync to MediaTek’s upcoming scaler chips, thus eliminating the need for a separate (and costly) Nvidia-spec G-Sync Ultimate module.
Nvidia G-Sync, like AMD FreeSync, allows a monitor's refresh rate to sync with the rate at which a graphics card renders each individual frame in a game. Depending on the game, the frame rate can vary wildly (particularly during intense, visually complex scenes), so technologies like FreeSync and G-Sync keep the monitor and graphics card in lockstep to eliminate the tearing artifacts that can otherwise occur.
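To illustrate the idea, here is a conceptual sketch (a toy model, not any vendor's actual API or driver behavior) comparing a fixed-refresh display, where a buffer swap can land mid-scanout and split two frames across the screen, with a variable-refresh display that simply starts each scanout when the frame is ready:

```python
# Toy model of tearing on a fixed-refresh display vs. VRR.
# All numbers are illustrative; this is not how any real driver works.
import random

random.seed(0)
REFRESH_MS = 1000 / 60                       # fixed 60 Hz scanout interval
frame_times = [random.uniform(8, 25) for _ in range(1000)]  # varying GPU frame times

def torn_frames_fixed(frame_times, tolerance_ms=0.5):
    """Count frames whose completion falls mid-scanout.

    With vsync off, the buffer swap happens immediately, so any frame
    that finishes away from a refresh boundary splits the scanout and
    shows parts of two frames at once (a tear).
    """
    t, tears = 0.0, 0
    for ft in frame_times:
        t += ft
        offset = t % REFRESH_MS
        if min(offset, REFRESH_MS - offset) > tolerance_ms:
            tears += 1
    return tears

# With VRR (the G-Sync/FreeSync model), the display waits for each frame
# and begins scanout only when it is complete, so swaps never land
# mid-scanout and the tear count is zero by construction.
print(f"fixed 60 Hz: {torn_frames_fixed(frame_times)} of {len(frame_times)} frames torn")
print("VRR: 0 frames torn (scanout starts when each frame is ready)")
```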
While G-Sync and FreeSync offer similar functionality, and you'll find many monitors that support both implementations, FreeSync has always held one critical advantage: it doesn't require additional hardware. If a monitor manufacturer wants to support Nvidia's full suite of G-Sync Ultimate features, on the other hand, it needs to add a separate module, which adds cost.
Thanks to Nvidia's new partnership, MediaTek's upcoming scaler chips will include the full complement of G-Sync Ultimate technologies. MediaTek is a popular manufacturer of the scalers that go into monitors, so there is real potential for cost savings to be passed on to customers. When G-Sync was first introduced over a decade ago, the extra hardware necessary for full support added upwards of $200 to a monitor's price. In addition, the FPGA used for G-Sync Ultimate runs hot enough to require active cooling. MediaTek is instead using a custom ASIC for its new scaler, which will have vastly lower power demands and will not require active cooling.
However, we don't know how large the real-world savings will be for customers, as no financial details or licensing terms were revealed in the announcement. For example, will MediaTek have to pay a licensing fee to implement G-Sync in its scalers? And if a fee is required, how much cheaper will the result be compared to opting for a separate G-Sync module?
Nvidia was quick to point out that its newly announced Pulsar technology, which claims a 4x improvement in motion clarity, further reducing blur while preserving fine detail, is also supported by MediaTek's new scalers. The first monitors to use the new G-Sync-infused MediaTek scaler will be the Acer Predator XB273U F5, AOC Agon Pro AG276QS2, and Asus ROG Swift PG27AQNR. All three are 27-inch 1440p monitors with a 360 Hz refresh rate.
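Nvidia hasn't detailed how it arrives at Pulsar's 4x figure, but a common way to reason about motion clarity is pixel persistence: perceived blur scales with how long each frame stays lit while your eye tracks moving objects. The back-of-the-envelope Python sketch below (all numbers are illustrative assumptions, not Nvidia's) shows why a backlight pulse one quarter as long as the frame cuts blur roughly 4x:

```python
# Back-of-the-envelope sample-and-hold motion-blur math (a common display
# model; Nvidia has not published the exact basis for Pulsar's 4x claim).
REFRESH_HZ = 360
PERSISTENCE_FULL_MS = 1000 / REFRESH_HZ      # pixel stays lit for the whole frame
STROBE_MS = PERSISTENCE_FULL_MS / 4          # hypothetical shorter backlight pulse

SPEED_PX_PER_S = 3840                        # object crossing the screen in 1 second

def blur_px(persistence_ms: float) -> float:
    """Perceived blur is roughly eye-tracking speed times pixel persistence."""
    return SPEED_PX_PER_S * persistence_ms / 1000

print(f"sample-and-hold @ 360 Hz: ~{blur_px(PERSISTENCE_FULL_MS):.1f} px of blur")
print(f"strobed ({STROBE_MS:.2f} ms pulse): ~{blur_px(STROBE_MS):.1f} px of blur")
```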
The partnership between Nvidia and MediaTek isn’t new. Earlier this year, MediaTek announced that it would integrate Nvidia’s next-gen graphics IP into its Dimensity Auto Cockpit SoCs used in automobiles.
Brandon Hill is a senior editor at Tom's Hardware. He has written about PC and Mac tech since the late 1990s with bylines at AnandTech, DailyTech, and Hot Hardware. When he is not consuming copious amounts of tech news, he can be found enjoying the NC mountains or the beach with his wife and two sons.
thestryker:
The most important part here is whether they can ditch the FPGA once and for all. Nvidia could have properly designed the G-Sync module at any time, but I'm betting there wasn't any money in it for them. The high power consumption, the resulting heat output, and the idle power draw are what stopped me from buying a monitor with a G-Sync module.
bit_user:
thestryker said: "The high power consumption, the resulting heat output, and the idle power draw are what stopped me from buying a monitor with a G-Sync module."
I didn't even get far enough to consider those factors. For me, the main negative about G-Sync was that I never saw a true G-Sync monitor that also had any level of FreeSync certification. I didn't want to be locked into one or the other, so I got a monitor that's G-Sync Compatible and FreeSync Premium Pro certified. I believe it's also one of the rare non-G-Sync monitors that has variable overdrive.
thestryker:
bit_user said: "For me, the main negative about G-Sync was that I never saw a true G-Sync monitor that also had any level of FreeSync certification."
Due to the way the FPGA controls the ports, I don't think that would be possible unless Nvidia were to add support (which obviously would never happen).

bit_user said: "I didn't even get far enough to consider those factors."
At the time I was buying, there weren't any good ultrawide options with FreeSync support (LG had its 38-inch model, but LG only offers a one-year warranty and no dead-pixel protection), so I'd rather have gone G-Sync. Really, outside of OLED there are still no good 21:9 options, which is unfortunate because there are plenty of great choices in 16:9 land.
tennis2:
Can someone explain to me why Nvidia is still putting/selling G-Sync modules in 2024?
I get that their Pulsar tech probably wasn't baked into the previous-gen models (Asus has had ULMB-Sync for a few years now), but jeez, know when the horse is dead.
At this point, it really feels like they only keep doing it to sucker people into Nvidia GPUs for the life of their monitor.