NVIDIA's G-Sync Attempts To Revolutionize Gaming

At an nVidia event in Montreal, Canada on Oct. 17-18, nVidia introduced a new technology called G-Sync, designed to revolutionize gaming by addressing lag, stuttering and screen tearing.

https://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness

I believe this is what Mark Rein from Epic Games was referring to in his comment “Nvidia Is Preparing the ‘Most Amazing Thing’ Ever Made”, in this thread I posted here:

https://forums.blackmesasource.com/showthread.php?t=16801

Also in attendance at the nVidia event were industry heavyweight devs John Carmack from id Software, Tim Sweeney from Epic Games and Johan Andersson from DICE. During the presentation they spoke about their excitement over this new technology and its benefits, and afterwards they held a panel discussion that went further into not only G-Sync but other interesting gaming industry technologies, topics and trends.

https://www.pcper.com/news/Graphics-Cards/John-Carmack-Tim-Sweeney-and-Johan-Andersson-Talk-NVIDIA-G-Sync-AMD-Mantle-and-G

This does sound pretty cool. So it completely gets rid of the need for vsync? And going above 60 fps (on a 60Hz display) or below doesn’t result in any real hit in smoothness (besides just the general difference between say 40 fps and 60 fps, but it won’t be as bad)?

It’s just a chip that controls the monitor to only show a frame when it’s rendered? Kinda neat, I guess.
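To put some rough numbers on that idea, here’s a little toy simulation (my own sketch, nothing from nVidia, with made-up frame times) comparing when finished frames actually hit the screen on a fixed 60 Hz vsync display versus a variable-refresh one:

```python
# Toy simulation: when finished frames actually reach the screen on a fixed
# 60 Hz vsync display vs. a variable-refresh (G-Sync-style) one.
# The numbers are illustrative examples, not measurements.

REFRESH_MS = 1000 / 60  # 60 Hz refresh interval, ~16.7 ms

def vsync_display_times(render_done_ms):
    """With vsync, each frame waits for the next fixed refresh tick."""
    return [(int(t // REFRESH_MS) + 1) * REFRESH_MS for t in render_done_ms]

def variable_refresh_display_times(render_done_ms):
    """With variable refresh, the monitor scans out as soon as a frame is done."""
    return list(render_done_ms)

# A game finishing a frame every 22 ms (~45 fps):
frames = [22 * i for i in range(1, 6)]                     # [22, 44, 66, 88, 110]
print([round(t, 1) for t in vsync_display_times(frames)])  # [33.3, 50.0, 66.7, 100.0, 116.7] -> uneven steps (judder)
print(variable_refresh_display_times(frames))              # [22, 44, 66, 88, 110] -> even 22 ms steps
```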

about fucking time

screen tearing is just a sad reminder that we are still running fully standard MS-DOS based hardware and every frame just gets “printed” on the screen like in console days

I always hated vsync anyway (due to the added delay, and the way dropping below 60 fps automatically snaps it down to 30 fps and makes everything feel laggy)
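For anyone wondering why it snaps straight to 30 rather than, say, 55: with plain double-buffered vsync on a 60 Hz screen, any frame that takes even a hair longer than one refresh has to wait for the next one, so it stays on screen for two refreshes. Rough back-of-envelope math (my own sketch; triple buffering behaves differently):

```python
import math

# Why double-buffered vsync on a 60 Hz panel "snaps" to 30 fps: any frame that
# misses the ~16.7 ms refresh window is held until the next refresh, so it is
# displayed for a whole number of refresh intervals.

REFRESH_MS = 1000 / 60

for render_ms in (15.0, 17.0, 20.0, 30.0):
    refreshes_held = math.ceil(render_ms / REFRESH_MS)   # refreshes the frame occupies
    effective_fps = 1000 / (refreshes_held * REFRESH_MS)
    print(f"{render_ms:.0f} ms render -> shown at {effective_fps:.0f} fps")

# 15 ms -> 60 fps, 17 ms -> 30 fps, 20 ms -> 30 fps, 30 ms -> 30 fps
```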

I don’t think I’ve ever noticed tearing on my computer.

I don’t think I understand a bit of what is being said in this topic. But “yay” for technological progress.

So now you’ll need to buy this new G-Sync card and a new G-Sync monitor, plus a GeForce GTX 650 Ti Boost or better if you don’t already have one.

Now that you’ve spent $600-$700 for this setup you now have to wonder what NVIDIA means when they say:

“There’s a bunch of other work done on the G-Sync module side to deal with some funny effects of LCDs when driven asynchronously. NVIDIA wouldn’t go into great detail other than to say that there are considerations that need to be taken into account.”

Sounds like a neat idea but I don’t know if there’s enough demand for it. After all, this is simply a fix for frame rates between 30 and 60. It should allow more effects for developers, but will developers have to make a G-Sync version of their software as well as a normal version?

Actually, the G-Sync card is built into a new G-Sync-capable monitor. At the tech press event, and elsewhere, nVidia stated that they may make the G-Sync card available separately, for those who want to install it into their current monitor. They had a technically-challenged staff member install one and it apparently took him roughly 20 minutes.

=======================================

Here’s another blurb from nVidia’s website, with more info and some tasty industry quotes.

https://nvidianews.nvidia.com/Releases/NVIDIA-Introduces-G-SYNC-Technology-for-Gaming-Monitors-Tears-Stutters-Lag-Become-Artifacts-of-th-a41.aspx

As mentioned, major monitor manufacturers, like ASUS, BenQ, Philips and ViewSonic, are already on board with G-Sync. ASUS has one in the pipeline already.

https://www.tomshardware.com/news/vg248qe-g-sync-nvidia-asus,24766.html

And don’t forget that Nvidia and AMD are in fierce competition, so it’s only a matter of time before a similar solution from AMD is announced and released.

I have both kinds of tearing problems (noticeable tearing even with vsync on at 60 fps). It’s a side effect of running Eyefinity with 3 different screens; I hope it’s finally fixed soon.

It’s a fix for 30 fps up to the max refresh rate of your monitor, which as demonstrated was 144 Hz for their setup, and it even helps smoothness below 30 fps by duplicating frames. If you read the AnandTech article on it, he really makes it sound pretty awesome. It may be an investment to get a new monitor and video card, but the testimonies from those who have seen it in action make it sound pretty damn sweet. And once it’s released we’ll get even more reviews as early adopters begin testing it. If it proves to be as good as they say, then I’m pretty sure the gamers who can afford it are gonna want it. Then if it takes off, AMD will no doubt release their own version in order not to be left in the dust, and competition will no doubt make it even more accessible and affordable. Even if it’s more of a small step than a giant leap, it still promises to make a good difference in graphical fidelity and immersion.
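On the frame-duplication bit: nVidia hasn’t spelled out exactly how the module handles it, but the basic idea (as I understand it; the 33.3 ms “max hold” value and the code here are just my own illustration, not the real module logic) would be something like this:

```python
# Toy model of frame duplication below the panel's minimum refresh rate:
# if no new frame arrives in time, the old frame is simply scanned out again
# so the panel never sits longer than its maximum hold time.

MAX_HOLD_MS = 33.3  # hypothetical longest the panel may sit between refreshes

def refresh_events(frame_ready_ms):
    """Return (time, is_duplicate) refresh events for a list of frame-ready times."""
    events, last_refresh = [], 0.0
    for ready in frame_ready_ms:
        # keep re-scanning the previous frame until the new one shows up in time
        while ready - last_refresh > MAX_HOLD_MS:
            last_refresh += MAX_HOLD_MS
            events.append((round(last_refresh, 1), True))
        last_refresh = ready
        events.append((ready, False))
    return events

# GPU producing a new frame only every 50 ms (~20 fps):
print(refresh_events([50, 100, 150]))
# [(33.3, True), (50, False), (83.3, True), (100, False), (133.3, True), (150, False)]
```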

Also, no, g-sync requires no extra work from developers. It simply syncs monitor refresh to the fps of the graphics card, so it should work on all existing games as well. You can see from the press release that they tested it on Tomb Raider.

At the nVidia tech press event, all three industry devs elaborated on the difficulties they’ve had during development of recent games in keeping framerates as high as possible. They mentioned how games nowadays typically have so much more happening on-screen and how that drags framerates down. They spoke about tough decisions they’ve had to make in the past, scaling back certain aspects of a game (textures, draw distances, effects and the like) in areas where framerates were sluggish, something they didn’t want to do, but had to. These measures obviously hurt a game’s visual fidelity. They all emphatically stated that G-Sync virtually eliminates the need for them to do this going forward, so games have the potential to not only play smoother, but also look even better, with the end product more closely resembling the dev team’s original vision.

Here’s the first review of an ASUS monitor that’s equipped with G-Sync technology.

https://www.guru3d.com/articles_pages/nvidia_g_sync_review_guide,1.html

The review mentions that “For Europe, nVIDIA will be launching G-Sync with select retailers in chosen regions (France, Germany, Norway, Sweden and UK) on Monday, 16th December. Pre-orders will go live from tomorrow, Friday, 13th December.” It doesn’t state what launch activities nVIDIA has planned for the Americas, but expect something similar.

And it’s only a little over $600.00!

Seems you could take care of the issue by simply buying a good video card for a LOT less, doesn’t it?

You missed the whole point of the feature. A good GPU still experiences screen tearing without v-sync and stuttering with it.

Good videocards eventually become outdated, making screen tearing worse. G-Sync cuts the evil off at the root by getting rid of screen tearing altogether.

You still might feel the effects of framerate fluctuation but at least you’re not massacring your eyes with terrible tearing.

tearing causes tearing

A better graphics card won’t solve these kinds of issues. Getting lower or higher FPS than your refresh rate causes tearing without vsync on, and with it on, you’re gonna get lag and stuttering anyways. Getting a better graphics card is still probably more worth it for the money right now, but it’s not ever going to do what G-sync does.
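As a back-of-envelope illustration of that tearing point: without vsync, a buffer swap can land partway through the monitor’s top-to-bottom scanout, so part of the screen shows the new frame and the rest still shows the old one. A rough sketch (my own, with made-up numbers):

```python
# Rough sketch of where a tear line lands when vsync is off: the swap happens
# partway through the monitor's scanout, so rows above the swap point come from
# the new frame and rows below from the old one. Example values only.

REFRESH_MS = 1000 / 60   # 60 Hz scanout period
SCREEN_ROWS = 1080       # rows drawn top-to-bottom each refresh

def tear_row(swap_time_ms):
    """Approximate row being scanned out at the moment of the buffer swap."""
    scanout_fraction = (swap_time_ms % REFRESH_MS) / REFRESH_MS
    return int(scanout_fraction * SCREEN_ROWS)

# A game running at ~90 fps on the 60 Hz screen swaps buffers every ~11.1 ms:
for i in range(1, 5):
    print(tear_row(i * 11.1))   # 719, 358, 1077, 717 -> the tear jumps around each frame
```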

For what it is, it’s probably a little overpriced right now, but I’m still excited about this technology. Hopefully if it catches on and gains some competitors, it’ll start to become a cheaper, more widespread feature of gaming monitors.

Early adoption always costs out the ass. Think G-Sync monitors are expensive? Go buy a 4K monitor.

I think this is just gonna be the way things go, though. IMO, the device should dictate the refresh rate anyway, regardless of whether or not it’s for games.

But I still need to wait till next year before I can buy one :(
