Nvidia 2013 tech day? OptiX SDK

From that article:

From there, the next reasonable question is to what extent higher-resolution monitors are invited to the party - we saw 1080p displays at Nvidia's event, but having moved onto 2560x1440 and recently sampled 4K, we really want to see the technology rolled out onto more pixel-rich displays. This is very likely to happen: sustaining that even, consistent performance on higher-resolution displays is a massive challenge, and G-Sync technology will prove even more advantageous there than it is with the 1080p displays we saw.

http://www.eurogamer.net/articles/digitalfoundry-nvidia-g-sync-the-end-of-screen-tear-in-pc-gaming

We'll see. I'm potentially interested in 1440p or 4K monitors with G-Sync, or at least a module for my current monitor (when I decide to buy an Nvidia card :D ).

Seriously, I will wait to see it on 1440p+ monitors (forget 4K)... But those monitors are not expected before 2014, and they are still the 1080p TN 144Hz panels we know (some said Q1, but reading the Asus presentation, my feeling is more mid-2014). For higher resolutions... hmm, I have some doubt we will see them soon.

First, because they target gamers, and gamers are not necessarily willing to put more than $800 into a monitor. Second, they want to sell those monitors with 120-144Hz panels, and I don't think we will see much of that on 30" displays soon.

It's a gamer lineup at Asus, HP and BenQ... so they will not bring it to a basic monitor; it will be bundled with all the Nvidia features of the series: 3D Vision 2.0, 144Hz, a low-latency alternative mode, LightBoost 2.0.

30" and 4K monitors are not really aimed at gamers today; they are more general purpose (in general, just a "cheaper" version of the pro lineup), and I don't think the makers have any interest (yet) in positioning them for gaming.

G-Sync is really interesting, even in a broader sense for future monitors (because we have no idea yet of the real quality of Nvidia's implementation, I'm waiting to see what panel makers have to say about it)...

In theory, a simple driver update could make any Nvidia series work with this add-in board (at least Fermi and Kepler...), and even any GPU whose driver is made compatible with it.
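As an aside, the pacing benefit is easy to see in a toy model. This is a sketch under simplified assumptions (double-buffered blocking vsync, instant scan-out, made-up render times), not real driver behaviour:

```python
# Toy frame-pacing model, not real driver behaviour. With classic
# double-buffered vsync, a finished frame waits for the next fixed
# scan-out slot and the next frame cannot start until the flip, so
# render times straddling the refresh period alternate between one-
# and two-slot presentation gaps (judder). A variable-refresh panel
# in the G-Sync style scans out whenever the frame is ready.
import math

REFRESH = 1000.0 / 60  # ms per scan-out slot on a fixed 60 Hz panel

def present_times(render_ms, variable_refresh):
    """Times (ms) at which each frame reaches the screen."""
    start, shown = 0.0, []
    for r in render_ms:
        finish = start + r
        if variable_refresh:
            flip = finish                                 # scan out now
        else:
            flip = math.ceil(finish / REFRESH) * REFRESH  # wait for vsync
        shown.append(flip)
        start = flip  # blocking swap: next frame starts after the flip
    return shown

def gaps(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

frames = [15.0, 18.0, 15.0, 18.0, 15.0]  # render times around 16.7 ms
print("fixed 60 Hz vsync:", gaps(present_times(frames, False)))
print("variable refresh: ", gaps(present_times(frames, True)))
```

With vsync, the presentation gaps alternate 33.3/16.7 ms even though the render times only wobble between 15 and 18 ms; with variable refresh the gaps simply track the render cadence.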
 
Are even current top-end CPUs up to the task of handling games running at 4K resolutions?

It seems early to adopt 4K monitors even if the upcoming top-end GPUs have the muscle to push such high resolutions. The main reason I say this is that CPUs can barely handle current and upcoming games such as Battlefield 4 without often dipping into the low 30fps range, and while G-Sync is nice (although at what extra cost?), the input lag remains.

CPUs, certainly in the mainstream segment, seem to me a much bigger impediment to 4K gaming, even if 4K monitors (with or without G-Sync) become affordable (ignoring the current worldwide economic problems).

Am I being pessimistic?
 

What does the CPU care about the resolution of the display? That's purely a GPU load. Unless you are loading ultra-resolution textures: buy an SSD and use Mantle.
Try a game at 320x200 on a Titan and tell me how CPU-bound it is. That would be an interesting benchmark for game reviewers.
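For what it's worth, here is a minimal sketch of that test, assuming you can log average frame times at two resolutions (the numbers in it are made-up placeholders, not benchmarks):

```python
# Minimal sketch of that benchmark idea. Run the same scene at a tiny
# and a huge resolution with identical settings and compare average
# frame times: if the number barely moves, the CPU (game logic, draw-
# call submission) is the bottleneck; if it scales with pixel count,
# the GPU is. The frame times below are made-up placeholders.

def bottleneck(frame_ms_low_res, frame_ms_high_res, tolerance=0.10):
    """Classify the bottleneck from average frame times in ms."""
    if frame_ms_high_res <= frame_ms_low_res * (1 + tolerance):
        return "CPU-bound: resolution barely moved the frame time"
    return "GPU-bound: frame time scales with pixel count"

print(bottleneck(16.0, 16.8))  # e.g. 320x200 vs 4K -> CPU-bound
print(bottleneck(16.0, 41.5))  # frame time blows up -> GPU-bound
```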
 
AMD needs to get in touch with all the other LCD makers and convince them to support some universal standard for G-Sync-style syncing before Nvidia locks them out.
 

Not sure that's the question yet... the gamer monitor lines from Asus, BenQ and others are limited sellers so far (compared to the whole LCD monitor market, most gamers just buy what they can afford)... At Asus, the previous Nvidia-enabled gamer monitors were limited to two models (24-27"), same for BenQ... and they are not the ones I see the most.
 
Are even current top-end CPUs up to the task of handling games running at 4K resolutions?
Oh yes, absolutely. You have to remember PC gaming isn't restricted to the latest games,
and there certainly are people playing games on 5x1 portrait setups at 5400x1920.
 
Not a fair test, I think.
The non-G-Sync LCD is on the edge of its viewing angle; they should have put the camera between the two monitors.
 
The difference was very obvious in the first test with all the tearing, but honestly, I couldn't really notice anything with Tomb Raider.
 

Same here, but I think it depends on the game being tested versus the performance of your GPU. A 760 is already plenty to push over 60fps in Tomb Raider, and if you're running a 144Hz monitor with no vsync, that's going to be a very smooth experience; while there may be some tearing, it's more or less unnoticeable in this game at those performance levels (I'm speaking from experience, as I have a very similar setup).

Try the same test on a 650, or on a 60Hz monitor with vsync enabled, and the results would probably be a lot worse (although vsync doesn't actually seem to work in this game on my PC for some strange reason, so I'm unable to test). Good job I don't need it!
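As a rough illustration of why tearing gets hard to spot at high refresh rates: with vsync off, a flip lands mid scan-out and the tear sits at whatever scanline was being drawn at that instant. A toy model (it assumes scan-out sweeps the whole refresh period top to bottom, which is only approximately true):

```python
# Back-of-the-envelope model of where tears land. With vsync off, a
# buffer flip happens mid scan-out and the tear appears at whatever
# scanline the panel is drawing at that instant. Illustrative only,
# not measured behaviour.

REFRESH_HZ = 144
PERIOD_MS = 1000.0 / REFRESH_HZ   # ~6.9 ms per refresh at 144 Hz
SCANLINES = 1080

def tear_positions(render_ms, n_frames):
    """Scanline at which each buffer flip tears the image."""
    t, tears = 0.0, []
    for _ in range(n_frames):
        t += render_ms
        tears.append(int((t % PERIOD_MS) / PERIOD_MS * SCANLINES))
    return tears

# ~90 fps on a 144 Hz panel: the tear wanders around the screen and
# each torn image persists for under 7 ms, which makes it easy to miss.
print(tear_positions(11.1, 6))
```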
 
The difference was very obvious in the first test with all the tearing, but honestly, I couldn't really notice anything with Tomb Raider.

A bit too obvious for my taste; they picked an extreme case there.
I know the video is slowed down to 25%, but even so, the only time I have seen so much screen tearing was in 3DMark '03...

That said, I like G-Sync; for me it's an excellent solution and innovation, and I'm sure the work it does for quality is excellent too...
 