AMD announces FreeSync 2

Is there a chance a form of VRR could ever be used for movies? Allowing directors to use 24fps for “story telling” moments and ramp up to higher frame rates for action sequences? Arty blur without the judder and clear fast action all without requiring billions of terabytes for storage? Nah….
 
Huh? How can you see any of those differences on a sub-240 Hz monitor while watching a ~30 Hz YouTube video?
There is a sentence in my previous post saying that the video contains a comparison after the 7-minute mark; it doesn't matter whether the YouTube video runs at 30 Hz or 60 Hz.
 
Oh yes it does, since that comparison cannot prove anything except that the monitors are indeed running at the refresh rates they are supposed to be running at.
 
It's a multiple of 24, perfect fit for movies too (then, so is 120 Hz, but 144 Hz is still more :D )

To add to the confusion, there is 144 Hz and there is 143.86 Hz. The latter is derived from NTSC timing and is 6 × 23.976 Hz, where 23.976 is the 3:2 pulled-down frequency of the 59.94 Hz NTSC field rate. Movies are normally 24 Hz; movie content made for broadcast is normally 23.976 Hz.
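For anyone who wants to check the arithmetic, here's a quick sketch of where those odd numbers come from (plain Python, nothing monitor-specific):

```python
# NTSC-derived refresh rates: the "bit under" numbers all trace back
# to the 1000/1001 slowdown of the original 60 Hz field rate.
ntsc_field = 60 * 1000 / 1001     # 59.94 Hz NTSC field rate
film_rate = ntsc_field * 2 / 5    # 3:2 pulldown: 4 film frames -> 10 fields
high_refresh = 6 * film_rate      # the 6x multiple behind "144 Hz" panels

print(round(ntsc_field, 3))    # 59.94
print(round(film_rate, 3))     # 23.976
print(round(high_refresh, 2))  # 143.86
```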

Interestingly, when I select 144 Hz in Windows for the monitor refresh rate, the Windows desktop flickers; if I select 143.86 Hz, it is rock solid. Why? I have no idea. You'd think in 2017 we would be beyond this shit.

Cheers
 
Yah, like my dumb 59.95 Hz Dell monitor that seems to cause screen tearing every time I vsync a game. Hate it.
 
Actually, if I'm not mistaken, the "even Hz" modes in Windows are more often than not actually those "bit under" rates unless you create the mode manually. At least this used to be the case: 60 Hz really being 59.94 Hz, just rounded up, etc. That wouldn't explain why you have both 144 and 143.86 as separate options, though.
 
Is there any monitor that claims to be HDR that actually has a max brightness of at least 1000 nits and a min of 0.05 nits? Because as I understand it, that is the real reason to call a display HDR, and I still see an infamous 300 nits of brightness in "HDR" displays.
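As a back-of-the-envelope check, those two figures imply a pretty big dynamic range (the panel specs are hypothetical, just the numbers from the post above):

```python
import math

# Dynamic range implied by a 1000-nit peak and 0.05-nit black level.
peak_nits = 1000.0
black_nits = 0.05
contrast = peak_nits / black_nits  # simple static contrast ratio
stops = math.log2(contrast)        # same range in photographic stops

print(f"{contrast:.0f}:1 contrast, {stops:.1f} stops")  # 20000:1 contrast, 14.3 stops
```

A 300-nit panel with an ordinary LCD black level lands far below that, which is why those "HDR" labels feel dubious.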
 
The Xbox 1X also uses FreeSync 2. There was also mention of some TV manufacturers (e.g. Samsung) adopting the technology too, to appeal to console gamers.
The part PC gamers will like is the purported lower latencies.
 
The point of HDR is higher image quality, images closer to reality, not really high refresh rates, although companies prefer to add features and ask higher prices with higher margins. A 1080p HDR display at 120 Hz would be plenty, and more than enough for me.
 
Correct. But there is another added benefit to FreeSync 2 beyond just frames: it also reduces latency via the FS2 protocol. And there seems to be massive support from vendors to introduce this into their displays.

Again, Xbox 1X will be featuring some of this.


And speaking of 1X (& PS4) AMD technology, here is a small blurb I just read today:

"Next-Gen Compute Units (NCUs) provide super-charged pathways for doubling processing throughput when using 16-bit data types.1 In cases where a full 32 bits of precision is not necessary to obtain the desired result, they can pack twice as much data into each register and use it to execute two parallel operations. This is ideal for a wide range of computationally intensive applications including image/video processing, ray tracing, artificial intelligence, and game rendering."

-Speaking about the Playstation 4 pro.
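The "packed math" idea in that blurb is simple to picture: two FP16 values fit in the space of one FP32 value, so each 32-bit register lane can carry two operands. A toy illustration (this only shows the packing, not the GPU execution):

```python
import struct

# Two half-precision floats packed into one 32-bit word -- the storage
# trick that lets NCUs run two FP16 operations per 32-bit register.
a, b = 1.5, -2.25                  # both exactly representable in FP16
word = struct.pack('<2e', a, b)    # 'e' = IEEE 754 half-precision
assert len(word) == 4              # same footprint as a single FP32 value

x, y = struct.unpack('<2e', word)
print(x, y)  # 1.5 -2.25
```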
 
In the meantime, Nvidia needs to get into partnerships with OEMs to release ultra-expensive gaming TVs just to get variable refresh rate onto larger screens.
Well, at least there are IPS/local dimming/HDR/VRR/4K gsync screens coming down the pipe (a few might have even launched already); I haven't seen any noise about local dimming freesync screens unfortunately... :(

Hopefully there'll be some in not too terribly long. *sigh*
 
Also related: Far Cry 5 will have FreeSync 2 support (i.e. the display won't need to do its own tone mapping; the game will tone map automatically for the connected FreeSync 2 display, reducing input lag).
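Roughly, "the game tone maps for the display" means the engine compresses scene brightness to the panel's reported peak itself, so the TV/monitor can skip its own (slow) tone-mapping pass. A minimal Reinhard-style sketch of the idea; this is illustrative only, not the actual FreeSync 2 API, and the 1000-nit peak is an assumed figure:

```python
# Illustrative game-side tone map: compress scene-referred HDR nits
# to the connected display's reported peak luminance.
def tonemap(scene_nits, display_peak_nits=1000.0):
    x = scene_nits / display_peak_nits
    return display_peak_nits * (x / (1.0 + x))  # Reinhard curve, scaled to peak

# A 4000-nit highlight gets compressed to fit under the 1000-nit panel peak:
print(tonemap(4000.0))  # 800.0
```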
 