NVIDIA Maxwell Speculation Thread

Ghosting appears to be a monitor-side implementation problem.
Since the GSYNC FPGA implements the full hardware of the monitor, it's fully responsible for that part. Compare two identical LCD screens, one with GSYNC and one with FreeSync and the difference is significant.

See the explanation in the Forbes interview:
"We have anti-ghosting technology so that regardless of framerate, we have very little ghosting. See, variable refresh rates change the way you have to deal with it." ( http://www.forbes.com/sites/jasonev...nc-display-tech-is-superior-to-amds-freesync/ )

G-Sync, meanwhile, is a closed standard with a performance penalty and a licensing fee: a more expensive solution all around.
Licensing has nothing to do with engineering. Those are business decisions. As for performance penalty: you're talking about this 1%, right? 1%...

Oh, and it maxes out at V-sync speeds, meaning you can't disable v-sync if you're going too fast.
In that same article, they say this can be easily added to the driver. It's a feature that's not linked to the monitor.

Meanwhile, you forget to mention that FreeSync falls back to tearing or V-Sync at the worst possible time: when refresh rates are lowest and it's most visible.

Assuming, as it looks now, that ghosting and the like just depend on how the monitor vendor sets up support and on some driver variable, and that doing this correctly would eliminate ghosting, FreeSync is the better solution in every other way possible.
As it looks now, the ghosting thing is something that's an explicit feature of the GSYNC hardware. It's possible that FreeSync hardware has this capability as well, but if so, why didn't they enable it?
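To make the two failure modes being argued about above concrete, here is a minimal sketch of the decision a variable-refresh driver faces when the render rate leaves the panel's supported window. The window limits and the behaviour labels are hypothetical, not any vendor's actual logic:

```python
# Hypothetical sketch of the choice a VRR driver faces when the render rate
# falls outside the panel's refresh window. Numbers and labels are made up.

PANEL_MIN_HZ = 40   # lowest refresh the panel supports in VRR mode
PANEL_MAX_HZ = 144  # highest refresh the panel supports

def present_mode(render_fps, vsync_enabled):
    """Return how a frame gets presented for a given render rate."""
    if PANEL_MIN_HZ <= render_fps <= PANEL_MAX_HZ:
        return "variable refresh: scan out each frame as it completes"
    if render_fps > PANEL_MAX_HZ:
        # Above the window the display can't go any faster, so you either wait
        # (a v-sync-like cap at PANEL_MAX_HZ) or scan out mid-frame (tearing).
        return f"cap at {PANEL_MAX_HZ} Hz" if vsync_enabled else "tearing"
    # Below the window the panel must still be refreshed, so without extra
    # handling you fall back to classic v-sync on (stutter) or off (tearing).
    return "v-sync stutter" if vsync_enabled else "tearing"

for fps in (160, 90, 38):
    print(fps, "->", present_mode(fps, vsync_enabled=True))
```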
 
Since the GSYNC FPGA implements the full hardware of the monitor, it's fully responsible for that part. Compare two identical LCD screens, one with GSYNC and one with FreeSync and the difference is significant.
I'm not arguing that differences wouldn't be visible right now. But ghosting is a problem fully on the _monitor_ side of the equation. So it's more difficult to implement this correctly with some kind of adaptive sync? Well, the monitor manufacturer just has to deal with it in the display electronics. It doesn't make sense that you have to replace them with Nvidia's solution; Nvidia doesn't know anything about the particular panel that the monitor manufacturer wouldn't know.
Arguably AMD chose to hand out FreeSync-certified stickers for less-than-ideal solutions, but this doesn't mean such monitors couldn't be every bit as good as G-Sync for cheaper (let's face it, simply having another supplier in the chain is going to increase costs). I bet it won't take too long before better FreeSync monitors appear. (Though IMHO those display-electronics guys aren't quite as innovative as they should be; for instance, it took them ages until they could finally drive 4K panels from a single DP link, hence the need for the multi-stream hack, so it's probably not too surprising that Nvidia, having zero expertise in that area, was able to provide a better-working solution initially...)
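For what it's worth, the single-link 4K holdup was probably never about link bandwidth. A rough back-of-the-envelope check (assuming CVT-R2 reduced-blanking timings and 8 bits per colour; figures are approximate) suggests DP 1.2 HBR2 had the headroom all along, which rather supports the point that the display electronics were the slow part:

```python
# Rough check: can one DisplayPort 1.2 link carry 3840x2160 @ 60 Hz, 8 bpc?
# Figures are approximate (CVT-R2 reduced blanking assumed).

pixel_clock_hz = 533.25e6            # ~4K60 pixel clock with reduced blanking
bits_per_pixel = 24                  # 8 bits per colour, RGB
needed_gbps = pixel_clock_hz * bits_per_pixel / 1e9

lanes = 4
hbr2_gbps_per_lane = 5.4
effective_gbps = lanes * hbr2_gbps_per_lane * 8 / 10   # 8b/10b line coding

print(f"needed ~{needed_gbps:.1f} Gb/s, available ~{effective_gbps:.1f} Gb/s")
# -> needed ~12.8 Gb/s, available ~17.3 Gb/s, so the two-tile MST workaround
#    was about the panels' timing controllers rather than the DP link itself.
```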
 
I wonder if AMD realized the ghosting issue before VESA adopted this as a standard? :-?
This might turn out to be a short-lived standard ....
 
Since the GSYNC FPGA implements the full hardware of the monitor, it's fully responsible for that part. Compare two identical LCD screens, one with GSYNC and one with FreeSync and the difference is significant.


The main engineering drawback of G-Sync is the module itself: it adds a lot of cost, complexity and power.
 
I wonder if AMD realized the ghosting issue before VESA adopted this as a standard? :-?
This might turn out to be a short-lived standard ....
The ghosting that is caused by the display is AMD's fault?

Please remember:
#1 PCPer seems to have followed Nvidia's guide to comparing FreeSync and G-Sync.
#2 PCPer didn't really describe their "testing," the settings used, or really anything at all, so it can't be investigated further.
#3 PCPer doesn't realize that the panel in the Swift is not used in the BenQ display...
#4 There is obvious pixel overshoot on the Swift that they chose not to mention.

I will personally wait until some real hard data comes out from an in-depth review, but I am relatively sure something is sketchy with PCPer.
 
I don't want to get too OT, but if you don't trust PCPer, then try this PCLab.pl review (and they have their own videos). It's obviously something related to FreeSync since this review found:

Google Translated ...
In addition, the ghosting on the screen disappears immediately once the FreeSync function is disabled and the monitor runs at a fixed refresh rate. So what is the cause?
...
The supported refresh ranges are another not entirely clear issue. AMD has controlled this area poorly: although every monitor must be approved by AMD to officially support FreeSync (and earn the "AMD FreeSync" badge), designs whose range has a high lower bound are unfortunately still admitted. If a monitor supports FreeSync only down to 40 Hz, then at 38 frames per second no synchronization is applied at all and you have to live with tearing and/or stutter. In addition, monitors that support FreeSync only in, say, the 48-75 Hz range still pass certification, which we believe disqualifies such hardware in the eyes of gamers. AMD simply does not guarantee the frequency range over which FreeSync works and leaves it entirely to the monitor manufacturers, whereas when buying a model with a G-Sync module you can be sure that image synchronization will work even at 30 fps.

https://translate.google.com/transl...sl=pl&tl=en&u=http://pclab.pl/art62755-4.html
 
The main engineering drawback of G-Sync is the module itself: it adds a lot of cost, complexity and power.
If variable frame rates require special treatment to ensure quality at all rates, and existing hardware didn't support these kinds of features because nobody had foreseen such use, then an FPGA was the only way to get to market as quickly as they did: more than a year before anyone else.
Different solutions had different trade-offs. But it's hard to see how this was a case of bad engineering.
 
If variable frame rates require special treatment to ensure quality at all rates, and existing hardware didn't support these kinds of features because nobody had foreseen such use, then an FPGA was the only way to get to market as quickly as they did: more than a year before anyone else.
Different solutions had different trade-offs. But it's hard to see how this was a case of bad engineering.

I wouldn't call it bad engineering, just a drawback of their approach.
 
I wouldn't call it bad engineering, just a drawback of their approach.

I would not call it a drawback if it solves the problems. Which it does.
Tom Petersen: “When we invented G-Sync, we determined very early on that in order to accomplish everything we wanted, we needed to be on both sides of the problem — at the front end where we’re controlling the GPU, and the backend inside of the monitor. As FreeSync comes to market, we’ll be able to compare the different strategies and see which one’s more effective. For us, having the module inside the panel allows us to deliver what we think is a very good experience across a full range of operating frequencies for refresh rate or framerate. We have some really significant technology inside that module dealing with the low end of refresh rates. So as a game transitions from 45fps down to 25fps and back, and games get really intense? During that transition our tech kicks in and delivers a smooth experience over that transition.”

Forbes: Let’s talk about the minimum response times that both G-Sync and Adaptive Sync support.

Tom Petersen: “First of all, the spec ‘Adaptive Sync’ has no minimum. Both have the ability to communicate any range, so there’s nothing about the base specs that are different. What’s interesting though, is the reason there are panel-specific refresh limits. LCD images decay after a refresh, you kinda paint the screen and it slowly fades. That fade is just related to the panel. The reason there’s an Adaptive Sync spec and G-Sync module is because that lower limit is variable depending on the technology inside the panel. But games don’t know about that! So what do you do when a game has a lower FPS than the minimum rate you want to run your panel? Because when they run below that minimum rate things start to flicker, and that’s a horrible experience.”

Forbes: So what specifically does Nvidia do to combat that?

Tom Petersen: “I can’t go into too much detail because it’s still one of our secret sauces. But our technology allows a seamless transition above and below that minimum framerate that’s required by the panel. PC Perspective wrote an article guessing how we did that, and they’re not that far off…”

http://www.forbes.com/sites/jasonev...-display-tech-is-superior-to-amds-freesync/2/
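The constraint Petersen is describing reduces to a simple bound: because the LCD charge decays, the panel must be refreshed at least every 1/minimum-refresh-rate seconds, whether or not a new frame has arrived. A tiny illustration, assuming a hypothetical 30 Hz panel minimum:

```python
# Illustration of the low-refresh constraint described above.
# The 30 Hz panel minimum here is hypothetical.

PANEL_MIN_HZ = 30
MAX_HOLD_MS = 1000.0 / PANEL_MIN_HZ     # longest one frame may stay on screen

def needs_extra_handling(render_fps):
    """True if frames arrive too slowly for the panel to simply wait for them."""
    frame_interval_ms = 1000.0 / render_fps
    return frame_interval_ms > MAX_HOLD_MS

for fps in (45, 30, 25):
    verdict = "needs help (else flicker)" if needs_extra_handling(fps) else "fine"
    print(f"{fps} fps -> frame every {1000.0 / fps:.1f} ms "
          f"(panel limit {MAX_HOLD_MS:.1f} ms): {verdict}")
```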
 
Without wanting to get into G-Sync vs. FreeSync: I would not use any ROG Swift comparison, as that monitor has surely won first prize for the number of units sent back under warranty to the brand that produced it... It was a real disaster (I hope the problems are fixed now). I don't know how they got so many perfect samples for reviewers, when consumers needed 4 to 5 samples to get a decent monitor that more or less worked as intended. Take a ride through the Asus ROG forums or the Nvidia G-Sync forum threads; it is an experience in itself. Thankfully there are other G-Sync models to use...

(Understand, I am not saying it is related to the G-Sync module; I don't know.)

As for this "ghosting" problem, I'm not sure it is really related to FreeSync itself. Nvidia needed to correct many things in G-Sync through updates (there were big problems, strangely ignored by reviewers, that were then suddenly corrected by Nvidia), so why can't it be the same for FreeSync?
 
No, he's right: they are drawbacks. It's just that, on balance, it still outweighs the alternative.

If you take time to market into account, perhaps; but now that FreeSync is out, that doesn't matter anymore. The only thing remaining is ghosting-management, which some FreeSync models apparently do quite well, at least according to AnandTech's review. I think AMD should introduce some sort of FreeSync Gold/Premium/Whatever label to bestow on displays without ghosting issues, because the current uncertainty is a problem.

But between a good FreeSync display and a G-Sync one, I would take the one that is open, cheap, and power-efficient any day.
 
... but now that FreeSync is out, that doesn't matter anymore. The only thing remaining is ghosting-management

Really, only ghosting? When did AMD fix these problems?
If your game is rendering at 40 FPS, lower than the minimum VRR window, then you will again see the result of tearing (with VSync off) or the potential for stutter and hitching (with VSync on)

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

If the answer is this:
As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at and it could send out a duplicate frame, at a higher frame rate, to trick the display and have the same effect. It will require a great deal of cooperation between the panel vendors and AMD however, as with each new monitor release AMD would have to have a corresponding driver or profile update to go along with it. That hasn't been AMD's strong suit in the past several years though, so it would require a strong commitment from them.

AMD has not been strong with driver support, so good luck with that solution (which introduces additional problems).
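Whatever one thinks of AMD's driver track record, the duplicate-frame idea PCPer speculates about above is not conceptually complicated. A minimal sketch of that guessed approach, assuming a hypothetical 40-144 Hz window (neither vendor has published what they actually do):

```python
# Sketch of the duplicate-frame ("frame repetition") idea PCPer describes.
# The 40-144 Hz window is hypothetical; real panels and drivers vary.
import math

PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144

def refresh_plan(render_fps):
    """How many times to scan out each frame so the panel stays in its window."""
    if render_fps >= PANEL_MIN_HZ:
        return 1, render_fps                    # native rate is already in range
    repeats = math.ceil(PANEL_MIN_HZ / render_fps)
    return repeats, min(render_fps * repeats, PANEL_MAX_HZ)

for fps in (38, 25, 15):
    n, hz = refresh_plan(fps)
    print(f"{fps} fps -> show each frame {n}x, panel refreshes at ~{hz:.0f} Hz")
```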
 
It's a little problem that will be quickly fixable. I know you are working for Intel, and I know Intel will use it too really soon (I got that from Intel France). Be happy that AMD is opening the way.

I have absolutely no idea why you bring up driver support... FirePro drivers are the most reliable drivers you can find. They have a lower error-return rate than any other brand (Nvidia, Intel), so if you want to talk about driver support, you should maybe look at who made the game, not who made the driver.
 
It's a little problem that will be quickly fixable

Define little.
Define quickly.

I know you are working for Intel

Well you are wrong.

Be happy that AMD is opening the way

Why should I be happy with a broken solution that may get some sort of partial fix (which may introduce additional problems) sometime in the future?

I would rather pay for one that does not have these problems.

I have absolutely no idea why you bring up driver support...

Because AMD's history with drivers and driver support has not been stellar.
 
From Bits 'n Chips: "EVGA GTX 980 Hybrid: tests of liquid AiO before Titan 'Ultra'?" (original).

Nvidia itself is preparing the debut of a similar AiO system, but based on the GTX Titan X (so we are talking about the GM200 core), to avoid being caught unprepared at the launch of AMD's Radeon R9 390X. The kit should equip an unspecified version of the GTX Titan (Titan Ultra???), credited with clock frequencies pushed up by 250 MHz on the GPU and a TDP close to 300 W.
Perhaps we'll also see 8 Gbps memory?
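As a quick sanity check on what that would buy, assuming GM200 keeps the Titan X's 384-bit bus (which is speculation at this point):

```python
# Quick arithmetic: memory bandwidth of 8 Gbps GDDR5 on an assumed 384-bit bus.
bus_width_bits = 384
gbps_per_pin = 8
print(bus_width_bits * gbps_per_pin / 8, "GB/s")  # 384.0 GB/s, vs ~336 GB/s at 7 Gbps
```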
 
It's a little problem that will be quickly fixable. I know you are working for Intel, and I know Intel will use it too really soon (I got that from Intel France).
Really?
I come here to get away from forum crap-flinging like that. B3D seems like an island of sanity in a world of petulant children that populate the forums of wccf/videocardz/semiaccurate/fudzilla et al.
 
Really?
I come here to get away from forum crap-flinging like that. B3D seems like an island of sanity in a world of petulant children that populate the forums of wccf/videocardz/semiaccurate/fudzilla et al.

Hey, it was not an attack. I really thought he was part of Intel, as we have some people here who are part of AMD or of some gaming company; I just made a mistake with his name. I have friends who work for Intel; I have absolutely no problem with that. Why do people want to read it as an attack? That was not my purpose. What I mean is that today the monitor brands and scaler producers, plus AMD, are investigating how to fix this. And I know Intel is now looking at how they can take advantage of adaptive refresh rates.
 