Open letter to AMD Staff - FreeSync and my misconceptions

Panels ship without scalers frequently - my 30" panel has a native 2560x1600 and only integer subdivision modes of that (e.g. 1280x800) because it doesn't have a scaler. Although in this instance we're not talking about bypassing the scaler, but working with it.
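(A minimal sketch of that point, with illustrative numbers rather than anything from a real panel: without a scaler, the only lower modes that work are exact integer subdivisions of the native resolution, because each logical pixel then maps onto an N x N block of physical pixels that plain duplication can fill.)

```python
# Illustrative only: which modes a scaler-less panel could display via
# simple N x N pixel duplication (no interpolation hardware required).
NATIVE_W, NATIVE_H = 2560, 1600

def duplication_factor(w, h):
    """Return the integer duplication factor if (w, h) divides the
    native resolution evenly on both axes, else None."""
    if NATIVE_W % w or NATIVE_H % h or NATIVE_W // w != NATIVE_H // h:
        return None
    return NATIVE_W // w

for mode in [(2560, 1600), (1280, 800), (1920, 1200), (1280, 720)]:
    f = duplication_factor(*mode)
    print(mode, "-> %dx duplication" % f if f else "-> needs a scaler")
```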

So are you stating that this doesn't require a variable refresh control board? Because PCPer did a follow-up with Koduri, and they were told that free-sync, like g-sync, requires a variable refresh control board.

Please correct me if I'm under a mistaken impression here, but what you're saying is that it doesn't. Koduri indicated that it does. PCPer and several other websites did follow-ups stating that, so it is no different from g-sync in that respect (it requires the monitor to have a variable refresh control board).

Can you reconcile your statement with the one Koduri provided to PCPer? Apparently there is conflicting information here. Your head graphics guy told PCPer that it does require the monitor to have a variable refresh control board. And desktop monitors obviously do not use eDP.
 
Wow... where to begin...

I see. Well. Perhaps that should have been made clearer from the outset. Numerous websites reported on this technology demonstration and gave users the impression that a free g-sync alternative was on the horizon and coming soon.
Uhh, no. Clearly it gave *you* that impression, but don't speak for everyone. Personally, when the pictures involve two laptops sitting on some random desk, it's clearly tech demo territory with nothing to do with products. But in case it wasn't clear, and since you mentioned the AnandTech article, let me quote:

AnandTech said:
AMD isn’t ready to productize this nor does it have a public go to market strategy, but my guess is we’ll see more panel vendors encouraged to include support for variable VBLANK and perhaps an eventual AMD driver update that enables control over this function.

That's pretty clear, and the other websites are too. What you read into this isn't what the media reported this time, and believe me, I'm the first to criticize misreporting :)

velnias said:
I still have an issue with the term "free-sync".
Meh, whatever. Cry about it and "OpenGL" and other names all you want... it's clearly a tech demo and an internal name. If it were a product name you *might* have some grounds to stand on, but even then it's "free" in roughly the same way that GL is "open".

Please correct me if I'm under a mistaken impression here, but what you're saying is that it doesn't.
There's no contradiction at all! Dave didn't say anything about not requiring a controller that understands variable VBLANK or anything else. You're putting words in other people's mouths that they are not saying and then bitching about contradictions.

Regarding your other issues... I guess you didn't get the memo that multi-GPU sucks hard, regardless of who it's from (CF may well suck more than SLI, but make no mistake, they are both terrible). Sorry to hear it, but next time read some of the tech threads around B3D or some of the reviews at TechReport and such first.

And when you say stuff like "NVIDIA has never lied to me"... just lol. Come on dude...

Personally I make no secret of my general dislike of marketing, but AMD is no more evil than anyone else in that area.
 
If I misunderstood him, I apologize. That's what I interpreted, though - and this underscores the fact that the press was given sparse information about free-sync in the first place, and we're now having to fill in blanks.

I understand that marketing is doing their thing, but for what it's worth, after having used eyefinity for gaming on the AMD side for several years and then going NVIDIA starting with their Kepler GPUs, it's pretty much a night and day difference in terms of software support and NVIDIA following up on issues. That was the point I was trying to get across in the OP - meanwhile, AMD hasn't fixed bugs that have existed for longer than 2 years. To me that's a clear indication that AMD's software team needs an overhaul.

I won't dwell on that issue any longer, but the connection is that, as a prior AMD user, I feel I was misled on numerous occasions - especially as an eyefinity user. It was just greatly disappointing. So when I saw free-sync, it also felt misleading. But maybe it isn't AMD's fault; perhaps it is mis-reporting by the press like you said.

Even though it could be mis-reporting, I do feel that AMD should have been more forthcoming with details. I initially read about free-sync at TechReport, and based on that, my impression was that an alternative was coming and that it wouldn't require any new hardware. As we now know, that isn't the case.

Like I said: misreporting, perhaps. Yet AMD could have prevented this by being more forthcoming with details. Better yet, I don't think AMD should have called this "free-sync" or publicly demoed it before it was more or less ready to go. With all due respect, this is just my opinion; I'm not trying to start an argument or anything.
 
I can't speak to eyefinity as I haven't used it. But as far as the press for the free-sync stuff goes, I don't really think it's fair to call anything AMD did misleading. Let's take the TechReport article that you mention now too:
http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

Even the title says "could counter"... not "announces new product" or something. Some relevant quotes from the article:
TechReport said:
During an impromptu meeting in a hotel ballroom this morning...
TechReport said:
The term "free sync" is already being spoken as shorthand for this technology at AMD.
TechReport said:
AMD is still in the early stages of cooking up a potential product or feature along these lines, and it has nothing official to announce just yet.
TechReport said:
PC enthusiasts and gamers who want to see "free sync" happen should make dynamic refresh support a requirement for their next monitor purchase. If monitor makers get the message, then it seems likely AMD will do its part to make dynamic display synchronization a no-cost-added feature for Radeon owners everywhere.

Does any of that sound to you like they are announcing a product that you can use today, or even soon? It doesn't to me. It's just a tech demo, and that's completely fine.
 
Keep in mind that the article is heavily modified from the original one that appeared on the 6th; this is noted at the bottom of the article. After reading it on the 6th, I had the distinct impression that a free g-sync killer would be coming.

In any case, I'm not disagreeing with you; it could be mis-reporting by various press outlets. But you do need to consider that PCPer in particular did a follow-up because initial reports didn't cover a lot of ground. We're having to fill in various blanks now. Know what I mean? I wish those details had been provided from the outset.
 
But at this point won't gsync also be free, since you won't need a dedicated gsync module?

As I understand it, it's NVIDIA's proprietary technology, they're charging for it, and they're not allowing non-NVIDIA graphics cards to run it, so no. But I could have misunderstood something.

Of course, there's every reason to believe that NVIDIA will support FreeSync, which will effectively make GSync pointless when the former becomes widespread.
 
No, at CES they have monitors announced and ready, so they have already left the prototype stage.
Announced and not yet available. What gets sold starting this week is the FPGA solution.
I didn't want to say it like this, but AMD, on the other hand, is acting like they are spreading FUD
Andrew L. already said something about this.
trying to spoil NVIDIA's parade; they deliberately jumped through 3 different misleading theories as to how this thing works and only spilled the truth after NVIDIA's reply.
Generally, how should AMD know how nV's stuff works, when nV never bothered to spill any details until after AMD's demo (where Tom Petersen basically confirmed that AMD does the same thing)?
they have no plans to implement until the DP1.3 is finalized, they have no monitors,
Exactly! AMD doesn't produce monitors, so they will never implement it in monitors. But they said that their GPUs are capable of it ;).
They expressed their opinion that a lot of DP1.3 devices will have variable vblank support, but as said several times before, DP1.3 is not necessary to get that feature, as even nV itself said.
no decent demos, no drivers .. nothing.
So you are complaining that AMD didn't have a fancy demo at hand, when they just took two cheap notebooks for an impromptu demonstration to show that their production GPUs can basically handle variable refresh rates? The purpose was probably to notify people of this and increase the momentum for getting support among monitor manufacturers.
Many people on many forums said it can't be done,
Who said one cannot bring support for variable vblank intervals to desktop monitors? It's probably true that the vast majority of monitors currently on the market don't support it. But there may be exceptions where the monitor electronics are already capable of handling it. And as this is a relatively simple thing to support (refresh-locked, PWM-regulated backlights and backlight strobing likely need more effort), it has the potential to get picked up relatively fast if the monitor manufacturers feel the demand, at least in the niche for gaming monitors. And one probably doesn't have to wait for DP1.3 for support, as the first gsync monitors are also not DP1.3 compatible and do just what's needed.
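To make the "variable vblank" part concrete, here is a back-of-the-envelope sketch. The timing numbers are made up (roughly plausible for a 1440p panel, not taken from any real EDID): the pixel clock stays fixed, and the source simply stretches the vertical blanking interval, which stretches the time between refreshes.

```python
# Rough sketch of how extending VBLANK lowers the effective refresh rate.
# All timing values below are illustrative, not from any real monitor.
PIXEL_CLOCK_HZ = 241_500_000   # fixed pixel clock
H_TOTAL = 2720                 # active width + horizontal blanking
V_ACTIVE = 1440                # visible lines
V_BLANK_NOMINAL = 41           # normal vertical blanking lines (~60 Hz)

def refresh_hz(extra_vblank_lines=0):
    """Effective refresh rate when the source adds extra blanking lines."""
    v_total = V_ACTIVE + V_BLANK_NOMINAL + extra_vblank_lines
    return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

print("nominal vblank: %5.1f Hz" % refresh_hz())      # ~60 Hz
print("+500 lines:     %5.1f Hz" % refresh_hz(500))   # ~45 Hz
print("+1500 lines:    %5.1f Hz" % refresh_hz(1500))  # ~30 Hz
```

Nothing in that arithmetic requires DP1.3; it only needs monitor electronics that tolerate the longer blanking period.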
 
Of course, there's every reason to believe that NVIDIA will support FreeSync, which will effectively make GSync pointless when the former becomes widespread.

That's what I mean. Gsync appears to be a hardware+software solution. The software (drivers) are free. Nvidia asks you to pay for the hardware because right now no monitor implements its own variable refresh rate scaler, and thus Nvidia developed one which it charges for. That sounds fair to me. But once monitors implement their own variable refresh rate scalers, the Gsync module wouldn't be required, and Nvidia would pretty much have to support freesync if it doesn't want to be at a massive competitive disadvantage.

So we have AMD who say 'we'll give you variable refresh rates for free once the monitor vendors support it' and we have Nvidia saying 'we'll give you early access to variable refresh rates if you pay a premium for our hardware which nullifies the lack of such hardware in current monitors'. I don't really see a problem. And the fact is that without Gsync there probably wouldn't have been such a thing as Freesync on the horizon at all.
 
Announced and not yet available. What gets sold starting this week is the FPGA solution.
Yes, the scaler in these monitors has been replaced with the FPGA chip, so no duplicate or unfit solutions here.

Who said one cannot bring support for variable vblank intervals to desktop monitors?
That's not the point; the point is that internal scalers can't be bypassed to allow external scalers supporting variable rates to be added later.

And still, all I see is maybe, probably, potential... there is no definitive, solid information just yet. AMD is trying to formulate a concept they still haven't put much effort into. They should come up with a solid plan instead of throwing punches in the dark to cover up being late to the party.
 
That's what I mean. Gsync appears to be a hardware+software solution. The software (drivers) are free. Nvidia asks you to pay for the hardware because right now no monitor implements its own variable refresh rate scaler, and thus Nvidia developed one which it charges for. That sounds fair to me. But once monitors implement their own variable refresh rate scalers, the Gsync module wouldn't be required, and Nvidia would pretty much have to support freesync if it doesn't want to be at a massive competitive disadvantage.

So we have AMD who say 'we'll give you variable refresh rates for free once the monitor vendors support it' and we have Nvidia saying 'we'll give you early access to variable refresh rates if you pay a premium for our hardware which nullifies the lack of such hardware in current monitors'. I don't really see a problem. And the fact is that without Gsync there probably wouldn't have been such a thing as Freesync on the horizon at all.

The problem is that you then end up with a (power-hungry) monitor that will only support variable refresh rates with GeForces, which locks you into NVIDIA hardware until you buy a new monitor. There was no need for that and it makes the prospect highly unappealing to me, but if some people are fine with it, good for them.

But NVIDIA certainly deserves credit for realizing that this technology no one was paying attention to had the potential to significantly improve smoothness in games and for marketing it aggressively. In fact, it's rather curious that no one (apparently) thought of it before. In any case, I think things are turning out for the best.

I'll take three ≥28" 4K IPS/OLED monitors with VRR support, please. :)
 
Announced and not yet available. What gets sold starting this week is the FPGA solution.

They expressed their opinion that a lot of DP1.3 devices will have variable vblank support, but as said several times before, DP1.3 is not necessary to get that feature, as even nV itself said.
[...] The purpose was probably to notify people of this and increase the momentum for getting support among monitor manufacturers.

Who said one cannot bring support for variable vblank intervals to desktop monitors? It's probably true that the vast majority of monitors currently on the market don't support it. But there may be exceptions where the monitor electronics are already capable of handling it. And as this is a relatively simple thing to support (refresh-locked, PWM-regulated backlights and backlight strobing likely need more effort), it has the potential to get picked up relatively fast if the monitor manufacturers feel the demand, at least in the niche for gaming monitors. And one probably doesn't have to wait for DP1.3 for support, as the first gsync monitors are also not DP1.3 compatible and do just what's needed.
This reminds me a little bit of GPU computing, in the sense that there's something that could help the vendors generate revenue and that people could make use of. Nvidia is pursuing the path of seeding the necessary tools to create that market for itself, and in the course of that it tries to protect its investment with proprietary technology.

AMD in the meantime is making their own products ready for the same market, shows it to people and makes its part of the ecosystem as open as they possibly dare, hoping other companies pick up the crumbs and fill in the blanks.

Both approaches have their respective merits I think and while it is vital to have open access to standards in the long run, it is also critically important that the market is able to develop at all, for which "wait and see" is not always the right approach.
 
I'll keep an eye out for this, but as I said I have not seen any of those yet (except my own and the other 1st-gen 30-inch LCDs).

My HP 30" monitor also doesn't have a scaler (fantastic panel, cheap price at the time). I've also owned a few cheapo 24" IPS monitors that didn't have scalers.

Many people who aren't in business procurement likely won't see them, as gaming monitors will almost always have a scaler.

Regards,
SB
 
This reminds me a little bit of GPU computing, in the sense that there's something that could help the vendors generate revenue and that people could make use of. Nvidia is pursuing the path of seeding the necessary tools to create that market for itself, and in the course of that it tries to protect its investment with proprietary technology.

AMD in the meantime is making their own products ready for the same market, shows it to people and makes its part of the ecosystem as open as they possibly dare, hoping other companies pick up the crumbs and fill in the blanks.

Both approaches have their respective merits I think and while it is vital to have open access to standards in the long run, it is also critically important that the market is able to develop at all, for which "wait and see" is not always the right approach.

I had the same thought but there's a big difference: GPGPU needed (and still needs) colossal investments in hardware and software (languages, compilers, tools, education, etc.) that NVIDIA had to make because no one else would.

Here it seems to be a fairly simple problem that scaler and monitor manufacturers ought to be able to solve pretty easily by following the VESA standard.
 
Yes, the scaler in these monitors has been replaced with the FPGA chip, so no duplicate or unfit solutions here.
Putting in a quite expensive FPGA board with 768MB of RAM to replace some small ASIC with very little RAM is basically the definition of a rushed, prototype-like solution. That's exactly what FPGAs are widely used for: small-quantity custom electronics where the base cost of an ASIC is prohibitive, and prototypes ;).
I know it is expected that the FPGA will be replaced by an ASIC later on (then the solution will be virtually the same as with a "freesync"-supporting monitor: the monitor electronics will simply support variable vblank, but with an additional nV vendor lock-in?). But then you can't make the argument that nV is now ready with a polished product.
 
PC Perspective enjoys the amount of hits they are getting for what is almost certainly ridiculous extrapolation and plain fabrication. Not a direct quote to be found in the entire mess.

In my opinion, all AMD showed is that the DisplayPort protocol (1.0+) doesn't care about the vblank interval, at least one monitor doesn't either, and their graphics cards are flexible enough to vary it. All this shit about scalers, DisplayPort 1.3, etc etc... all completely unattributed and, in my opinion, bullshit. Stuff like history-based overdrive and strobing backlights need to interact with the vblank period, but not the scaler... it just takes five during the vblank.
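For what it's worth, here's a toy sketch of the kind of per-frame decision a source could make under a variable-vblank scheme. This is hypothetical logic of my own, not AMD's or NVIDIA's actual driver code: start the next scanout as soon as a frame is ready (but no faster than the panel's minimum refresh interval), and if nothing arrives before the panel's maximum blanking time, just repeat the previous frame.

```python
import time

# Toy model of source-side frame pacing with a variable vblank.
# Hypothetical logic for illustration, not any vendor's actual behaviour.
MIN_FRAME_S = 1 / 144   # panel's fastest allowed refresh interval
MAX_FRAME_S = 1 / 30    # longest the panel can hold a frame

def pace_frames(frame_ready, scan_out, repeat_last):
    """frame_ready/scan_out/repeat_last are caller-supplied callbacks."""
    last_scanout = time.monotonic()
    while True:
        elapsed = time.monotonic() - last_scanout
        if frame_ready() and elapsed >= MIN_FRAME_S:
            scan_out()            # end the blanking interval: new frame
            last_scanout = time.monotonic()
        elif elapsed >= MAX_FRAME_S:
            repeat_last()         # panel's limit reached: resend old frame
            last_scanout = time.monotonic()
        else:
            time.sleep(0.0005)    # keep stretching the vblank a bit longer
```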
 
Thread closed as I consider whether joining B3D to post an AMD letter is really appropriate or not. I hadn't twigged this was velnias's only contribution to this board when I posted earlier.
 