AMD: Speculation, Rumors, and Discussion (Archive)

Neural networks have much more impact than just cars ;) and that market will be bigger than the HPC market. Cars are just the start of it.

Don't expect these GPUs to go into "affordable" notebooks, more like 1k and up. Affordable notebooks will stick with iGPUs. That won't change any time soon.

This was from an old comic strip; I think it was Calvin and Hobbes:

do you think mr. coffee knows more than it's letting on to?

It was pretty funny back then but the implications of that are pretty strong.
 
I'll ask here because I can't find a better thread.

Why does HBM2 have only one frequency tier?
A single 8-Hi KGSD stack at a higher frequency could be sufficient for a mid-range GPU, and would allow a smaller, simpler, cheaper interposer.
 
Why does HBM2 have only one frequency tier?
A single 8-Hi KGSD stack at a higher frequency could be sufficient for a mid-range GPU, and would allow a smaller, simpler, cheaper interposer.

As I understand it, the frequency of the HBM interface is chiefly limited by the interposer, not the attached modules. For higher frequencies you'd need active interposers, and that would get really expensive in a hurry.
 
You think that the people who trade in trillion dollar markets are waiting for cheap FLOPs?


Oh, they are not. One of the projects I was on at Credit Suisse seven years back was already working on that, but of course it takes time for the AI to learn how the financial models shift, and with all the different markets and variables it's quite complex. Taking something like this from a macro to a micro view is very challenging in terms of the processing power necessary to do it. But they did upgrade to Fermi and CUDA when it came out, and I'm sure they are looking into expanding that endeavor, but it always takes time. They are still not using it for the mainstream yet.
 
As I understand it, the frequency of the HBM interface is chiefly limited by the interposer, not the attached modules. For higher frequencies you'd need active interposers, and that would get really expensive in a hurry.
But the same passive interposer doesn't seem to have issues passing through PCIe lanes. So limited in what way?
 
I think there's not much need for horizontal travel for off-package IO in the interposer. The PCIe interface is engineered to drive fewer, longer connections over PCB, whereas an HBM interface is not. I'm not sure how indirect the lines can be for HBM, but while short compared to a PCB trace, the wires might be long enough to be considered long for a metal-layer connection.
Perhaps the idea for an active interposer in this case would be one with repeaters built into it for the various lines to reduce wire delays.

Interestingly, the disclosure for HBM2 also included a change in the physical dimensions of the stack packaging. These are larger than they were before, which might explain why all the pre-release mockups for on-package memory seemed so large compared to the HBM models we've seen, and possibly contributed to the spacing left between stacks.
Whether the bigger packages are going to constrain something in the future, by impinging on the patterned area of the interposer or requiring additional margins at the periphery, remains to be seen.

edit: http://www.anandtech.com/show/9969/jedec-publishes-hbm2-specification
5.48mm x 7.29mm vs. 7.75mm x 11.87mm
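
For a sense of scale, the quoted dimensions work out to roughly 2.3x the footprint per stack. A quick back-of-the-envelope check of my own, not from the article:

```python
# Stack package footprints from the JEDEC figures linked above (mm).
hbm1_area = 5.48 * 7.29     # ~39.9 mm^2
hbm2_area = 7.75 * 11.87    # ~92.0 mm^2

print(hbm2_area / hbm1_area)  # ~2.3x the interposer area per stack
```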
 
I think the frequency limitations have more to do with the power/performance aspect, not anything limited by the interposer or anything else.
 
Charlie has been quite chirpy on the forums for a couple of days. Apparently, there are three chips from AMD, like the earlier rumors, only the bigger one is not named Greenland anymore (wccftech ran a story about it being Vega 10). The two chips on Zauba are the small ones, and the biggest one wasn't on public display. OTOH he was clueless about the Polaris 10/11 branding. So semiaccurate, I guess.

I don't know how it could have been any clearer :S See for yourself:
https://www.techinferno.com/index.p...ios-for-p377sm/&do=findComment&comment=127850
It's rarely done because flashing a new vBIOS means the laptop's BIOS will stop recognizing the graphics card, and there's no practical gain from using one BIOS over the other.

So not much of a success.

Because AMD wanted to give the impression that something was new when they rebranded Pitcairn for the 99th time?

That'd be the first rebrand, and apparently that's why the 280X isn't there. I think this discussion has run its course; considering you liked Dave's comment here, I don't think I should bother any further.

https://forum.beyond3d.com/threads/...ation-rumor-thread.55600/page-90#post-1851125
 
I am wondering: why not go with Y16Cr10Cb10 without chroma subsampling (Cr/Cb also at full resolution)? It would give much better dynamic range and take identical bandwidth/storage to R12G12B12.
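
(For anyone checking the storage claim: both packings come out to the same per-pixel bit budget. A trivial sketch of my own, just to make the arithmetic explicit.)

```python
# Per-pixel bit budgets for the two packings mentioned above.
y16_cr10_cb10 = 16 + 10 + 10   # 16 bit luma + two 10 bit chroma channels, no subsampling
r12_g12_b12   = 12 + 12 + 12   # 12 bits per RGB channel

print(y16_cr10_cb10, r12_g12_b12)  # 36 and 36 bits/pixel -> identical bandwidth/storage
```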

[Image: Various_gamma_curves.jpg]


10 bit PQ already supports 0~10000 cd/m2 of dynamic range, so code-level-wise it's very thrifty. :D It's not without cons, though. It means a fixed reference for peak brightness, whereas gamma is relative to whatever the display is capable of, so if your display can't reach 10000 cd/m2, or your content is graded at a lower peak luminance, the grayscale code levels beyond that point will simply be clipped out, leaving you with much less.

It's Dolby's way of encouraging industry people to move towards 10000 cd/m2 as soon as possible, even though 10000 cd/m2 is overkill for 10 bit grayscale. The majority of UHD Blu-ray movies using the mandatory 10 bit HDR layer (HDR10) will only use up to 1000 cd/m2 for the same reason, even though they could go higher. Beyond that, 12 bit grayscale makes more sense, and that's exactly what Dolby is doing. All Dolby Vision content has been graded at 0.005~4000 cd/m2 already, and if you play that content on today's displays, the highlights will be automatically tone-mapped to the maximum luminance the display is capable of.
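
Here's roughly what the encoding side being described looks like; a minimal Python sketch of my own (function names are mine), using the published ST 2084 constants and BT.2100 10 bit narrow-range quantization:

```python
# A minimal sketch of the SMPTE ST 2084 "PQ" inverse EOTF:
# absolute luminance in cd/m^2 -> normalized PQ signal -> 10 bit code level.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(luminance: float) -> float:
    """Linear luminance (0..10000 cd/m^2) -> PQ signal (0..1)."""
    y = min(max(luminance / 10000.0, 0.0), 1.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

def to_10bit(signal: float) -> int:
    """PQ signal -> 10 bit narrow-range code (64..940)."""
    return round(64 + 876 * signal)

# Content graded above the display's own peak has to be tone-mapped (or it clips).
for nits in (0.005, 1, 100, 1000, 4000, 10000):
    print(f"{nits:>8} cd/m2 -> PQ {pq_encode(nits):.4f} -> code {to_10bit(pq_encode(nits))}")
```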
 
Beyond that, 12 bit grayscale makes more sense, and that's exactly what Dolby is doing. All Dolby Vision content has been graded at 0.005~4000 cd/m2 already, and if you play that content on today's displays, the highlights will be automatically tone-mapped to the maximum luminance the display is capable of.
Good to hear that, because luminance definitely needs more bits than chroma (even on LDR content, and even more so on HDR).
 
Good to hear that, because luminance definitely needs more bits than chroma (even on LDR content, and even more so on HDR).

I bet you had to cry whenever you had to cut lighting (luminance) resolution by 1/2 or 1/4 instead of cutting down color resolution. Human eyes are much more sensitive to luma than chroma.

PQ is still non-constant-luminance based, so more grayscale bits have been reserved for the dark/shadow range, as human eyes are most sensitive to luminance change in that range. In PQ, only 7% more code level is required going from 1000 cd/m2 to 10000 cd/m2.

I have a question for you though. When do you expect PQ-based HDR to take off in games? I heard a rumor that Dolby Vision may be added to Unreal Engine 4. Correct me if I'm wrong, but I think the biggest challenge in getting HDR mainstream is not with engines, but rather with artists and the monitors they use. I really want to play your next game in HDR, but I'm worried because it will be a big hassle to support both 8 bit SDR + sRGB and 10 bit HDR + DCI-P3 displays.

Currently, HDR video encoded in PQ does not look correct on legacy SDR displays, and a fix for this is coming in HDMI 2.1; until then, you have to apply SDR tone mapping manually. I'm wondering what monitors your artists use. They are at least 10 bit and DCI/AdobeRGB, correct? You will need to replace monitors with ones that officially support the PQ EOTF, so this really is a budget issue... I heard from another console developer that, starting from this console generation, games are being developed with future remastering in 4K (and higher) + HDR in mind, so the forced 8 bit swap chain / tone mapping can be intercepted more readily than in the last generation. Do you have anything to add to this?
 
I'd expect that as soon as HDR monitors enter the high-end mainstream, IHVs will gladly add driver hacks to make AAA titles output HDR images. It seems to my naive self that modifying the PostFX HDR-to-SDR chain (for selected titles) is doable (throw interns at the problem) and a good PR exercise. It will be fragile. Maybe I'm underestimating the problem. And of course the independent middleware and vendor-specific libraries should be quick to adopt it. Look how quickly they added VR support (barely in the market, if one counts the Oculus pre-release hardware), and I wouldn't expect VR to surpass HDR market penetration in the long run. (And of course the holy grail is HDR VR 8K 165 fps output, because I don't care about my energy bills.)
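
To make the SDR-vs-HDR output path concrete, here's a rough Python/numpy illustration of my own (not anyone's actual engine or driver code) of what the end of a post-processing chain might branch on: tone-map and sRGB-encode for an 8 bit SDR target, or scale to absolute luminance and PQ-encode for a 10 bit HDR10 target. The paper-white level and the simple Reinhard operator are placeholder choices.

```python
import numpy as np

def srgb_encode(x):
    """Linear -> sRGB transfer function."""
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * np.power(x, 1 / 2.4) - 0.055)

def pq_encode(y):
    """Normalized luminance (1.0 = 10000 cd/m^2) -> PQ signal, per ST 2084."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    yp = np.power(np.clip(y, 0.0, 1.0), m1)
    return np.power((c1 + c2 * yp) / (1.0 + c3 * yp), m2)

def output_stage(frame, hdr_display, paper_white_nits=200.0):
    """'frame' is a linear-light HDR buffer in relative units (1.0 = SDR white)."""
    if hdr_display:
        # HDR10 path: map 1.0 to a chosen paper-white level, PQ-encode, quantize to
        # full-range 10 bit for simplicity (a real pipeline also converts to BT.2020).
        nits = frame * paper_white_nits
        return np.round(pq_encode(nits / 10000.0) * 1023).astype(np.uint16)
    # SDR path: compress highlights with a simple Reinhard tone map, then sRGB to 8 bit.
    tone_mapped = frame / (1.0 + frame)
    return np.round(srgb_encode(np.clip(tone_mapped, 0.0, 1.0)) * 255).astype(np.uint8)

# Example: a gradient from black up to a 16x-overbright highlight.
frame = np.linspace(0.0, 16.0, 8)
print(output_stage(frame, hdr_display=False))
print(output_stage(frame, hdr_display=True))
```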
 