AMD Vega 10, Vega 11, Vega 12 and Vega 20 Rumors and Discussion

What about FP32? It doesn't seem to have support for proper FP32 IEEE 754 ops, unless I missed something. But perhaps it's still usable in many cases.
FP32? Good question. I thought the second gen had it, but I may just have imagined that...
In any case, Google says they can use v2 for training as well. Maybe they have some custom format that isn't quite as complete as FP32.
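For what it's worth, the reduced-precision format Google documented for TPUs is bfloat16: an FP32 value with the mantissa truncated to 7 bits, keeping the full 8-bit exponent range (so it covers FP32's dynamic range at much lower precision). A minimal Python sketch of that truncation, just to illustrate the idea:

```python
import struct

def fp32_to_bfloat16_bits(x: float) -> int:
    """Truncate an IEEE 754 FP32 value to bfloat16 by keeping its top 16 bits."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    return bits >> 16  # 1 sign bit + 8 exponent bits + 7 mantissa bits remain

def bfloat16_bits_to_fp32(b: int) -> float:
    """Expand bfloat16 bits back to FP32 by zero-filling the low 16 bits."""
    return struct.unpack("<f", struct.pack("<I", b << 16))[0]

# Round-tripping pi keeps the exponent exactly but only ~2-3 decimal digits:
y = bfloat16_bits_to_fp32(fp32_to_bfloat16_bits(3.14159265))  # -> 3.140625
```

The appeal for training is that gradients rarely need FP32's precision, but they do need its range, which is exactly what this layout preserves.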
 
Close to a 1080ti is a big statement, I doubt it right now... But, I agree itsmydamnation will enjoy the card anyway.
 
It will end up very close to the 1080ti. Great buy and a fantastic card.

Or it could end up in a worse spot than now if forward rendering + MSAA makes a resurgence.

Buying hardware on vague gut feelings about promised driver improvements is stupid, and I expected people here, of all places, to call it out.
 
Or it could end up in a worse spot than now if forward rendering + MSAA makes a resurgence.

Buying hardware on vague gut feelings about promised driver improvements is stupid, and I expected people here, of all places, to call it out.
You want to place bets on that? I'm willing to put serious coin on it.


Even if some method of forward rendering takes off, will you still see ROP-based MSAA? Sebbbi has even said he doesn't care about MSAA because, compared to current methods, he thinks it's inferior.
 
Yeah BTW, any news what's the deal with bad MSAA performances ?
Honestly, I don't care anymore. These days I disable MSAA in any game that features post-processing AA, and I no longer force it through drivers anywhere.

MSAA was always a half-botched solution to AA, in that it gobbles huge amounts of RAM and memory BW, while only addressing edge aliasing. It's not efficient.
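The memory cost is easy to ballpark. A rough sketch, assuming uncompressed 32-bit color and 32-bit depth per sample (real GPUs use color/depth compression, so actual bandwidth is lower, but allocation is in this ballpark):

```python
def msaa_framebuffer_mb(width: int, height: int, samples: int,
                        bytes_color: int = 4, bytes_depth: int = 4) -> float:
    """Approximate size in MiB of a multisampled color + depth render target."""
    pixels = width * height
    return pixels * samples * (bytes_color + bytes_depth) / 2**20

msaa_framebuffer_mb(1920, 1080, 1)  # ~15.8 MiB without MSAA
msaa_framebuffer_mb(1920, 1080, 4)  # ~63.3 MiB at 4x MSAA
```

Quadrupling the sample storage for a technique that only cleans up geometry edges is why it looks so expensive next to post-process or temporal methods.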
 
Temporal anti-aliasing methods are where it's at. How do TXAA and similar methods perform on Vega, I wonder?
 
You want to place bets on that? I'm willing to put serious coin on it.


Even if some method of forward rendering takes off, will you still see ROP-based MSAA? Sebbbi has even said he doesn't care about MSAA because, compared to current methods, he thinks it's inferior.

VR. Forward rendering and ROP-based AA make sense when latency and blur make your users puke. And with how much money is dumped into VR, I'd expect engine vendors to pivot towards serving this niche.

But that's missing my entire point: pulling a random worst-case scenario for Vega is exactly as rooted in fact as conjuring performance increases over time extrapolated from previous generations (who am I kidding, it's always Tahiti vs Kepler), with a healthy dose of missing-feature conspiracies.
 
Temporal anti-aliasing methods are where it's at. How do TXAA and similar methods perform on Vega, I wonder?
TXAA is NVIDIA-exclusive. TAA is available for both vendors, but it only works on Frostbite and Unreal for now. Most games use post-processing methods: FXAA, MLAA, SMAA.

DSR killed the MSAA star.
I agree for medium-sized resolutions like 720p and 1080p, but at 1440p and 2160p the cost of DSR is simply too high.
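That cost is just pixel arithmetic. A quick sketch (illustrative numbers; DSR at factor 4x renders at four times the native pixel count, then downsamples):

```python
def dsr_render_pixels(width: int, height: int, factor: float = 4.0) -> int:
    """Pixels actually shaded per frame when supersampling at the given DSR factor."""
    return int(width * height * factor)

dsr_render_pixels(1920, 1080, 4.0)  # ~8.3 MP, i.e. shading a 4K frame for a 1080p panel
dsr_render_pixels(3840, 2160, 4.0)  # ~33.2 MP, roughly 4x the shading work of native 4K
```

At 1080p the supersampled frame is "only" 4K-sized, which midrange cards can absorb; at 2160p it balloons to a 33-megapixel render, which nothing current can drive at playable frame rates.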
You want to place bets on that? I'm willing to put serious coin on it.
His post is directed to the claim of Vega 64 edging 1080Ti, your argument of Vega 56 edging 1070 in time is rather reasonable. Unless you meant the former.
 
VR. Forward rendering and ROP-based AA make sense when latency and blur make your users puke. And with how much money is dumped into VR, I'd expect engine vendors to pivot towards serving this niche.
Yet current hardware excels at that, and the VR push has stalled, primarily because the technology just isn't quite there. The latency issue is why a reprojected solution with reconstruction is likely the best answer: VR demands hitting the HMD's refresh rate every frame with near-zero added latency. Async spacewarp and high-priority async compute changed the game there, and it's only an early test of the techniques.
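The frame-time arithmetic behind reprojection is simple. A sketch, assuming the common 90 Hz HMD refresh:

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Time available per displayed frame at a given refresh rate."""
    return 1000.0 / refresh_hz

frame_budget_ms(90)  # ~11.1 ms per frame the compositor must deliver
# Async spacewarp lets the app render at half rate (~22.2 ms per app frame)
# while the compositor reprojects the previous frame for the in-between
# refreshes, so head tracking still updates every 11.1 ms.
```

That is why the reprojection path wins: missing an 11 ms deadline outright means a dropped frame and nausea, whereas warping a stale frame keeps rotational latency low even when the app can't keep up.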

His post is directed to the claim of Vega 64 edging 1080Ti, your argument of Vega 56 edging 1070 in time is rather reasonable. Unless you meant the former.
Is there really any doubt Vega will overtake 1080ti in the future? The features to do it in the short term are there with packed math, primitive shaders, and DSBR. Longer term bindless resources, Tier3 features, GPU driven rendering, and SM6. Not to mention the historical trend of Nvidia cards aging rather badly to encourage sales. That seems to be readily evidenced with all the cache added to both Vega and Volta and Volta's inclusion of hardware scheduling at various levels.
 
VR. Forward rendering and ROP-based AA make sense when latency and blur make your users puke. And with how much money is dumped into VR, I'd expect engine vendors to pivot towards serving this niche.

But that's missing my entire point: pulling a random worst-case scenario for Vega is exactly as rooted in fact as conjuring performance increases over time extrapolated from previous generations (who am I kidding, it's always Tahiti vs Kepler), with a healthy dose of missing-feature conspiracies.

That's not going to happen until there's a market for VR that allows developers to make money. As it currently stands, most VR developers making full game experiences lose money unless they are heavily subsidized by a 3rd party. And there is no VR market that can support AAA-level VR development other than as an add-on to an existing game (like Resident Evil).

VR is extremely niche right now although there is hope that it'll take off at some point.

This means that VR related idiosyncrasies aren't going to affect the vast majority of game development, especially for AAA games. In that arena, temporal AA remains far more attractive and is where the industry is going.

Regards,
SB
 
Is there really any doubt Vega will overtake 1080ti in the future? The features to do it in the short term are there with packed math, primitive shaders, and DSBR. Longer term bindless resources, Tier3 features, GPU driven rendering, and SM6. Not to mention the historical trend of Nvidia cards aging rather badly to encourage sales. That seems to be readily evidenced with all the cache added to both Vega and Volta and Volta's inclusion of hardware scheduling at various levels.
Dream on. Stuffing a bunch of non-functioning features inside an inefficient, power-hungry package that hobbles itself at high clocks really isn't going to catch a 1080ti. And if by chance it gets close, the next dance will have already started with Volta.
Edit: Clarification
 
His post is directed to the claim of Vega 64 edging 1080Ti, your argument of Vega 56 edging 1070 in time is rather reasonable. Unless you meant the former.

Point taken. I never try to guess where something will exactly end up, but rather look at trends. So as to where Vega 56 will end up vs Pascal two years from now, I have no idea. But the 290 I bought did very well over its life; I wouldn't be surprised if Vega 56 does better than the 290 just based off the massive amount of FLOPs it has.
 
Dream on. Stuffing a bunch of non-functioning features inside an inefficient, power-hungry package that hobbles itself at high clocks really isn't going to catch a 1080ti. And if by chance it gets close, the next dance will have already started with Volta.
What dream? Multiplication is a fairly simple mathematical concept to grasp. We know current performance and what boost certain features provide. It's not difficult.

As for efficiency, you seem to disregard performance once the card is tuned. There are ample examples now showing huge drops in power with minimal performance impact without undervolting. Features that increase performance will obviously affect that as well. Unless you anticipate packed math or culling increasing power consumption past the maximum?
 