Sir Eric Demers on AMD R600

Part of the myriad things we do here at Beyond3D when covering a graphics architecture is go as deep as we can with the personnel at the IHV that actually architected, concocted, designed, built and shipped the thing.


Read the full news item
 
Here's my candidate for "most surprising answer":

Does a 512-bit bus require a die size that's going to be in the neighbourhood (or bigger) of R600 going forward?

No, through multiple layers of pads, or through distributed pads or even through stacked dies, large memory bit widths are certainly possible. Certainly a certain size and a minimum number of "consumers" is required to enable this technology, but it's not required to have a large die.
 
It appears that R600's lack of performance is due to drivers. The question is how long it will take to get good performance drivers.
 
It appears that R600's lack of performance is due to drivers. The question is how long it will take to get good performance drivers.

Well, that's one question. The fact that performance is increasing says nothing one way or another about where the actual performance ceiling is, or when you'll reach it.

I've found it interesting that NV hasn't seemed to get a generic performance bump of any noticeable (to me, anyway) size seven months in on G80. But then it seems pretty clear they were further behind on Vista drivers, so they've probably had to spend more of their focus on that.

So, net-net, his perfect storm comment is entirely true, but personally I think magic drivers should be accepted gratefully when they arrive, rather than confidently expected to arrive any day now.
 
What a question! DX10 basically decided everything in the R600. We kept DX9 in mind, and certainly wanted to make a great DX9 part and did (basically, we are generally 1.5 to 2x the performance of our previous high end part in DX9)
1.5 to 2x faster?! Where? When?
 
It appears they want to pin the lack of performance on drivers. Remember the prior claims to the effect of "we're on top of the game when it comes to Vista", made in order to rag on nV's rather bumpy ride, only to come out today and say that they're pretty green (sic). Also notice how UVD makes its third different appearance, going from something low-power and fully in hardware for the lower-end parts to something implemented in the shaders for the 2900, WITH BETTER QUALITY :-? (this has never been mentioned before, AFAIR).

Sir Eric is a great guy, but I think the interview was finely combed by marketing, so I wouldn't put huge stock in the greatly optimistic claims being made. There's hope, yes, but only time will tell. And how reassuring is it for today's buyers to read that great performance is expected in the coming months and years? That's quite a window...
 
New drivers are not going to make up for the lack of texturing power compared with G80. I'm also suspicious about how much they can mitigate the performance-crash R600 experiences once you start to use AA. But it would be nice to be proved wrong.
 
It appears they want to pin the lack of performance on drivers. Remember the prior claims to the effect of "we're on top of the game when it comes to Vista", made in order to rag on nV's rather bumpy ride

He's making a clear distinction between stability and performance, so I don't think your comment is on point, frankly. NV didn't even have a G80 Vista driver when they were getting pummelled over drivers (i.e. that's what they were getting pummelled over, not performance. That whole class-action nonsense didn't say anything about performance. . . )
 
1.5 to 2x faster?! Where? When?
Many games at higher resolutions (above 1600x1200) without AA/AF (Doom 3, FEAR, LostCoast, Oblivion, Prey, Rainbow Six: Vegas, Chronicles of Riddick, Serious Sam 2, Splinter Cell 3, Tomb Raider Legend...)

New drivers are not going to make up for the lack of texturing power compared with G80. I'm also suspicious about how much they can mitigate the performance-crash R600 experiences once you start to use AA. But it would be nice to be proved wrong.
I'm not sure R600's problem is a lack of texturing power. That could be true for a few games, but without MSAA enabled, R600 is pretty close to the GTX on average. The bigger problem is the huge MSAA performance drop...
 
I found the commentary on the virtualization of chip resources interesting.
The chip does do some kind of renaming/mapping of registers in order for it to execute different threads.

I'm sure a big area of future "optimizations" is just finding ways to make sure different contexts being mapped to the hardware don't bump heads on shared resources and are overlapped to handle variable latencies better.
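
To make the "overlapped to handle variable latencies" idea concrete, here's a minimal Python sketch of the general technique. It is not R600's actual scheduler; the context count, register mapping and latency figure are all invented for illustration. Each context gets a fixed slice of a shared register file (a crude stand-in for register renaming/mapping), and a round-robin issue loop covers one context's memory latency with work from the others:

```python
from collections import deque

MEM_LATENCY = 8        # cycles a memory fetch keeps a context waiting (made-up number)
REGS_PER_CONTEXT = 16  # physical registers mapped to each context (made-up number)

class Context:
    """One in-flight thread: its own slice of the register file plus a tiny program."""
    def __init__(self, cid, base_reg, ops):
        self.cid = cid
        self.base_reg = base_reg      # "renaming": logical reg N lives at base_reg + N
        self.ops = deque(ops)         # each op is 'alu' or 'mem'
        self.ready_at = 0             # cycle when this context may issue again

def run(contexts):
    cycle, active = 0, deque(contexts)
    while active:
        issued = False
        for _ in range(len(active)):                 # round-robin over live contexts
            ctx = active[0]
            active.rotate(-1)
            if ctx.ready_at > cycle:
                continue                             # stalled on memory, try another context
            op = ctx.ops.popleft()
            if op == "mem":
                ctx.ready_at = cycle + MEM_LATENCY   # latency hidden by the other contexts
            print(f"cycle {cycle:2d}: ctx {ctx.cid} issues {op} "
                  f"(regs {ctx.base_reg}-{ctx.base_reg + REGS_PER_CONTEXT - 1})")
            if not ctx.ops:
                active.remove(ctx)                   # context finished, free its register slice
            issued = True
            break
        if not issued:
            print(f"cycle {cycle:2d}: every context stalled -- not enough work to hide latency")
        cycle += 1

if __name__ == "__main__":
    program = ["alu", "mem", "alu", "alu"]
    run([Context(i, i * REGS_PER_CONTEXT, list(program)) for i in range(4)])
```

Run it and you'll see one context's 'mem' wait filled by ALU work from the others; with only four small contexts a few stall cycles still show up, which is exactly the "don't bump heads on shared resources" balancing act.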

As for the die shot:
AARRGGHH!!
They had a picture of the metal layer?
That's like putting a burka on a hot chick.
Is it wrong that I want to see the transistor layer naked?

That's it, I issue a challenge to ATI to provide a high-res transistor layer die shot for me to ogle, or the terrorists have won.
 
It appears they want to pin the lack of performance on drivers. Remember the prior claims to the effect of "we're on top of the game when it comes to Vista", made in order to rag on nV's rather bumpy ride, only to come out today and say that they're pretty green (sic).
As Geo suggests, you do need to separate an aim for stability first from performance later.

Also notice how UVD makes its third different appearance, going from something low-power and fully in hardware for the lower-end parts to something implemented in the shaders for the 2900, WITH BETTER QUALITY :-? (this has never been mentioned before, AFAIR).

This has, actually, always been the point of view. For the most part quality is video post-processing, and the more processing you throw at it, the more quality you can extract. Given the differences between the render capabilities of R600 and the rest of the line, the expectation was that R600 would set the benchmark right off, and the question was how close the others would get.
 
Many games at higher resolutions (above 1600x1200) without AA/AF (Doom 3, FEAR, LostCoast, Oblivion, Prey, Rainbow Six: Vegas, Chronicles of Riddick, Serious Sam 2, Splinter Cell 3, Tomb Raider Legend...)
Doom 3 benefits, bigtime, from the stencil op advancements Eric talks about. Oblivion, and other of its ilk, certainly likes the HDR improvements (along with many other parts of the architecture changes).
 
Or, said another way, UVD has little to nothing to do with quality directly. Indirectly it might help a touch, only because you've offloaded that function from another device (the shader core; the CPU), and you can then ask that other device to do more on the quality end.
 
I'm not sure R600's problem is a lack of texturing power. That could be true for a few games, but without MSAA enabled, R600 is pretty close to the GTX on average. The bigger problem is the huge MSAA performance drop...
Things may have changed in the past couple of weeks, but the last benchmarks I saw suggested that R600 performance fell off a cliff as soon as you enabled any significant amount of anisotropic filtering. (By contrast 8800GTX performance drops off far less as you step up the AF level).
 
I found the commentary on the virtualization of chip resources interesting.
The chip does do some kind of renaming/mapping of registers in order for it to execute different threads.

I'm sure a big area of future "optimizations" is just finding ways to make sure different contexts being mapped to the hardware don't bump heads on shared resources and are overlapped to handle variable latencies better.

As for the die shot:
AARRGGHH!!
They had a picture of the metal layer?
That's like putting a burka on a hot chick.
Is it wrong that I want to see the transistor layer naked?

That's it, I issue a challenge to ATI to provide a high-res transistor layer die shot for me to ogle, or the terrorists have won.

Oi we'll have none of your smut here :D

This whole chain of posts made me sincerely laugh out loud....
...then I thought about it, and I'd like to see it naked too! :LOL:

I'm not sure if that is funny, sad or somehow the workings of several twisted minds :devilish:

We demand naked shots!
 
Things may have changed in the past couple of weeks, but the last benchmarks I saw suggested that R600 performance fell off a cliff as soon as you enabled any significant amount of anisotropic filtering. (By contrast 8800GTX performance drops off far less as you step up the AF level).
Even R580 was many times slower than G71 when it came to AF-only benchmarks. But its lower MSAA performance drop erased this insufficiency and placed R580 higher. R600's AF performance drop is similar to R580's AF performance drop, or only slightly higher...
 
It's not that R600 is worse than R580 on filtering, it's that G80 is hugely more performant than G7x on filtering. You're basically getting that first aniso level for free on G80, and if you go back and look at the older benchmarks that's where the big performance hit typically was.
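
For anyone wanting to sanity-check these "performance drop" comparisons themselves: the drop is just the relative fps lost when the feature is enabled. A quick sketch with purely hypothetical fps numbers (none of these are real R600/G80 benchmark results), only to show the arithmetic:

```python
# Purely hypothetical fps figures, just to show how a "performance drop" is
# computed; none of these numbers come from real benchmarks.

def drop(base_fps, feature_fps):
    """Fraction of performance lost when a feature (AF, MSAA, ...) is turned on."""
    return (base_fps - feature_fps) / base_fps

hypothetical = {
    "card A, 16x AF": (80.0, 72.0),   # (fps without feature, fps with feature)
    "card B, 16x AF": (78.0, 52.0),
    "card B, 4x MSAA": (78.0, 40.0),
}

for label, (base, with_feature) in hypothetical.items():
    print(f"{label}: {drop(base, with_feature):.0%} drop")
```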
 
As Geo suggests, you do need to separate an aim for stability first from performance later.


Hi Dave. Sadly, stability is not there, at least with regard to Vista32 and Crossfire functionality. From starting fullscreen apps and getting just a black screen (requiring a reboot), to disabling Crossfire leading to a memory leak in CCC (again requiring a restart, and only being able to restart properly if you have a CPU that can cope with the leak (think 3.4GHz Core 2 and up) and still run the OS), to numerous applications being unable to use Crossfire, to slave cards running at "full-screen 3D speeds" ALL THE TIME, I can hardly call any driver released for Vista and the new product line stable.

I don't want to rain on your parade, but this absolutely sucks. I'll leave the video-processing bit out for now... still not working, so I'll make no judgements until you guys can pull your heads out of the sand and start being honest with consumers. :devilish:
 