Anand has the details about R520, RV530, RV515

Ailuros said:
I'll be the first one to stand up and applaud if it delivers higher image quality on every front, but with similar performance to the competition. Frankly I often don't understand that fixation many have with performance; the G70 isn't exactly "slow", rather the contrary.
Couldn't agree more. I'm "still" using an "old" 9800Pro, mostly because I have not seen any huge increase in image quality from both ATI or Nvidia that would justify the high cost of current video cards. Sure I could keep bumping up the resolution and AA/AF, but my LCD limits me to 1280x1024 and after a while I no longer can see the difference between higher levels of AA/AF.
 
Ailuros said:
I'll be the first one to stand up and applaud if it delivers higher image quality on every front, but with similar performance to the competition. Frankly I often don't understand that fixation many have with performance; the G70 isn't exactly "slow", rather the contrary.
Well, once game developers start to get their hands on some game engines with shader writing interfaces that don't need programming, I expect we'll have all of that nice fillrate used up pretty dramatically to push stunning visuals at merely medium-high resolution on the fastest of hardware.
 
Slides said:
Couldn't agree more. I'm "still" using an "old" 9800Pro, mostly because I have not seen any huge increase in image quality from both ATI or Nvidia that would justify the high cost of current video cards. Sure I could keep bumping up the resolution and AA/AF, but my LCD limits me to 1280x1024 and after a while I no longer can see the difference between higher levels of AA/AF.

Besides improvements in AA and AF, what can an Nvidia or ATI do? Everything is going to be going through DirectX 9-10 and SM3.0. It doesn't give a company much elbow room to have some cool API call that only they do. Hail Microsoft and their now all but final monopoly on 3d graphics. R.I.P. OGL. </sarcasm>
 
Chalnoth said:
After playing some FarCry with HDR enabled, I have a hard time understanding this position. I mean, I've tried to go back and play the game without HDR. I just can't do it. The difference is so dramatic. Edge aliasing is a comparatively minor issue that I sometimes notice when the action slows down. But the difference between HDR and not just permeates the entire game.

Would you accept going back to bilinear filtering for something like HDR? I wouldn't. Too many years training (for lack of a better word) my eyes to appreciate the image quality the feature brings to most games. Same with good anti-aliasing.
 
http://www.theinquirer.net/?article=26276

[oldfogey=on]

Is Theo that young that he thinks GF3 was the first time? I can't be the only one that remembers Rage128 had been crowned "performance champ" for about a week when the original Detonators hit the street... followed shortly by TNT2 and "eat my dust" for several years.

[oldfogey=off]

Here's an interesting point from the article to pick at, tho -- will we see any dual-core testing in R520 reviews? How good will NV's dual-core optimizations be? How long until dual-core starts to pass the tippity-top single-core graphics scores?

Edit: Now decrying the right pixel-stained wretch.
 
Junkstyle said:
Hail Microsoft and their now all but final monopoly on 3d graphics. R.I.P. OGL. </sarcasm>
Heh, as if you won't have a nearly identical interface in OpenGL. You can be sure, for instance, that nVidia will release DX10-like extensions for their hardware once they get DX10-level hardware out. The only question is how long it will take the ARB to catch up....we all know that ATI will want to go the ARB route (assuming they follow past history), so they'll be pushing for an extension as quickly as possible.

The OpenGL version of the shaders will likely be put forward as an extension (or set of extensions) to the existing GLSL language when it makes it through the ARB.
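
Tangent for anyone curious how apps actually pick that up: the usual dance is to scan the extension string for the vendor path and fall back to the ARB one. A minimal C++ sketch below -- the "GL_NV_dx10_shaders" name is invented purely for illustration (no such extension has been announced), and in a real app the string would come from glGetString(GL_EXTENSIONS) once a context exists.

[code]
// Minimal sketch: whole-token matching against the GL_EXTENSIONS string,
// the way an app would detect a vendor path before falling back to the ARB one.
// The "GL_NV_dx10_shaders" name below is made up for illustration.
#include <cstring>
#include <cstdio>

bool hasExtension(const char* extList, const char* name)
{
    const size_t len = std::strlen(name);
    const char* p = extList;
    while ((p = std::strstr(p, name)) != nullptr) {
        // Accept only whole tokens (extension names are space-separated).
        const bool startOk = (p == extList) || (p[-1] == ' ');
        const bool endOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startOk && endOk)
            return true;
        p += len;
    }
    return false;
}

int main()
{
    // In a real app this string comes from glGetString(GL_EXTENSIONS) after a
    // context is created; hard-coded here so the sketch runs standalone.
    const char* ext = "GL_ARB_shading_language_100 GL_ARB_multitexture GL_NV_dx10_shaders";

    if (hasExtension(ext, "GL_NV_dx10_shaders"))             // hypothetical vendor path
        std::printf("using vendor extension\n");
    else if (hasExtension(ext, "GL_ARB_shading_language_100")) // plain GLSL fallback
        std::printf("falling back to plain GLSL\n");
    return 0;
}
[/code]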
 
John Reynolds said:
Would you accept going back to bilinear filtering for something like HDR? I wouldn't. Too many years training (for lack of a better word) my eyes to appreciate the image quality the feature brings to most games. Same with good anti-aliasing.
Ah, but we already have! Normal texture filtering doesn't work properly with bump mapping, and so we get tons of aliasing on regular bump-mapped surfaces in games (Doom3 has a number of these surfaces).

And what about alpha-tested surfaces? Until very recently there's been no AA support for those, in most games.
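
To put a toy number on the bump-mapping point: the filtering hardware averages normals like any other texels, but the specular response of the averaged normal is nowhere near the average of the per-texel responses, and that mismatch is where the sparkle comes from. A rough C++ sketch with made-up normals:

[code]
// Tiny numeric illustration of why ordinary texture filtering misbehaves on
// normal maps: averaging two bumpy normals (as a mip level or bilinear fetch
// does) and renormalizing gives a flatter normal, so the specular response of
// the filtered texel is nothing like the average response of the original
// texels. Numbers are made up for illustration.
#include <cmath>
#include <cstdio>

struct V3 { double x, y, z; };

V3 normalize(V3 v) {
    double l = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/l, v.y/l, v.z/l };
}

double specular(V3 n, V3 h, double power) {
    double d = n.x*h.x + n.y*h.y + n.z*h.z;
    return std::pow(d > 0.0 ? d : 0.0, power);
}

int main() {
    V3 h  = { 0.0, 0.0, 1.0 };                  // half vector straight on
    V3 n1 = normalize({  0.6, 0.0, 0.8 });      // two texels of a bumpy surface
    V3 n2 = normalize({ -0.6, 0.0, 0.8 });
    V3 nAvg = normalize({ (n1.x+n2.x)/2, (n1.y+n2.y)/2, (n1.z+n2.z)/2 });

    double power = 64.0;
    double correct  = 0.5 * (specular(n1, h, power) + specular(n2, h, power));
    double filtered = specular(nAvg, h, power);

    std::printf("average of speculars: %f\n", correct);   // what the eye expects
    std::printf("specular of average:  %f\n", filtered);  // what filtering delivers
    return 0;
}
[/code]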
 
geo said:
... Is Faud that young that he thinks GF3 was the first time? ...
It's Theo, not Fudo, you old fogey! :p I became suspicious when he threw in the WC3 reference, so I checked the byline. Fuad doesn't go in for that sort of thing, AFAIK.

Anyway, Det80 sounds good. If I owned a 7800, an X2, and an HDTV, I'd be (even more) psyched. :)
 
Skrying said:
Hmm, thinking about how ATi is releasing info about AVIVO right now, it makes me wonder if R520 is really full of other interesting bits to the point that ATi needed to separate some of it.

That's what I was thinking. The features in the AVIVO technology are definitely impressive. :cool:
 
I'm going to try something different and not quote what I'm replying to. Don't panic.

... didn't last long... ;) wish more people would do the same.

Avivo seems to be worth the painful wait I have endured until the H.264 mystery was solved.

The Avivo flash demo was a little underwhelming, to say the least. Looks like someone needs to wipe the sleep out of their eyes on the left???
 
Chalnoth said:
Well, once game developers start to get their hands on some game engines with shader writing interfaces that don't need programming, I expect we'll have all of that nice fillrate used up pretty dramatically to push stunning visuals at merely medium-high resolution on the fastest of hardware.

Where have I heard something similar over and over again in the past years? Try to explain to me first how on God's green earth games in 2004/5 can still have an ungodly amount of alpha tests, while the majority of GPUs out there have been "fine-tuned" for multisampling AA.

I'm willing to invest a fair amount of hope, but I won't get rid of the feeling that IHVs will again have to compensate with additional features like Transparency AA for the aforementioned issue (until, of course, I see it).
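
Since the alpha-test complaint keeps coming up, here's a back-of-the-envelope model of why MSAA leaves those edges jagged and what an alpha-to-coverage style Transparency AA buys back. Strictly a toy model with made-up alpha values, not how any particular IHV actually implements it:

[code]
// Rough model of why MSAA does nothing for alpha-tested foliage and what
// alpha-to-coverage buys back. With plain alpha test the single shaded sample
// either passes or fails, so all N coverage samples get the same binary
// result; with alpha-to-coverage the alpha value is turned into a proportional
// number of lit subsamples, so the resolved pixel gets an intermediate value
// at the leaf edge. Alpha values below are invented for illustration.
#include <cstdio>

// Resolved pixel value when 'covered' of 'samples' subsamples hold the texel colour.
double resolve(int covered, int samples, double texel, double background) {
    return (covered * texel + (samples - covered) * background) / samples;
}

int main() {
    const int samples = 4;          // 4x MSAA
    const double texel = 1.0;       // leaf colour
    const double background = 0.0;  // sky colour
    const double threshold = 0.5;   // typical alpha-test reference value

    double alphas[] = { 0.9, 0.6, 0.4, 0.1 };   // walking across a soft leaf edge
    for (double a : alphas) {
        // Alpha test: one shader invocation, binary outcome for the whole pixel.
        int alphaTestCovered = (a > threshold) ? samples : 0;
        // Alpha-to-coverage: alpha dithered into a subsample count.
        int a2cCovered = (int)(a * samples + 0.5);

        std::printf("alpha %.1f  alpha-test pixel %.2f   alpha-to-coverage pixel %.2f\n",
                    a, resolve(alphaTestCovered, samples, texel, background),
                       resolve(a2cCovered, samples, texel, background));
    }
    return 0;
}
[/code]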
 
Chalnoth said:
Ah, but we already have! Normal texture filtering doesn't work properly with bump mapping, and so we get tons of aliasing on regular bump-mapped surfaces in games (Doom3 has a number of these surfaces).

And what about alpha-tested surfaces? Until very recently there's been no AA support for those, in most games.

Bingo ;) In accordance with my former post: will ISVs cater for shader AA, or will IHVs just have to invent a more advanced form of adaptive AA that also antialiases shaders?
 
Could someone explain in a few words what the supposed advantages of Avivo over PureVideo should be? Or is it ~ the same in red? (nV also claims H.264 acceleration in GF7, though it's not yet 100% confirmed.)
 
http://www.club-3d.com/productshow_vga.php?gpu_brand=&ordercode=CGAX-P132VDD&show=PCX&p=&filter=

x1300pro from club3d

Radeon X1300Pro 512MB

Performance:
• MHz ATI™ R515Pro GPU
• MHz 128BIT memory
• 512MB GDDR3 memory
• 400MHz RAMDAC
• DirectX® pixel-pipelines
• x16 lane PCI Express™


Features:
• DirectX® 9.x compliant
• Shader Model 3 support
• High Dimension Floating-Point Textures
• Multiple Render Targets
• Up to 16x Anisotropic Filtering
• Up to 6x Multi Sampling Anti Aliasing
• Up to 12x Temporal Multi Sampling Anti Aliasing
• Full Screen Anti Aliasing Gamma Correction
• Avivo™
• Video Gamma Correction
• 3Dc™ Normal Map Compression
• Video in, Video Out
• Dual DVI

Nothing too interesting, though.
 
Avivo's a bit more than what PureVideo (unless NV expand the meaning) is, but here's the overview I think you want.

Encode as well as decode support for H.264 AVC (and others: VC-1, MPEG-4 SP/ASP, H.262, etc.) is the big one. GeForce will only support decode acceleration (announced, anyway, and supposedly present in a leaked driver, but there's no DXVA interface currently and you might need NVDVD to see it regardless of that update to DirectX being made).

10bpc display pipeline (8-bit in GeForce), dual dual-link TMDS in Avivo-capable parts (only a single dual-link TMDS on the GPU with G7x).
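
On the 10bpc point, a quick toy calculation of how many distinct steps survive across a subtle dark gradient (the kind that bands in skies) at 8 vs 10 bits per channel -- the gradient range is arbitrary:

[code]
// Back-of-the-envelope on the 10-bit-per-channel point: quantize a subtle dark
// gradient to 8 bits vs 10 bits per channel and count how many distinct codes
// survive. Fewer steps over the same span means visible banding. The gradient
// range is arbitrary, chosen only for illustration.
#include <cmath>
#include <cstdio>
#include <set>

int countSteps(double lo, double hi, int bits, int samples = 1000) {
    const int maxCode = (1 << bits) - 1;
    std::set<int> codes;
    for (int i = 0; i < samples; ++i) {
        double v = lo + (hi - lo) * i / (samples - 1);
        codes.insert((int)std::lround(v * maxCode));   // quantize to 'bits' precision
    }
    return (int)codes.size();
}

int main() {
    // A gentle dark-grey gradient, the kind that bands in skies and fades.
    double lo = 0.10, hi = 0.14;
    std::printf(" 8 bpc: %d distinct levels across the gradient\n", countSteps(lo, hi, 8));
    std::printf("10 bpc: %d distinct levels across the gradient\n", countSteps(lo, hi, 10));
    return 0;
}
[/code]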
 
Chalnoth said:
Ah, but we already have! Normal texture filtering doesn't work properly with bump mapping, and so we get tons of aliasing on regular bump-mapped surfaces in games (Doom3 has a number of these surfaces).

And what about alpha-tested surfaces? Until very recently there's been no AA support for those, in most games.

Those are limited portions of the screen. I know you could argue that edge AA is also a limited portion of what's rendered, but since you suggested no AA whatsoever for HDR, I was merely suggesting no better form of texture filtering than bilinear whatsoever.

P.S. I wasn't that impressed with HDR in Far Cry or Chaos Theory. :cool:
 
Ailuros said:
I'll be the first one to stand up and applaud if it delivers higher image quality on every front, but with similar performance to the competition. Frankly I often don't understand that fixation many have with performance; the G70 isn't exactly "slow", rather the contrary.

FEAR would like a word with you - I'd go as far as to call the performance there unacceptable, especially at higher resolutions.
 