Anand has the details about R520, RV530 and RV515.

The question on H.264 support is how much of the pipeline can actually be accelerated - NVIDIA say that PureVideo is basically a programmable unit, but that still gives it a finite capacity. Can PureVideo on current parts handle as much of the H.264 pipeline as Avivo can? It'll be interesting to test the CPU utilisation of them both.
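For what it's worth, the straightforward way to run that comparison is just to sample overall CPU load while a clip plays back on each card. A minimal sketch, assuming Python with psutil installed and the player started separately (none of this comes from the thread, it's only an illustration):

import time
import psutil  # assumed available; reports system-wide CPU usage

def average_cpu_during_playback(seconds=60, interval=1.0):
    """Sample overall CPU utilisation once per interval and return the mean."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        # cpu_percent(interval=...) blocks for the interval, then returns a percentage
        samples.append(psutil.cpu_percent(interval=interval))
    return sum(samples) / len(samples)

# Start the H.264 clip in the player under test, then:
print(f"Average CPU over the run: {average_cpu_during_playback():.1f}%")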

I think the quality of the Xilleon TV Out will be interesting to see as well.
 
geo said:
Whimper!

Tho it is the bottom feeder, and ATI is famous for pulling out bits and pieces on the way down the line. Yeah, that's it. Keep the faith, baby; put on that brave face!

I'm right there with 'ya. Perhaps it's not too late for a joint offering to the silicon gods?
 
Why oh why does an X1300 need 512 MB of RAM on board? And I like how the clock speeds are mysteriously blank. :D
 
John Reynolds said:
I'm right there with 'ya. Perhaps it's not too late for a joint offering to the silicon gods?

Unfortunately, we're probably kidding ourselves tho. You saw Rys & Wavey upstream mention only one "secret" left, and they may "gloss over" it at launch. That sounds awfully software-y to me, rather than fundamental hardware changes (maybe a bit of hardware, like NV did with G70 for Transparency AA). You can't gloss over the AA options being 4x, 8x, and 12x. Well, I suppose if you were really stupid you could try, but there would be yells of "Huh? What?! Back up, back up!" all over the room. ;)
 
As a side comment, I find it unlikely that ATI would also provide a form of adaptive supersampling for this architecture. Such features almost never appear from both companies at nearly the same time. If ATI decides to support the feature, it likely won't show up until sometime next year (probably R600).
 
Slides said:
Couldn't agree more. I'm "still" using an "old" 9800Pro, mostly because I haven't seen any huge increase in image quality from either ATI or Nvidia that would justify the high cost of current video cards. Sure, I could keep bumping up the resolution and AA/AF, but my LCD limits me to 1280x1024, and after a while I can no longer see the difference between higher levels of AA/AF.

Uh... there is no way you are playing games released in the last year, or the last few demos, at anything higher than 800x600 with *mostly* turned-up settings. If you want to play at 1024x768 then I guarantee you are setting the game to medium settings.

I know, I was using a 9800 Pro for a while till I upgraded to this X850 Pro about a month ago.

The difference: low 20s at high settings, with dips into the teens, at 1024x768 (without AA) on the 9800 Pro, versus 40-60 at high settings at 1024 with AA on the X850 Pro.
 
Hellbinder said:
Uh... there is no way you are playing games released in the last year, or the last few demos, at anything higher than 800x600 with *mostly* turned-up settings. If you want to play at 1024x768 then I guarantee you are setting the game to medium settings.

I know, I was using a 9800 Pro for a while till I upgraded to this X850 Pro about a month ago.

The difference: low 20s at high settings, with dips into the teens, at 1024x768 (without AA) on the 9800 Pro, versus 40-60 at high settings at 1024 with AA on the X850 Pro.
QFT... I find 1600x1200 with AA looks *slightly* better than 1024 with no AA and maybe lower detail :cool:
With a 6600GT I play Far Cry, Riddick and Doom 3 at 1280x960 :smile:
With Far Cry I actually go for 1024 or 1152 with 4x FSAA since jaggies are so obvious, same with HL2.
 
geo said:
Unfortunately, we're probably kidding ourselves tho. You saw Rys & Wavey upstream mention only one "secret" left, and they may "gloss over" it at launch. That sounds awfully software-y to me, rather than fundamental hardware changes (maybe a bit of hardware, like NV did with G70 for Transparency AA). You can't gloss over the AA options being 4x, 8x, and 12x. Well, I suppose if you were really stupid you could try, but there would be yells of "Huh? What?! Back up, back up!" all over the room. ;)
Someone remind me why R(3/4)x0 can't support more than 6x FSAA? Can't find the post.
 
Three loops of 2 samples each on the ROPs, so 6x is the ceiling. Personally, based on nothing much at all, I figured it made more sense to up the minimum to 4 per pass rather than add more passes.

Edit: Well, and frame-buffer limits at 256MB were a bit tight-ish for 8x, as I recall Sireric saying somewhere around here (which might be what you are pointing at).
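To put a rough number on the 256MB point, here's a back-of-the-envelope sketch; the assumptions (4 bytes per colour sample, 4 bytes per depth/stencil sample, no compression) are mine, not anything Sireric said:

def msaa_buffer_mb(width, height, samples, bytes_colour=4, bytes_z=4):
    """Uncompressed multisampled colour + Z/stencil storage, in MB."""
    return width * height * samples * (bytes_colour + bytes_z) / (1024 ** 2)

for aa in (4, 6, 8):
    print(f"1600x1200 at {aa}x AA: ~{msaa_buffer_mb(1600, 1200, aa):.0f} MB")
# ~59 MB at 4x, ~88 MB at 6x, ~117 MB at 8x, before counting textures,
# front/back buffers and everything else sharing that 256MB card.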
 
Hellbinder said:
Uh... there is no way you are playing games released in the last year, or the last few demos, at anything higher than 800x600 with *mostly* turned-up settings. If you want to play at 1024x768 then I guarantee you are setting the game to medium settings.

I know, I was using a 9800 Pro for a while till I upgraded to this X850 Pro about a month ago.

The difference: low 20s at high settings, with dips into the teens, at 1024x768 (without AA) on the 9800 Pro, versus 40-60 at high settings at 1024 with AA on the X850 Pro.

Depends. If you're not looking at your Fraps counter all the time, then you can easily be fine with even a 9700 Pro at 1024x768 at mostly high settings in games like HL2 and Far Cry, and be close to this in many other games, slightly better on a 9800 Pro. I know it's possible, I did it for a very long time.

Also, I never did get how an AIB can leak info onto a website about a product like that. I don't know if I should call them stupid, or wonder if they did it on purpose.
 
geo said:
Unfortunately, we're probably kidding ourselves tho. You saw Rys & Wavey upstream mention only one "secret" left, and they may "gloss over" it at launch. That sounds awfully software-y to me, rather than fundamental hardware changes (maybe a bit of hardware, like NV did with G70 for Transparency AA). You can't gloss over the AA options being 4x, 8x, and 12x. Well, I suppose if you were really stupid you could try, but there would be yells of "Huh? What?! Back up, back up!" all over the room. ;)

Rys did seem overly enthused about it though - then came the damper from Dave....
 
Junkstyle said:
I think that single-level HDR tech demo from Valve, called Lost Coast, is going to ship at the same time ATI releases their R520s. I think Valve still owes them for taking all those millions from ATI and delivering the game late, instead of with the release of the video card. Remember the coupons?

Been saying that since March/April/May.

Remember Valve showing the first HDR screenshots on ATI hardware? That's about the time they hadn't fixed it for all SM2.0 cards yet ;)

yes.. I'm a coupon user too :D
 
geo said:
Whimper! :cry:

Tho it is the bottom feeder, and ATI is famous for pulling out bits and pieces on the way down the line. Yeah, that's it. Keep the faith, baby; put on that brave face!

A bottom feeder with 512MB? Bring on the TurboCache/HyperMemory, I say...
 
Yeah, but for the low end, vendors understand that many people still differentiate video cards by the amount of memory, when the amount of memory is quite possibly the one measure of a video card's performance that means the least. So, some people will go to the store and see 512MB for $80, and go all ga-ga over it. Never mind that they could probably have paid $80 for a 128MB card and gotten better performance.
 