AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks: 1 vote (0.6%)
  • Within a month: 5 votes (3.2%)
  • Within a couple of months: 28 votes (18.1%)
  • Very late this year: 52 votes (33.5%)
  • Not until next year: 69 votes (44.5%)

  • Total voters: 155
  • Poll closed.
With regards to Vector Adaptive Deinterlacing, I'm told that it does work on the 5570; however, it's on a later driver than reviewers had. It is also available on the 5450, but not with the "Enforce Smooth Video Playback" option enabled, and some of the other post-processing features may need to be traded off.
 
I still don't understand how we have enough data on anything DX11-based to say that the 5570 will have it only as a 'checkbox' feature. Last I recall, Dirt 2 ran faster in DX11 mode than it did in either DX9 or DX10 mode. How would a device have 'checkbox only' capabilities in DX11 when it would likely be faster than the DX9 and DX10 renderpaths doing the same things?
 
I still don't understand how we have enough data on anything DX11-based to say that the 5570 will have it only as a 'checkbox' feature. Last I recall, Dirt 2 ran faster in DX11 mode than it did in either DX9 or DX10 mode. How would a device have 'checkbox only' capabilities in DX11 when it would likely be faster than the DX9 and DX10 renderpaths doing the same things?

Of course Dirt 2 runs faster in DirectX 11 than DirectX 10, seeing as there is no DX10 option, IIRC; even at 1 FPS it's faster than the 0 FPS that DX10 yields ;-) I haven't seen any new (last week or so) Dirt 2 benchmarks with DX11 vs DX9, but I seem to recall that DX9 was faster than DX11 (THG Dirt 2 DX11 vs DX9), on average at least 30% faster in DX9 than in DX11. Of course, things may have changed in the last month with updated drivers and such. The results seem to mirror what is at pcgameshardware - dirt 2 direct x 11 vs direct x 9 benchmarks as well. Of course this is to be expected, given that DX9 has had years to mature, devs have become increasingly comfortable working with Direct3D 9.x, and DX11 adds increased visuals (at a significant penalty).
 
I still don't understand how we have enough data on anything DX11-based to say that the 5570 will have it only as a 'checkbox' feature. Last I recall, Dirt 2 ran faster in DX11 mode than it did in either DX9 or DX10 mode. How would a device have 'checkbox only' capabilities in DX11 when it would likely be faster than the DX9 and DX10 renderpaths doing the same things?

ATM there are only DX11 ports that show how beneficial CS can be, and even then, it's mainly at the highest SSAO setting, although it would benefit lower-quality SSAO even more.

With about a 40% improvement from CS5 SSAO over the same effect using PS4 in STALKER, that's a big improvement for Redwood and Juniper, the latter being faster than the quite similar RV790 by a fair margin and the former probably coming close to RV740.
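To illustrate why a compute shader SSAO pass can beat the same effect in a pixel shader, here's a back-of-the-envelope sketch; the tile size, kernel radius, and sample count are my own illustrative assumptions, not STALKER's actual values. The point is that a CS thread group can stage its depth neighbourhood in shared memory once, so per-pixel sampling hits on-chip memory instead of each pixel issuing its own texture fetches:

```python
# Back-of-envelope: off-chip depth-fetch traffic for SSAO, pixel shader
# vs. compute shader. All constants are illustrative assumptions.

TILE = 16      # thread group covers a TILE x TILE pixel tile
RADIUS = 8     # SSAO sampling kernel radius, in pixels
SAMPLES = 16   # depth samples taken per pixel

# Pixel shader: every pixel issues its own texture fetches.
ps_fetches_per_pixel = SAMPLES

# Compute shader: the group loads its tile plus an apron of RADIUS
# pixels into shared memory once; per-pixel samples are then served
# on-chip, so the off-chip loads amortize over the whole tile.
apron = TILE + 2 * RADIUS
cs_fetches_per_pixel = apron * apron / (TILE * TILE)

print(f"PS fetches per pixel: {ps_fetches_per_pixel}")      # 16
print(f"CS fetches per pixel: {cs_fetches_per_pixel:.1f}")  # 4.0
```

With these made-up numbers the CS path cuts off-chip fetch traffic by roughly 4x; the real gain depends on the actual kernel and on how bandwidth-bound the pass is.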

Cedar is another story, as it really lacks shading muscle.
 
No, but I'm sure you'll write an article on it. I'm also fairly certain that they'll introduce a new socket for the integrated-graphics parts and a new socket for the non-integrated parts.

I have seen bits about the new socket, but since I don't have anything hard or detailed, this is from hazy memory. :) I think it is called AM3R2 or something close to it. I am pretty sure that the main changes are to power delivery rather than any data pins. I *THINK* that AM3R2 is backwards compatible with AM3, but I can't recall where I heard that, so it may very well be wrong.

If you think about when AMD started talking about 'Fusion', it was quite a while ago; I think it was during the S939 days. It was supposed to be implemented much earlier than now, so it would not surprise me in the least if AM2/2+/3 had pins reserved for graphics.

Also, since Llano won't be a major update to the K10h cores, and the clocks won't go WAY up (10%?), memory bandwidth for the cores' use should be more than satisfied by DDR3-1066 for any low- to mid-range parts. I have seen slides saying up to DDR3-1600, so that should provide more bandwidth to an on-die GPU than the discrete parts have had to date.
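As a rough sanity check on that bandwidth point, a minimal sketch, assuming a dual-channel (2 x 64-bit) DDR3 interface on the socket and a typical single 64-bit DDR3 bus on an HD 5450-class discrete board; keep in mind the on-die GPU would share this with the CPU cores:

```python
# Peak memory bandwidth, back of the envelope.

def peak_gbs(transfers_mt_s, bus_bytes):
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bytes)."""
    return transfers_mt_s * bus_bytes / 1000.0

print(f"Dual-channel DDR3-1066: {peak_gbs(1066, 16):.1f} GB/s")  # ~17.1
print(f"Dual-channel DDR3-1600: {peak_gbs(1600, 16):.1f} GB/s")  # ~25.6
print(f"64-bit DDR3-1600 board: {peak_gbs(1600, 8):.1f} GB/s")   # ~12.8
```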

This is the long way of saying they very well might not need to change the socket.

-Charlie
 
Of course Dirt 2 runs faster in DirectX 11 than DirectX 10, seeing as there is no DX10 option, IIRC; even at 1 FPS it's faster than the 0 FPS that DX10 yields ;-) I haven't seen any new (last week or so) Dirt 2 benchmarks with DX11 vs DX9, but I seem to recall that DX9 was faster than DX11 (THG Dirt 2 DX11 vs DX9), on average at least 30% faster in DX9 than in DX11. Of course, things may have changed in the last month with updated drivers and such. The results seem to mirror what is at pcgameshardware - dirt 2 direct x 11 vs direct x 9 benchmarks as well. Of course this is to be expected, given that DX9 has had years to mature, devs have become increasingly comfortable working with Direct3D 9.x, and DX11 adds increased visuals (at a significant penalty).

You're not comparing DX9 to DX11, you're comparing different quality settings. Unless you can control all the fidelity settings, using DiRT2 as a benchmark to show how "slow" DX11 is, is nuts. Unless someone comes up with a benchmark that just enables a DX11 codepath and not a slew of other features (tessellation, SSAO), it's simply invalid to say that DX11 is slow and DX9 (or 10, 10.1) is not.
 
You're not comparing DX9 to DX11, you're comparing different quality settings. Unless you can control all the fidelity settings, using DiRT2 as a benchmark to show how "slow" DX11 is, is nuts. Unless someone comes up with a benchmark that just enables a DX11 codepath and not a slew of other features (tessellation, SSAO), it's simply invalid to say that DX11 is slow and DX9 (or 10, 10.1) is not.

I think the major point is whether the low-end 5450 and 5570 can actually play the game with DX11 turned on. If it can't play DX11 games with DX11 turned on, then what's the point? It's just a checkbox feature.


Now, with that said, I find nothing wrong with that, as most cards at this price point, throughout the history of such cards, couldn't play the newest DX games with the features turned on.

I'm more concerned with performance in DX9/10 games, which seems to be subpar.

I don't understand where any of these sub-5670 cards are meant to fit. And I say this because I understand that the 5670 is priced at about $85-100:

http://www.newegg.com/Product/Produ...4&cm_re=radeon_hd_5670-_-14-131-334-_-Product

The 512MB version is $85 after rebate, and the 1GB version shown below is $100:

http://www.newegg.com/Product/Produ...7&cm_re=radeon_hd_5670-_-14-127-477-_-Product

So what price point do these enter at, and what price point does the 5450 enter at?

The only people I can see buying these are those that need a low-profile card, and even then I think we will see low-profile 5670s.
 
You're not comparing DX9 to DX11, you're comparing different quality settings. Unless you can control all the fidelity settings, using DiRT2 as a benchmark to show how "slow" DX11 is, is nuts. Unless someone comes up with a benchmark that just enables a DX11 codepath and not a slew of other features (tessellation, SSAO), it's simply invalid to say that DX11 is slow and DX9 (or 10, 10.1) is not.

The first DX11 GPUs will be "slow" with the real DX11 games of the less foreseeable future, but that's mostly because it'll take several years until those games appear. What we have today, IMO, is mostly games with a DX9 "backbone" and additional paths (10, 10.1, 11) added to them.

If an ISV were to release a "full" DX11 game any time soon, I doubt anyone would have the feeling that DX11 is "slow".

It wasn't any different in the past either. Take Crysis in DX9 and try to run it on a 9700 PRO, for example. It will run, of course, but you'll need to make quite a few sacrifices. When the first-ever DX9 GPU was released, all we saw for a very long time were games with a DX7/8 backbone and some DX9 features/effects added to the mix.

It has absolutely nothing to do with how "mature" an API is (if that even makes sense, heh...); it's more that ISVs code games with the lowest common denominator in mind. If an ISV were to release a pure DX11 game with no DX9 fallback mode now, it would narrow the game's selling potential tremendously. I often have the feeling that folks tend to forget how things evolve in the graphics market; hardware has to be several steps ahead of software. That "minor" detail isn't ever going to change.
 
The first DX11 GPUs will be "slow" with the real DX11 games of the less foreseeable future, but that's mostly because it'll take several years until those games appear. What we have today, IMO, is mostly games with a DX9 "backbone" and additional paths (10, 10.1, 11) added to them.

If an ISV were to release a "full" DX11 game any time soon, I doubt anyone would have the feeling that DX11 is "slow".

It wasn't any different in the past either. Take Crysis in DX9 and try to run it on a 9700 PRO, for example. It will run, of course, but you'll need to make quite a few sacrifices. When the first-ever DX9 GPU was released, all we saw for a very long time were games with a DX7/8 backbone and some DX9 features/effects added to the mix.

It has absolutely nothing to do with how "mature" an API is (if that even makes sense, heh...); it's more that ISVs code games with the lowest common denominator in mind. If an ISV were to release a pure DX11 game with no DX9 fallback mode now, it would narrow the game's selling potential tremendously. I often have the feeling that folks tend to forget how things evolve in the graphics market; hardware has to be several steps ahead of software. That "minor" detail isn't ever going to change.

If someone made a DX11-only game and used adaptive tessellation and DirectCompute post-processing in such a way that it ran at playable framerates on a wide range of cards, then maybe no one would be complaining about tessellation framerates on/off, or DX9/DX10 vs. DX11.

But of course, AAA games now take almost 2-3 years to finish, given the higher levels of visuals and expectations (and in the future even more, with better visuals). Games that have DX11 now, or will in the near future, all use DX11 as just an additional path for more effects and better graphics (so just more slowdowns for DX11).
 
If someone made a DX11-only game and used adaptive tessellation and DirectCompute post-processing in such a way that it ran at playable framerates on a wide range of cards, then maybe no one would be complaining about tessellation framerates on/off, or DX9/DX10 vs. DX11.

But of course, AAA games now take almost 2-3 years to finish, given the higher levels of visuals and expectations (and in the future even more, with better visuals). Games that have DX11 now, or will in the near future, all use DX11 as just an additional path for more effects and better graphics (so just more slowdowns for DX11).

Which recent AAA title that you're aware of took only 2-3 years of total development time? Most of them don't even have a DX11 path.
 
Which recent AAA title that you're aware of took only 2-3 years of total development time? Most of them don't even have a DX11 path.

Now that's the reason why only 4-6 hour, linear, half-complete FPS and action games are coming out. No one dares to make massive open-world games like Gothic 3 (good old PC-only), Oblivion, or Fallout 3. Bethesda started work on Fallout 3 in 2004. And I won't even mention Blizzard games; they take a mighty long time in development, but at least those games work 100%.
 
I think the major point is whether the low-end 5450 and 5570 can actually play the game with DX11 turned on. If it can't play DX11 games with DX11 turned on, then what's the point? It's just a checkbox feature.
Remember, CS can be used to offload CPU work outside of games too.

Cedar is bad at 3D rendering, but its raw computational throughput is already quite high within a moderate power budget.

Compared to a 3GHz quad's FPUs, the HD 5450 already has about the same throughput, which means it could be a good candidate for running massively parallel code on the cheap (its price is nowhere near that of a 3GHz quad).
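A quick peak-FLOPS sanity check of that claim, as a minimal sketch: I'm assuming Cedar's 80 ALUs each retire one single-precision multiply-add (2 flops) per clock at 650MHz, and that the quad retires a 4-wide SSE multiply plus a 4-wide SSE add (8 flops) per core per clock at 3GHz. Theoretical peaks only; real code won't hit either:

```python
# Peak single-precision throughput, back of the envelope.

cedar_gflops = 80 * 2 * 0.650  # ALUs * flops/clock (MAD) * GHz
quad_gflops = 4 * 8 * 3.0      # cores * flops/clock (SSE mul + add) * GHz

print(f"HD 5450 (Cedar): {cedar_gflops:.0f} GFLOPS")  # ~104
print(f"3GHz quad-core:  {quad_gflops:.0f} GFLOPS")   # 96
```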

The only problem I see here is the price, as Redwood is almost as cheap and five times faster. I really think it was a bad idea to design Cedar, but it's still somewhat usable; the problem lies in the price grid.
 
So what price point do these enter at, and what price point does the 5450 enter at?

You can already get 5450s at Newegg for just over 40 USD. I'm just waiting until they have free shipping before picking one up.

I still wish there was a 256 meg version. 512 is just a waste of memory chips, IMO.

Regards,
SB
 
You can already get 5450s at Newegg for just over 40 USD. I'm just waiting until they have free shipping before picking one up.

I still wish there was a 256 meg version. 512 is just a waste of memory chips, IMO.

Regards,
SB

So what do you use the 5450 for such that the extra $45 isn't worth spending? What's more, where does the 5570 fit price-wise?
 
The first DX11 GPUs will be "slow" with the real DX11 games of the less foreseeable future, but that's mostly because it'll take several years until those games appear. What we have today, IMO, is mostly games with a DX9 "backbone" and additional paths (10, 10.1, 11) added to them.

If an ISV were to release a "full" DX11 game any time soon, I doubt anyone would have the feeling that DX11 is "slow".

It wasn't any different in the past either. Take Crysis in DX9 and try to run it on a 9700 PRO, for example. It will run, of course, but you'll need to make quite a few sacrifices. When the first-ever DX9 GPU was released, all we saw for a very long time were games with a DX7/8 backbone and some DX9 features/effects added to the mix.

It has absolutely nothing to do with how "mature" an API is (if that even makes sense, heh...); it's more that ISVs code games with the lowest common denominator in mind. If an ISV were to release a pure DX11 game with no DX9 fallback mode now, it would narrow the game's selling potential tremendously. I often have the feeling that folks tend to forget how things evolve in the graphics market; hardware has to be several steps ahead of software. That "minor" detail isn't ever going to change.

Is that really fair, though? The 9700 PRO came out in Q3 2002; Crysis released in Q4 2007. That's a bit different: playing high-end games over 5 years after the part came out vs. playing games already out, or coming within the first 6 months of a part's life.

I used the 9700 PRO for a lot of DX9 games; EQ2, Far Cry, and others all ran well on it.


I'm sure the 9700 PRO will run Crysis much better than a Radeon 5570 will run the PC's premier showcase game in 5 years.
 
Is that really fair, though? The 9700 PRO came out in Q3 2002; Crysis released in Q4 2007. That's a bit different: playing high-end games over 5 years after the part came out vs. playing games already out, or coming within the first 6 months of a part's life.

I used the 9700 PRO for a lot of DX9 games; EQ2, Far Cry, and others all ran well on it.

I'm sure the 9700 PRO will run Crysis much better than a Radeon 5570 will run the PC's premier showcase game in 5 years.

If you bother, for a change, to read the entire post, then yes, it is quite fairly worded; and to save you some time, re-read the following:

The first DX11 GPUs will be "slow" with the real DX11 games of the less foreseeable future, but that's mostly because it'll take several years until those games appear.
 
If you bother, for a change, to read the entire post, then yes, it is quite fairly worded; and to save you some time, re-read the following:

Maybe because I fail to understand what a DX11 game is to you. Doesn't using DX11 features make it a DX11 game?

Was Far Cry a DX9 game?


Have we even seen a DX10 game, according to you, yet?
 
Maybe because I fail to understand what a DX11 game is to you. Doesn't using DX11 features make it a DX11 game?

No, it doesn't, at least not to me. It's still a DX9 game with a DX11 path added.

Was Far Cry a DX9 game?

At least not as much of a DX9 game as Crysis is.

Have we even seen a DX10 game, according to you, yet?

DX9 games with DX10 paths, yes. Any other questions?
 
We've already had one case of a game implementing a technique in both DX10 and DX11 where the DX11 path is considerably faster, and that was Battleforge.

As for the notion that "true DXxx games won't come until x years after release", I think that's becoming increasingly meaningless, with the capabilities of the API blurring somewhat between generations and with down-level support in DX.
 
We've already had one case of a game implementing a technique in both DX10 and DX11 where the DX11 path is considerably faster, and that was Battleforge.

Since I can feel how talented some of the developers of that one are, it never came as a surprise.
 