AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within couple months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters
    155
  • Poll closed.
If it isn't the drivers, could it be game engine limitations? I mean is it possible for a game engine to not be able to utilise all processing cores of a chip and if not, why not?
There's a couple of generalisations you can start from:

  • Games that are fillrate bound will favor the 5850 less; games that use shaders more will place more onus on the shader core / math power and hence favor the 5850 more.
  • As games move up the DX levels, they tend to use shaders more.
They are generalisations, but there is truth to them. Of course, other things will get in the way on an individual basis (CPU limitations, other bottlenecks, etc.). Similar to some of the threads around here saying "wow, look at the X1900 performance" when reviews of modern games turn up with X1900 and 7800 GTX results in them.
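
Those two generalisations can be turned into a toy model. This is purely my own illustration, not anything from this thread: the hardware ratios below are made-up placeholders, not real 5850 / GTX 260 specs.

```python
# Toy model: estimate the relative speedup of a shader-heavy card over
# a fillrate-heavy one, given what fraction of frame time a game spends
# shader-bound.  shader_ratio / fillrate_ratio are invented numbers.

def relative_speedup(shader_fraction, shader_ratio=2.0, fillrate_ratio=1.2):
    """Weighted-harmonic-mean estimate: shader_fraction is the share of
    frame time spent shader-bound; the rest is assumed fillrate-bound."""
    fill_fraction = 1.0 - shader_fraction
    # Time on card B relative to card A, per unit of work on card A:
    time_b = shader_fraction / shader_ratio + fill_fraction / fillrate_ratio
    return 1.0 / time_b

# A mostly fillrate-bound game benefits less than a shader-heavy one:
assert relative_speedup(0.2) < relative_speedup(0.8)
```

The harmonic weighting captures why a card with a big shader advantage only shows it in games that actually spend their frame time in the shaders.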
 
There's a couple of generalisations you can start from:

  • Games that are fillrate bound will favor the 5850 less; games that use shaders more will place more onus on the shader core / math power and hence favor the 5850 more.
  • As games move up the DX levels, they tend to use shaders more.

Dave, that line doesn't work this time. The 5850 used in this test had 60% more fillrate and 56% more texture rate than the 260. It's the bandwidth advantage that's only 20%, but AMD's stuff is supposed to be vastly more bandwidth efficient, no? :D
 
There are a couple of things that stood out to me. The first thing is just how many games are seeing high levels of performance even on cheapo hardware like a 260. Except for a few titles, software really is lagging. Makes you wonder why we're constantly waiting impatiently for the latest and greatest hardware.

For insane levels of AA/AF. I wouldn't mind seeing the option of x32/64 AF even if the quality improvements are minimal and not noticeable to most.

As for benchmarks, what's the point of sticking with 4xAA once you're over 100 FPS? Bump it up to 8xAA, for Christ's sake.

While I'm not always fond of Hard[OCP]'s method of reviewing, I do like the fact that they bump up AA/AF whenever you start getting into silly FPS ranges...

Comparing at 4xAA when your card is doing 100-160 FPS is just silly. Bump it up to 8xAA please... After that, unfortunately, there are no apples-to-apples comparisons.

Still wouldn't mind them kicking it into SSAA/Edge Detect (12/24) for ATI hardware or something...
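
The "bump up the settings once FPS gets silly" idea is basically Hard[OCP]'s highest-playable-settings approach. A minimal sketch of that logic, entirely my own illustration (the AA levels and FPS numbers are made up):

```python
# Sketch: instead of reporting 150 FPS at a fixed 4xAA, keep raising the
# AA level while the measured average FPS stays above a playability target.

AA_LEVELS = [0, 2, 4, 8]  # hypothetical MSAA levels exposed by a game

def highest_playable_aa(fps_at_aa, target_fps=60):
    """fps_at_aa maps an AA level to measured average FPS (assumed
    non-increasing as AA rises).  Returns the highest level that still
    meets the target, or None if even no-AA misses it."""
    best = None
    for level in AA_LEVELS:
        if fps_at_aa.get(level, 0) >= target_fps:
            best = level
    return best

# Made-up measurements: everything is playable, so report the 8xAA result.
measured = {0: 160, 2: 130, 4: 105, 8: 70}
assert highest_playable_aa(measured) == 8
```

The point of the approach is that the reported setting, not just the FPS number, becomes the comparison between cards.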

BTW - what's with VR-Zone constantly trying to load something in my browser? Noticed that when I looked at PSO's benches. Click...click...click...click...click...click...

Regards,
SB
 
Well of course it's not absolute. But I thought your post was an attempt to explain the absolute numbers in psolord's analysis.

For insane levels of AA/AF. I wouldn't mind seeing the option of x32/64 AF even if the quality improvements are minimal and not noticeable to most.

Yeah but that's the equivalent of spinning your wheels for most people. What we need is for developers to start using the available horsepower to generate tangible improvements in IQ.
 
I'm hoping the refreshes in June-ish bring 5850 performance to the $200 range. Though if R900 is due in the fall I may wait; depends on what SW:TOR does to my 4850. But man, I'd love to have 3-monitor support
 
For insane levels of AA/AF. I wouldn't mind seeing the option of x32/64 AF even if the quality improvements are minimal and not noticeable to most.
Yeah but that's the equivalent of spinning your wheels for most people. What we need is for developers to start using the available horsepower to generate tangible improvements in IQ.

Agreed, higher than 16xAF is wasted on me, and I'm about as big an IQ whore as can be.

If devs aren't going to increase the quality of assets past console levels, the least they could do is support some fancier AA modes (better compatibility with HDR, better compatibility with deferred shading, selective supersampling, etc.)
 
I'm hoping the refreshes in June-ish bring 5850 performance to the $200 range. Though if R900 is due in the fall I may wait; depends on what SW:TOR does to my 4850. But man, I'd love to have 3-monitor support

The 5850 itself had better be down to $200 or less by June. Initial MSRP was $259; the prices we see now are vastly inflated due to supply constraints.
 
Following up my previous post...

Did a quick search for the Acer 5942G and got some hits in European stores...

They are listed as available from January 11, 2010; I guess Intel and/or AMD, or Acer itself, want to launch their products at CES beforehand.

From here:
The high-end Mobility Radeon HD 5800 series has been in volume production for a while. CES 2010 will be held January 7-10 in Las Vegas, and the AMD Mobility Radeon HD 5000 series is scheduled for release on the opening day.
From the CES web site, AMD is listed as exhibiting (central plaza and north hall, next to the Nvidia booths), but I could not find an opening-day event scheduled :(

Guess they will launch all three chips to try and maximise the publicity. For anyone attending, don't forget a USB stick full of benchmarking utilities and a friend to distract the staff for 10 minutes ;)
 
The 5850 itself had better be down to $200 or less by June. Initial MSRP was $259; the prices we see now are vastly inflated due to supply constraints.

I hope so. It will be interesting to see what happens. Right now there just isn't enough to justify upgrading.
 
Ah cool. Thanks for the explanations guys.

@Dave. Is there a "default" profile in Catalyst, that specifies how the card should behave, if no specific profile is found?

I am asking this because today I saw something strange, once more. I've been trying a little game called Greed - Black Border, and even though the game started OK, early in the game I saw the framerate drop to 40fps and once again started wondering what the problem was.

I headed back to the GPU usage meter of MSI Afterburner and saw it hovering around 30%. The card was at stock clocks.

So why, oh why, do I have an unplayable framerate in a simple game while the card operates at 30%? The CPU is very relaxed too.
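
The reasoning here, low FPS with both the GPU and the CPU mostly idle, is itself diagnostic. A toy heuristic capturing it, my own illustration with arbitrary thresholds (real profiling needs proper tools):

```python
# Toy heuristic: classify the likely bottleneck from framerate and
# utilisation figures like the ones reported above.  The thresholds
# are invented round numbers, not measured cut-offs.

def likely_bottleneck(fps, gpu_util, cpu_util, target_fps=60):
    """gpu_util / cpu_util are fractions in [0, 1]."""
    if fps >= target_fps:
        return "none"
    if gpu_util > 0.9:
        return "GPU-bound"
    if cpu_util > 0.9:
        return "CPU-bound"
    # Low FPS while both GPU and CPU sit mostly idle points elsewhere:
    # driver overhead, synchronisation stalls, or the application itself.
    return "driver/sync or application"

# The situation described above: 40 FPS at ~30% GPU, relaxed CPU.
assert likely_bottleneck(40, 0.30, 0.20) == "driver/sync or application"
```

Which is why a case like this tends to implicate the driver or the game's own code rather than the raw hardware.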

Here's a pic of the game at a section where it showed 40fps. The framerate is constantly bad after that part. Sorry for the large size, but I wanted it to be as clear as possible.



There is no AA option, so I didn't select anything strange there, although I found some unusual options like:

Vertex processing:
-Software only
-Soft & Hardware
-Hardware only

FF-Pipeline:
-Don't emulate
-Emulate with shader if necessary
-Always emulate with shader

The performance is bad no matter what selection I make. Software gives the worst results: the framerate drops to 20fps and GPU usage to 18%!

Sorry for asking these questions here, but I believe they are relevant to R8XX and how it operates! I mean, sorry, but I can't excuse the fact that the second-fastest single-GPU card on the planet gives unplayable framerates in a silly game like this.:S
 
It's probably an error on the programmer's side. Have you looked at the minimum requirements? A GeForce 6200 for Nvidia users or a 9700 Pro for ATI users :weird:
 
So did you set vertex processing to "hardware only" and FF Pipeline to "Always emulate with shader"?

Yes sir. Bad performance!


No Catalyst AI and no .exe renaming does any good either.


Neliz, I could agree that this is a developer's fault, but the worrying thing is that the faltering developers seem to be quite a few and ever increasing...!:???:

Ok, I have a stupid question. Is there a possibility that Cypress is a dual-core chip with internal Crossfire that needs driver instructions to function correctly?

I remember a chip layout I had seen that showed the processing cores in two large groups!:?:
 
Which game is this? If there's a demo or something, I could d/l it and run it on my 8800 GT and at least help you figure out if it's an ATi-specific issue.
 
Here's a link for two download locations of the demo.

Thanks. I'll try it out after dinner and let you know how it goes. It's 3:30 now and it's snowing out so it may be a few hours before I get home, go shopping, make dinner, and get to the PC but I'll let you know when I do.
 
Ok, I have a stupid question. Is there a possibility that Cypress is a dual-core chip with internal Crossfire that needs driver instructions to function correctly?

Actually... it has 1600 cores, but don't tell Nvidia that!
 