AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within a couple of months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters
    155
  • Poll closed.
Thanks. I'll try it out after dinner and let you know how it goes. It's 3:30 now and it's snowing out so it may be a few hours before I get home, go shopping, make dinner, and get to the PC but I'll let you know when I do.

Thanks man. Take your time. This is a forum. Responses do not have a strict time limit, heheh. Thanks for caring! ;)

(Ooooh, snow, I'm jealous :p)


Actually.. it has 1600 cores, but don't tell nvidia that!


Well, you know what I meant. :p With just two cores it could only run Pac-Man!

I don't know. I'm getting suspicious. I remember seeing this kind of behavior with my 4870X2 when a game was not supported. Take Prototype for example. It was running at 40fps and it had like 25% gpu usage. The same with Ghostbusters.

How sure are we that Cypress is not just two RV770s glued together with a Crossfire bridge in between, with some additions/improvements? And if it is, why does it need drivers to function?

It may be a coincidence, but the performance increase of Cypress over the RV770 is about 60%, and that's about what Crossfire gives you. Some people say it's the bandwidth, some say it's the scheduler, or just that ATI's architecture can't go any further! I'm probably paranoid and don't know what I'm talking about here though! :rolleyes:
 
psolord said:
I don't know. I'm getting suspicious. I remember seeing this kind of behavior with my 4870X2 when a game was not supported. Take Prototype for example. It was running at 40fps and it had like 25% gpu usage. The same with Ghostbusters.
This is not a valid comparison. You can't make assumptions about an architecture solely based on an invalid hypothesis. For example, what does 4870x2 scaling in Ghostbusters or Prototype have to do with Cypress performance in Greed? Performance with different cards in different games isn't really comparable.
psolord said:
How sure are we that Cypress is not just two RV770s glued together with a Crossfire bridge in between, with some additions/improvements? And if it is, why does it need drivers to function?
Where did you get this idea?! Cypress is a single ASIC, no Crossfire support needed.
psolord said:
It may be a coincidence, but the performance increase of Cypress over the RV770 is about 60%, and that's about what Crossfire gives you. Some people say it's the bandwidth, some say it's the scheduler, or just that ATI's architecture can't go any further! I'm probably paranoid and don't know what I'm talking about here though! :rolleyes:
Something in this quote is correct ;)
 
Thanks. I'll try it out after dinner and let you know how it goes. It's 3:30 now and it's snowing out so it may be a few hours before I get home, go shopping, make dinner, and get to the PC but I'll let you know when I do.


If you want to play with hardware, let me know, and I can probably set up a system for you to screw around with if you want to. I know you are local to me, so it probably wouldn't be much of a hassle.

-Charlie
 
This is not a valid comparison. You can't make assumptions about an architecture solely based on an invalid hypothesis. For example, what does 4870x2 scaling in Ghostbusters or Prototype have to do with Cypress performance in Greed? Performance with different cards in different games isn't really comparable.

It may not be a valid comparison, but I see a pattern here; that's why I am asking.

Where did you get this idea?! Cypress is a single ASIC, no Crossfire support needed.

I got this idea after seeing games like Trackmania, Oblivion, FUEL, Jericho, UT3 and others not performing anywhere near what one would expect. I actually saw, and video recorded, very low GPU usage in most of them.

So if you run Trackmania, for example, and see 40% GPU usage, what conclusions can you draw about a single-core product? What if you see this pattern appearing again in new games?

It's good to be reassured by someone who knows these things, though! ;)
 
I think it's completely paranoid and the developer is at fault here.

Again, what the hell is this developer doing to make his game run smoothly on a 6200, while you need at least a 9700 Pro to participate if you're an ATI user?

What kind of CPU are you actually using? What are the numbers on that? The description you're giving suggests you're CPU limited. (Wait, I can look at your vr-zone post!)

I'm trying to compare your results to THIS review, which seems to have almost the same setup (Q9550 @ 4.12).

What I did read about Trackmania is that there are (were?) issues with Asus boards and the OpenAL driver. Disable audio in your BIOS and uninstall the OpenAL driver to see if that affects performance in any way.
 
Ah cool. Thanks. This is a nice benchmark suite and from what I see, our results are pretty close.

I will try disabling the sound, but I also used Steam's Trackmania, which is supposed to keep games up to date, and the results were the same. If there was some issue with the sound, wouldn't they have fixed it by now?
 
GPU usage is a vague concept to me. Far Cry 2 will see GPU usage at a constant 100% during the benchmark, but Crysis at a varying 60-80% will still heat up the card more. Same with ATITool and FurMark: both at 100%, but FurMark will be about 10C ahead. :?:
Doesn't polling the hardware during gaming have a detrimental effect on performance? I remember when the 48xx cards came out there were complaints about stuttering in some games when GPU-Z ran in the background.
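For what it's worth, here's a minimal sketch of what a usage meter like GPU-Z or Afterburner is doing under the hood. It's written against NVIDIA's NVML library purely because that's a public API I can show (an assumption for illustration; the ATI tools go through their own vendor interfaces). The reported number is "fraction of the sample window the GPU had work queued", not how hard the shader units are pushed, which is why two workloads can both read 100% yet heat the card differently. A once-per-second poll like this is essentially free; the stuttering people saw presumably came from tools querying the driver far more aggressively than that.

```cpp
// Minimal GPU-utilization poller. Assumes NVIDIA's NVML (nvml.h, link with
// -lnvidia-ml); shown only to illustrate how such meters work in general.
#include <cstdio>
#include <nvml.h>
#include <unistd.h>   // sleep()

int main()
{
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) return 1;

    for (int i = 0; i < 10; ++i)
    {
        nvmlUtilization_t util;
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS)
        {
            // util.gpu = percent of the last sample period in which the GPU
            // was executing work ("busy time"), not ALU load.
            std::printf("GPU busy: %u%%  memory busy: %u%%\n", util.gpu, util.memory);
        }
        sleep(1);   // a 1 Hz poll is cheap; hammering the driver is what hurts
    }

    nvmlShutdown();
    return 0;
}
```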
 
Fud is predicting a 5890 to fight against nv360.
When will R900 be announced, to fight against Skynet?

Correct me if I'm wrong, but wasn't Skynet quoted in T3 as processing at around 550 TFLOPs "per second"? :LOL:

If that's the case, then we already have Skynet-smashing supercomputers today.
 
I've been trying a little game called Greed - Black Border, and even though the game started OK, early in the game I saw the framerate drop to 40fps and once again started wondering what the problem was.

I headed back to the GPU usage meter of MSI Afterburner and saw it hovering around 30%. The card was at stock clocks.

So why, oh why, do I have an unplayable framerate in a simple game while the card operates at 30%? The CPU is very relaxed too.
It's an old game. It's likely that none of the cards ran it faster than 30 fps at its launch, so the developer didn't test the scaling beyond that point. A good explanation could be that they are, for example, updating their vertex/texture data per frame on the CPU and locking some graphics resources. DX cannot lock a resource that is in use by the graphics card, so it has to wait. Many old games have problems like this (locking in-use resources wasn't that critical when the cards were slow).
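To make that concrete, here is a minimal sketch of the locking pattern described above, in D3D9-style C++ (the device/buffer names and the per-frame update are made up for illustration): a plain Lock() on a buffer the GPU is still drawing from stalls the CPU until the GPU is done with it, whereas D3DLOCK_DISCARD on a dynamic buffer lets the driver hand back fresh memory immediately so the two don't serialize.

```cpp
// Hypothetical per-frame vertex update in a D3D9 renderer. g_device and g_vb
// are assumed to exist; g_vb created with D3DUSAGE_DYNAMIC in D3DPOOL_DEFAULT.
#include <windows.h>
#include <d3d9.h>
#include <cstring>

extern IDirect3DDevice9*       g_device;
extern IDirect3DVertexBuffer9* g_vb;

void UpdateVertices(const void* src, UINT bytes)
{
    void* dst = NULL;

    // Problematic pattern: Lock() with no flags. If the GPU is still drawing
    // from this buffer, the call blocks until it finishes, serializing CPU
    // and GPU -- low frame rate despite low "GPU usage".
    // g_vb->Lock(0, bytes, &dst, 0);

    // Friendlier pattern: D3DLOCK_DISCARD says we don't need the old
    // contents, so the driver can return a fresh region at once and let the
    // GPU keep reading the previous one.
    if (SUCCEEDED(g_vb->Lock(0, bytes, &dst, D3DLOCK_DISCARD)))
    {
        std::memcpy(dst, src, bytes);
        g_vb->Unlock();
    }
}
```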
 
It's an old game. It's likely that none of the cards ran it faster than 30 fps at its launch, so the developer didn't test the scaling beyond that point. A good explanation could be that they are, for example, updating their vertex/texture data per frame on the CPU and locking some graphics resources. DX cannot lock a resource that is in use by the graphics card, so it has to wait. Many old games have problems like this (locking in-use resources wasn't that critical when the cards were slow).

While I was reading your response, I thought, OK, Trackmania is old, but what about Greed? Lol, then I realised you were talking about Greed. Well, Greed just got released (late November 2009)!

Overall your explanation holds solid ground for most old games I guess.

Anyway, I will send them an email instead of busting your balls, guys. Yet I insist: things could be better. Oh what the heck, even AMD said that the 5800 series is not yet at full speed, so I guess in a couple of months everything will be resolved! Maybe they are waiting for Fermi before releasing some Bag Bing drivers of their own (as per Nvidia's Big Bang! :oops:)
 
Err what?
(the 5570 has 2GB of GDDR5. What the hell, who are they bluffing anyway)
[attached screenshot: 9dec0998y4tbx-1260357593.jpg]
 
Err what?
(the 5570 has 2GB of GDDR5. What the hell, who are they bluffing anyway)
Where does it say GDDR5? I think 2GB would be a strong hint of a 128-bit DDR3 interface, though in this case it would need to be some Redwood-based card. Maybe a 5570 instead of a 5650? Or one with a couple of disabled units?
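Quick back-of-envelope on why 2GB points that way, using chip densities I'm assuming for late 2009 (1 Gbit, 32-bit-wide GDDR5 parts vs. 2 Gbit, 16-bit-wide DDR3 parts; treat the numbers as illustrative, not read off the card):

```cpp
// Rough capacity check: capacity = chips_on_bus * capacity_per_chip,
// where chips_on_bus = bus_width / chip_io_width (single rank, no clamshell).
#include <cstdio>

int main()
{
    const int bus_bits = 128;

    // Assumed 1 Gbit (128 MB) GDDR5 chips with a 32-bit interface.
    const int gddr5_mb = (bus_bits / 32) * 128;   // 4 chips -> 512 MB

    // Assumed 2 Gbit (256 MB) DDR3 chips with a 16-bit interface.
    const int ddr3_mb  = (bus_bits / 16) * 256;   // 8 chips -> 2048 MB

    std::printf("128-bit GDDR5: %d MB, 128-bit DDR3: %d MB\n", gddr5_mb, ddr3_mb);
    return 0;
}
```

So hitting 2GB with GDDR5 of that density would take a 16-chip clamshell layout, which seems unlikely on a budget board, while plain DDR3 gets there with eight chips.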
 
Correct me if I'm wrong, but wasn't Skynet quoted in T3 as processing at around 550 TFLOPs "per second"? :LOL:

If that's the case, then we already have Skynet-smashing supercomputers today.

Heh, it would be interesting to have computers that had acceleration in the computations they performed. I'm imagining an engine block with upward-pointing exhaust valves and a pedal, or maybe a horse-drawn Radeon... Maybe if Babbage's Analytical Engine had been mass-produced and yoked to combustion/steam engines with a foot pedal, they'd have a tachometer for the thing that let you accelerate its pace.
 
Correct me if I'm wrong, but wasn't Skynet quoted in T3 as processing at around 550 TFLOPs "per second"? :LOL:

If that's the case, then we already have Skynet-smashing supercomputers today.

In T3, Skynet was a peer-to-peer program that infected all the computers on Earth and used that processing power to launch the attacks and do what it needed to do.


What's the petaflop rating of all the PCs, laptops, iPhones, smartphones, GPS units, DS and PSP handhelds, consoles, and so on and so forth in the world?

Of course, when T1 was first made back in the '80s, a teraflop was huge. It was the '80s, after all.
 
I am asking this because today I saw something strange, once more. I've been trying a little game called Greed - Black Border, and even though the game started OK, early in the game I saw the framerate drop to 40fps and once again started wondering what the problem was.

Same here, GTX 285 @ stock, 1920x1200 windowed, 4x/16x, max settings. Nothing changes performance much though; disabling AA/AF or lowering all settings doesn't improve things a whole lot. Doesn't matter, since the game is shite IMO.

Red is default vertex/ff settings, Blue is forced hardware vertex/ff.

[attached chart: greedf.png]
 