AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within couple months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters: 155
  • Poll closed.
In T3, Skynet was a peer-to-peer program that infected all the computers on Earth and used that processing power to launch the attacks and do whatever it needed to do.


What's the petaflop rating of all the PCs, laptops, iPhones, smartphones, GPS units, DS and PSP handhelds, consoles, and so on in the world?

Of course, when T1 was first made back in the '80s, a teraflop was huge. It was the '80s, after all.

It does spread out by the end of the film, but when it's first activated in T3, presumably on some central server, it's quoted as operating at around 550 TFLOPS, I think. So it must have become self-aware at that level before deciding to spread out across the web.
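As a rough, back-of-the-envelope stab at the petaflop question above, here's a minimal sketch. Every device count and per-device rating in it is an assumed, order-of-magnitude illustration rather than a sourced figure, so only the broad conclusion (worldwide consumer hardware utterly dwarfs 550 TFLOPS) should be taken seriously.

Code:
# Back-of-envelope comparison of worldwide consumer compute against the
# ~550 TFLOPS quoted for Skynet in T3. All counts and per-device GFLOPS
# figures below are illustrative assumptions, not sourced data.
assumed_devices = {
    # category: (assumed unit count, assumed GFLOPS per unit)
    "PCs and laptops":      (1.0e9, 10.0),
    "games consoles":       (1.0e8, 100.0),
    "phones and handhelds": (2.0e9, 0.5),
}

total_gflops = sum(count * gflops for count, gflops in assumed_devices.values())
total_tflops = total_gflops / 1e3
total_pflops = total_gflops / 1e6
skynet_tflops = 550.0

print(f"Assumed worldwide total: ~{total_pflops:,.0f} PFLOPS ({total_tflops:,.0f} TFLOPS)")
print(f"Roughly {total_tflops / skynet_tflops:,.0f}x the 550 TFLOPS quoted for Skynet")

With those made-up numbers it comes out around 20,000 PFLOPS, i.e. tens of thousands of times the quoted Skynet figure, which is the point of the joke about it spreading out across the web.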
 
Same here: GTX 285 @ stock, 1920x1200 windowed, 4xAA/16xAF, max settings. Nothing changes performance much, though; disabling AA/AF or lowering all settings doesn't improve things a whole lot. Doesn't matter since the game is shite IMO.

Red is default vertex/ff settings, Blue is forced hardware vertex/ff.

[attached chart: greedf.png]

Ah cool. That's refreshing! So there's nothing wrong with the architecture of my uber card. :D

Actually, I contacted the developers and they said that the problem is my CPU. When I asked how my Q9550 @ 4.0GHz could be at fault here, they said that Greed is strictly single-threaded on the CPU. I guess that explains why I get 25% CPU usage, and why the graphics card can't stretch its legs and deliver more frames than it currently does.

I am beginning to think that there are more CPU-limited games in my benchmarks than I initially thought.
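For what it's worth, 25% is exactly what one fully loaded thread looks like on a four-core CPU: one core pegged, three idle. As a minimal sketch of how one might confirm that kind of bottleneck while a game is running, here's a hypothetical Python snippet using psutil to watch per-core load; the sampling interval and "busy" threshold are arbitrary choices, not anything the developers specified.

Code:
# Hypothetical check: sample per-core CPU load while the game runs.
# One core pinned near 100% with the rest idle suggests a single-threaded,
# CPU-bound game, which caps how hard the GPU can be pushed.
import psutil

def sample_per_core(samples=10, interval=1.0, busy_threshold=90.0):
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        busy = [i for i, load in enumerate(per_core) if load >= busy_threshold]
        print(f"per-core load: {per_core}  |  busy cores: {busy}")

if __name__ == "__main__":
    sample_per_core()

On a quad like the Q9550, one busy core out of four averages out to about 25% overall in Task Manager, which matches the figure above.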

It's really sad, though, to see your expensive hardware performing at 25% while you're stuck with a low framerate and unable to do anything about it.

I have to give it to Clockstone, though: they have the fastest support I have ever come across in my whole PC lifetime. The game is not too bad either; it just needs some refinement.

Oh, btw, regarding Trackmania: I uninstalled OpenAL and nothing changed. I also set my ATI card as the primary sound output, bypassing the mobo's onboard sound chip, and re-ran the benchmark. Again, the same results were produced.

I cannot say that the CPU is the primary suspect here, though, since my GTX 260 gave better results with the same CPU!
 
In the first Terminator, Arnie runs on a 6502 processor, as evidenced by the code listings in his vision. A popular CPU among self-aware robots.

[attached image: Bender_6502.jpg]
 
Sorry for not responding earlier; I couldn't get the game to download, but I think that's a problem with the machine I tried to use.
 
It does spread out by the end of the film, but when it's first activated in T3, presumably on some central server, it's quoted as operating at around 550 TFLOPS, I think. So it must have become self-aware at that level before deciding to spread out across the web.

I haven't seen the movie in a while, but if I recall, it was already infecting other computers before it took over the military.

Also, didn't the robot girl sent back help Skynet take over? Isn't that what was happening? Hmm, I think I have the DVD somewhere. I might watch it tonight or tomorrow.
 
I haven't seen the movie in a while, but if I recall, it was already infecting other computers before it took over the military.

Yes. It was actually being run by the military and had taken over many systems. The critical moment was when the military put Skynet in charge of every one of their systems (which was supposed to be its main function).

Also, didn't the robot girl sent back help Skynet take over? Isn't that what was happening? Hmm, I think I have the DVD somewhere. I might watch it tonight or tomorrow.

Her primary mission was to kill all the key people who ran the resistance with John Connor. Connor himself was a higher-priority target of opportunity, but he'd dropped out of sight and Skynet didn't know where to find him at that time. The terminatrix only ended up at the Skynet research facility because she was chasing Connor and his future wife to her father's office (he was in charge of the whole Skynet project).
 
My 5850 has been delayed yet again, this time until February!!! That's the final straw; I've cancelled the order and intend to wait for Fermi now, since by then we should certainly have some solid information and possibly be only a few weeks away from availability.
Ironically, though, this is probably a good thing for ATI, since I've decided to get a 4890 to tide me over :) that's if dabs.com finish screwing up my order!!
 
My 5850 has been delayed yet again, this time until February!!! That's the final straw; I've cancelled the order and intend to wait for Fermi now, since by then we should certainly have some solid information and possibly be only a few weeks away from availability.
Ironically, though, this is probably a good thing for ATI, since I've decided to get a 4890 to tide me over :) that's if dabs.com finish screwing up my order!!

Overclockers have got Powercolor 5850s listed as in stock.
 
Overclockers have got Powercolor 5850s listed as in stock.

Cheers, but the price is too steep for me there. The one at Dabs is overclocked and sub-£200 with CM: Dirt. Besides, I can't really afford an upgrade at the moment, and the only things that pushed me to do it were a full HDD, Win7 and my broken GTS 640. I figure I can spend a modest ~£190 now for a new hard drive plus a pretty serious GPU update and build my Win7 system around that.

So I'll still be stuck on a C2D, but this system might just tide me over until Sandy Bridge and the Fermi refresh. Worst case, it will see me through until I have more money and can afford a cheap Nehalem and a Fermi derivative.
 
Cheers, but the price is too steep for me there. The one at Dabs is overclocked and sub-£200 with CM: Dirt. Besides, I can't really afford an upgrade at the moment, and the only things that pushed me to do it were a full HDD, Win7 and my broken GTS 640. I figure I can spend a modest ~£190 now for a new hard drive plus a pretty serious GPU update and build my Win7 system around that.

Yeah, Dabs can offer a lower price to keep you hanging on; it makes no difference to them if they haven't got the stock. OCUK's prices have gone up since launch. My friend's build got a 5850 at launch for £199. I can't believe they have gone up so much due to the shortage; part of it must be retailer price gouging.


So I'll still be stuck on a C2D, but this system might just tide me over until Sandy Bridge and the Fermi refresh. Worst case, it will see me through until I have more money and can afford a cheap Nehalem and a Fermi derivative.

C2Q isn't too bad if you overclock a bit. Price/performance is still pretty good, mainly because i7 CPUs/motherboards/RAM are still carrying a significant price premium over last year's C2Q gear (last time I checked).
 
C2Q isn't too bad if you overclock a bit. Price/performance is still pretty good, mainly because i7 CPUs/motherboards/RAM are still carrying a significant price premium over last year's C2Q gear (last time I checked).

My $169 Q9550 @ 4GHz in conjunction w/my OC'd GTX 285 still plays every game on the market w/max. settings @ 1080P. So yes, overclocking is your friend :)
 
C2Q isn't too bad if you overclock a bit. Price/performance is still pretty good, mainly because i7 CPUs/motherboards/RAM are still carrying a significant price premium over last year's C2Q gear (last time I checked).

It's a 2.4GHz duo though ;) It's still fine for the vast majority of games, but there are quite a few these days which benefit greatly from a quad, some to the point where the duo holds the game back to a noticeable degree regardless of GPU power.
 
It's a 2.4GHz duo though ;) It's still fine for the vast majority of games, but there are quite a few these days which benefit greatly from a quad, some to the point where the duo holds the game back to a noticeable degree regardless of GPU power.


Yeah, that's why I'm sneakily suggesting you upgrade to a quad if your motherboard can take one as a drop-in replacement! ;)
 
If you have a duo, the quad-core Q9550 is a cheap and fast upgrade. It runs at 2.83GHz by default, but they all hit over 3.5GHz on OC, and there are many new games out that will greatly benefit from the extra cores. Dragon Age is one of them.
 
So we presume the rest of the RV8xx lineup will launch (some of it paper) at CES?

Well, the twisted reviews people have the specs for the various Acer 8942Gs and the Aspire 7740G, and they contain 5650s with availability in January. Intel likely won't let them release any earlier even if they were ready.

All of the above seem to be shipping with DDR3. The 8942G supposedly has availability this week; I'm guessing they mean only the i7 model with the 5850.

So I guess they'd have to launch Broadway (Juniper) and Madison (Redwood) then; no choice really. Thus they're not likely to leave Park (Cedar) sitting on the shelf on its own; it's too much hassle/too expensive to hold a whole separate event for it only a month or two later.
 
I wonder if the current chips are designed so that, once the next-generation GPU architecture comes out, the current 5xxx chips can slot in a tier down in the new lineup after a shrink.

It seems that both Juniper and Cypress have the die area to allow for a wider bus than ATI opted for. So perhaps that was because they wanted to shrink the current designs to slot in under the next-generation architecture: the HD 58xx would become the HD 67xx, the HD 57xx would become the HD 66xx, and so on.

I also suspect that whatever R9xx architecture they have in the works is going to be released at the high end first and will form the 68xx range of cards, but from the general speculation here I feel they may not release R9xx derivatives until 2011.

Lastly, I wonder if they are going to release an HD 5890-style refresh, like they did with the RV790, to combat the Fermi architecture when the time comes. If they do, then perhaps they could fit a wider bus and go for slightly higher clock speeds, as they certainly have some headroom in both departments if they are willing to tolerate the thermals of a single-GPU design. There's no shame in an ATI Ultra, right?
 