The LAST R600 Rumours & Speculation Thread

(unless you're a particularly ugly example that happens on rare occasions, but I really don't see any reason yet to go there)
Yeah, I don't think so either. I still think that the main problem that ATI is facing is the loss in marketshare and mindshare from nVidia having the clear lead for so long.
 
Not impossible, but now we're back to my original point... usually stuff is late so it won't be broke. :smile: Why pay the price of lateness if you don't get the benefit of not-brokeness?
Suppose (hypothetically) you found yourself having to choose between three options:

A: On time, but so hopelessly broken that no PC fitted with the product would even boot to Windows.
B: Partially fixed and 6 months behind the competition.
C: Completely fixed and 18 months behind the competition.

Under those circumstances don't you think it would be highly likely that a company would go for option B rather than option C?
 
I'd put 3 months for B and 6-8 for C, then it might be more realistic.

OT: geo, ...err, Geo, what's up with that capital G now? l-b will go nuts over this methinks :LOL:
 
I'd put 3 months for B and 6-8 for C, then it might be more realistic.

OT: geo, ...err, Geo, what's up with that capital G now? l-b will go nuts over this methinks :LOL:

Names are changing all over the place. :smile:
 
Do you have any solid reason to believe that a hypothetical G81 is going to close the bw deficit (as opposed to marginally narrowing it)? I don't. So I think such a scenario, should it come to pass, could have variable results across different settings. We'll have to see first with R600 exactly how far up the resolution/settings food chain you have to go for the bw advantage to start to shine.

Edit: Of course, having said that, looking at that ROP/memory implementation they did, I don't see any particular theoretical challenge for them to step up to 512-bit if they are willing to spend the silicon. It's obviously modular, as GTS attests. But are they willing? And at what process will they refresh: 90, 80, or 65nm --and what are the implications for timing?
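
Just to put toy numbers on that food-chain point, here's a quick back-of-the-envelope sketch (Python; the overdraw and fps figures are made up, and it ignores compression and texture traffic entirely) of how raw framebuffer demand scales with resolution and AA:

Code:
# Toy model of raw framebuffer traffic: colour write + Z read + Z write
# per sample, times overdraw, times fps. No compression, no texture or
# geometry traffic -- purely illustrative, not vendor numbers.
def fb_traffic_gbs(width, height, aa_samples, overdraw=3.0, fps=60):
    bytes_per_sample = 4 + 4 + 4  # 32-bit colour + 32-bit Z read/write
    per_frame = width * height * aa_samples * bytes_per_sample * overdraw
    return per_frame * fps / 1e9

for (w, h), aa in [((1280, 1024), 1), ((1600, 1200), 4), ((2560, 1600), 4)]:
    print(f"{w}x{h} {aa}xAA: {fb_traffic_gbs(w, h, aa):.1f} GB/s")

Crude as it is, it shows demand growing more than tenfold between 1280x1024 no-AA and 2560x1600 4xAA, which is exactly why the bw advantage should only start to shine near the top of the food chain.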


I've no idea whether a "G81" even exists, let alone what it would stand for, but it doesn't take much effort or special inside sources to recognize that an 8 ROP-partition / (possibly) 10-cluster refresh chip (well, I'd rather guess G9x here) would have, or rather would need, a 512-bit bus. Assuming they truly go that type of route in the foreseeable future, the smallest available high-performance process would also help increase clockspeeds at the same time.
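
FWIW the raw arithmetic behind that is trivial; a quick sketch (Python again; the per-pin data rates are hypothetical placeholders, not leaked specs):

Code:
# Peak theoretical bandwidth = bus width in bytes x effective data rate.
def peak_bw_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

for bits, rate, note in [(384, 1.8, "e.g. 384-bit GDDR3 @ 900MHz"),
                         (512, 1.8, "same memory on a 512-bit bus"),
                         (512, 2.2, "512-bit with faster GDDR4")]:
    print(f"{bits}-bit @ {rate} Gbps/pin: {peak_bw_gbs(bits, rate):.1f} GB/s ({note})")

Whether that extra ~30-50 GB/s would actually show up in framerates is of course the open question.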
 
Tom's Hardware:

http://www.tomshardware.co.uk/2007/02/02/amd_r600/

Chicago (IL) - AMD's graphics division, formerly known as ATI, is gearing up for the launch of its next-generation graphics card, code-named R600. Industry sources told TG Daily that the company will be holding a "Tech Day" event from March 11 to March 13, in Amsterdam to brief press and analysts about its new technology.

ATI's tech days historically have been rather small events, but the merger with AMD apparently enabled the company to expand its briefing: We hear that AMD will be flying in journalists from 45 countries. Expect the R600 to launch at the very end of Q1.
If it's in Amsterdam then there won't be any need for NDAs, because the journos will all be too stoned to remember anything about the event anyway....
 
I've no idea whether a "G81" even exists, let alone what it would stand for, but it doesn't take much effort or special inside sources to recognize that an 8 ROP-partition / (possibly) 10-cluster refresh chip (well, I'd rather guess G9x here) would have, or rather would need, a 512-bit bus. Assuming they truly go that type of route in the foreseeable future, the smallest available high-performance process would also help increase clockspeeds at the same time.

Sure, but I'm not seeing that in April, are you? Maybe in the summer, or fall.
 
I've no idea whether a "G81" even exists, let alone what it would stand for, but it doesn't take much effort or special inside sources to recognize that an 8 ROP-partition / (possibly) 10-cluster refresh chip (well, I'd rather guess G9x here) would have, or rather would need, a 512-bit bus. Assuming they truly go that type of route in the foreseeable future, the smallest available high-performance process would also help increase clockspeeds at the same time.

I'm betting that Nvidia will play with die shrinks, core clockspeed(s) and memory type/speed for "G81", just like they did with the G70/G71 transition.
Further additions to core functionality and extra execution units/ROP partitions will probably be reserved for a future, broader update ("G90"?).

Frankly, I believe that will be more than enough to stay -minimally- competitive with R600 for the next 10/12 months or so.
They didn't exactly get hurt by the G71 vs R580 comparison during all that time, and now G8x has much better base features than anything before it in NV's range.

The ball is in ATI's hands now. Let's hope they apply some pressure, prices have to come down.
 
Sure, but I'm not seeing that in April, are you? Maybe in the summer, or fall.

Pick which answer is more likely and then you'll have your answer as to when the entire G8x line truly needs more bus width. The common mistake most of us make is that we rely on way too simple speculative equations; I'm sure IHVs conduct endless series of tests before they define what is good enough and for what exactly.

As for ATI on the other hand: assume the bus width for the high end is, let's say, 6 months or so ahead of its time (always in a relative sense)... so what? Wouldn't they on the other hand have theoretically better chances of a more efficient kickstart in the mainstream department if midrange G8x initially comes along with only 128 bits? I'm just picking stuff out of the air here, but it could very well be that there are immediate advantages to their design choices that we haven't thought of yet.
 
I'm betting that Nvidia will play with die shrinks, core clockspeed(s) and memory type/speed for "G81", just like they did with the G70/G71 transition.
Further additions to core functionality and extra execution units/ROP partitions will probably be reserved for a future, broader update ("G90"?).

Frankly, I believe that will be more than enough to stay -minimally- competitive with R600 for the next 10/12 months or so.
They didn't exactly get hurt by the G71 vs R580 comparison during all that time, and now G8x has much better base features than anything before it in NV's range.

The ball is in ATI's hands now. Let's hope they apply some pressure, prices have to come down.

Agreed with most of the above, except the last sentence; a healthy price drop from NV's side is IMHO, for the next couple of months, clearly mostly to their advantage.
 
Fuad confirming CJ's info. Plus:
Some Nvidia green is about to get smoked as we heard R600 with some software tweaks will actually be faster than we all anticipated.
And AIBs are happy enough with a GDDR3 version to start the hype train? Choo choo? ;)
 
As for ATI on the other hand: assume the bus width for the high end is, let's say, 6 months or so ahead of its time (always in a relative sense)... so what? Wouldn't they on the other hand have theoretically better chances of a more efficient kickstart in the mainstream department if midrange G8x initially comes along with only 128 bits? I'm just picking stuff out of the air here, but it could very well be that there are immediate advantages to their design choices that we haven't thought of yet.
Sorry, are you saying that giving the high-end product more bandwidth than it actually needs somehow gives the midrange product a competitive advantage? :unsure:
 
With some weed I think anyone/thing can get quite smoked... maybe that's Fudo's idea: software as in ATi/AMD dealing soft-core recreational weed to NV and thus "smoking" them? :D
 
Incidentally there's an interesting little post scriptum on one of those Inq links:
The first R600 with GDDR3 memory was a huge card, the biggest ever built but the GDDR4 version will end up a bit shorter. The first R600 was significantly longer than Nvidia's G80.
That might suggest that the 12" or 13" cards that have been rumoured were actually only pre-production prototypes, and that the actual production GDDR4 cards are all sensible sizes...?
 