Inquirer spreading R420 info

Can I find it just a bit hilarious, all the assumptions people are making about both IHVs when y'all start extrapolating down-the-road scenarios based on some rather uncertain information?

I don't mean this as a flame, I guess I'm just burnt on the whole second guessing thang when we're about a week away from getting the other side of the story. ;)
 
engall said:
Just like R9800 and R9800se. Do you know what I mean?

yeah, that's why I used " :!: "

I just think it's not proper to call the X800Pro a 180M chip because one of the quads is disabled.
 
digitalwanderer said:
Can I find it just a bit hilarious, all the assumptions people are making about both IHVs when y'all start extrapolating down-the-road scenarios based on some rather uncertain information?

I don't mean this as a flame, I guess I'm just burnt on the whole second guessing thang when we're about a week away from getting the other side of the story. ;)

True, but it is fun and hey, it passes the time! It puts my guesses on the line too; I'll look back at the end of the month and see how close or far I came to any level of accuracy!

All I can say is let ATI prove me wrong; their entire R350 roadmap unfortunately didn't, and I still feel pained!

:(
 
I don't think you are the one who could say that. You have a good record on assumptions, don't you think? People can make assumptions; that doesn't mean you have to agree with them, or that they will turn out to be true (or wrong). I think we will have to wait for the 4th to see what's going on ;)

digitalwanderer said:
Can I find it just a bit hilarious, all the assumptions people are making about both IHVs when y'all start extrapolating down-the-road scenarios based on some rather uncertain information?

I don't mean this as a flame, I guess I'm just burnt on the whole second guessing thang when we're about a week away from getting the other side of the story. ;)
 
engall said:
Druga Runda said:
The question is whether ATI will slap a dustbuster on the XT version to get the clocks that high, or whether they have refined the process so dramatically that they can clock up to 1/3 higher than Nvidia, as seems to be the case.

If so that will be jaw dropping performance, even more so than NV40.

That would leave the XT version with at least 500MHz clocks and a bit more bandwidth to boot.

Somehow it seems too good to be true 8)

But if true NV40 would have to be a lot more efficient to match that - which it doesn't appear to be.

I don't think we have ever had so much hardware power relative to the current crop of games and top-end CPUs as we have now.

AFAIK, the X800XT is much more like the Radeon 9800XT, with a similar cooler and PCB. At GDC the X800XT was 500MHz/1000MHz, so I think the final version will be the same.

uh ah.... :)

I surely cannot wait to see the numbers for that beast... well, the R420 Pro will give us a good guess at how it will fare.

I would imagine that the final clocks/memory will be set as low - i.e. as cheap - as possible while still reassuringly beating NV40 by 10% or more... So if a 500MHz clock is enough for that (and it looks like it will be), 500/1000 sounds reasonable.

But whoaaaa, at least a double-strength R360 @ 500MHz... now that is a monster.
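Rough numbers to put that in perspective: pitting a 16-pipe XT at the rumoured 500/1000 against the 6800 Ultra's 16 pipes at 400MHz, and assuming ~1100MHz effective memory for the Ultra (my assumption, not something confirmed in this thread), a quick Python sketch:

# Back-of-envelope: rumoured X800XT (16 pipes, 500MHz core, 1000MHz effective memory)
# vs 6800 Ultra (16 pipes, 400MHz core, ~1100MHz effective memory - assumed).
def fillrate_mpix(core_mhz, pipes):
    return core_mhz * pipes               # one pixel per pipe per clock -> Mpixels/s

def bandwidth_gbs(eff_mem_mhz, bus_bits=256):
    return eff_mem_mhz * (bus_bits / 8) / 1000.0   # MB/s -> GB/s

xt    = (fillrate_mpix(500, 16), bandwidth_gbs(1000))   # 8000 Mpix/s, 32.0 GB/s
ultra = (fillrate_mpix(400, 16), bandwidth_gbs(1100))   # 6400 Mpix/s, 35.2 GB/s

print(f"XT fillrate advantage: {xt[0] / ultra[0] - 1:.0%}")   # ~25%
print(f"XT bandwidth deficit:  {1 - xt[1] / ultra[1]:.0%}")   # ~9%

Crude, of course - it ignores shader efficiency, compression and the rest - but it shows why 500/1000 looks like plenty of raw fillrate headroom while possibly giving up a little bandwidth.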

NV40 came in with such good results that it is a little hard for me to believe the R420XT might actually beat it, and by a margin at that...

It seems that $499 for this crop suddenly sounds quite good... :) You actually gain quite a bit more by going the last step to the ultimate card, as opposed to the last few generations.

We'll need monitors for those ultra-high resolutions too :)

I just hope ATI delivers again too...
 
I've come to the conclusion that the Inq is like a psychic: they produce every possible scenario, are bound to get some right, and there will always be those who believe it's true, just like a psychic. :oops:

The only thing good about the Inq is its humour value :LOL: :LOL:
 
991060 said:
engall said:
Just like R9800 and R9800se. Do you know what I mean?

yeah, that's why I used " :!: "

I just think it's not proper to call the X800Pro a 180M chip because one of the quads is disabled.
What about the R9800SE?
How many transistors does the R9800SE have?
Do you have any idea?
 
How about some more rumors? Over in the Futuremark forum, under the same topic heading, someone wrote this:
Glock said:
Hmmm..... [suspicious]

"ATI Radeon R420 Graphics ATI X800 Series
Manufacturing Process .13µ
Transistor Count(millions) 160
Core Speed(MHz) ~500
Memory Speed(MHz) ~500
Memory Interface 256bit
Rendering Pipelines 16
TMUs Per Pipeline 1
Peak Memory Bandwidth(GB/s) ~32.0
Pixel Fillrate(million pixels/sec) ~6,400
Texel Fillrate(million texels/sec) ~6,400
DX9 Pixel Shader Version 3.0
Vertex Shaders(version) 8 or 6 (3.0)
Memory Type GDDR3
FP / Internal precision 32
MAX Memory Size 512MB
AA Sample 8x
Native PCI Express Support Yes (R423)
Native AGP 8x Support Yes (R420)
DirectX Version Support 9.0c "

Source: it's a secret. [wink]

What do you all think? And can anyone in the know confirm or deny, or at least give a hint (to you NDA guys), if it's true??
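For what it's worth, the bandwidth line in that table at least hangs together with the memory spec. A quick Python sanity check (assuming the "~500" memory speed is the physical GDDR3 clock, i.e. ~1000MHz effective):

# Does ~500MHz GDDR3 on a 256-bit bus really give ~32 GB/s?
mem_clock_mhz  = 500
effective_mhz  = mem_clock_mhz * 2          # GDDR3 transfers data on both clock edges
bus_width_bits = 256

bandwidth_gbs = effective_mhz * (bus_width_bits / 8) / 1000.0
print(bandwidth_gbs)                        # 32.0 -> matches the quoted ~32.0 GB/s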
 
Re: Hmmm, sounds slow to me?

Seiko said:
Unfortunately I think it looks like a 475MHz Pro won't be that much faster than the NV40 non-Ultra (assuming Nvidia can also push the 12-pipe NV40 a little higher than 400MHz).

It is interesting... ATI looks to be increasing the clocks for their "full, 4 quad" board relative to the 3-quad one, while you're anticipating Nvidia doing just the opposite.

While that may indeed turn out to be the case, nvidia would be more or less shooting themselves in the foot IMO. The closer it is in performance to the Ultra, the less they can charge for the Ultra.

12 pipes at 475 Mhz (X800Pro) would be within about 10% of the 6800 Ultra at 400 Mhz, in terms of pixel shading power, and about 20% slower in AA situations based on lower bandwidth of the Pro.

I think if EITHER the X800Pro or the 6800 non-Ultra is that close in performance to the 6800 Ultra (while at the same time requiring less stringent cooling and power), the 6800 Ultra is all of a sudden a very, very tough card to buy, assuming a $100-200 price difference.
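For reference, the raw arithmetic behind that ~10% figure - a rough sketch that just multiplies pipes by clock and deliberately ignores any per-pipe shader efficiency differences:

# X800Pro (12 pipes @ 475MHz) vs 6800 Ultra (16 pipes @ 400MHz), pixels per second.
pro_mpix   = 12 * 475    # 5700 Mpixels/s
ultra_mpix = 16 * 400    # 6400 Mpixels/s
print(f"Pro deficit: {1 - pro_mpix / ultra_mpix:.0%}")   # ~11%, i.e. "within about 10%"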

It also doesn't bode well for the supposedly 600Mhz XT version.

While I agree that a 600 Mhz version sounds unlikely given this info...I've had that opinion from the start. ;) I've always been expecting about 500 Mhz, which I think bodes well for the XT.

1) I think ATI will now be very evenly matched in performance compared with Nvidia.

I think this is the one area where ATI will end up having a decided advantage.

2) ATI will still retain a now very slight IQ advantage.

Agree.

3) ATI will lose in the features department.

Agree.

4) ATI will be ahead in the low power consumption department.

Agree.

5) Overall most sites will hand this round to the NV40 family due to its features and comparable performance/IQ!

I think it will be about evenly split: some seeing the performance increase as not outweighing the lack of PS 3.0... others seeing it the other way.

All in all, both companies will be so evenly matched neither will be able to declare all out victory.

Agreed!

*Sighs, why oh why won't ATI unleash a genuine monster!

Honestly, I think a 500MHz, 16-pipe part is a monster.... as long as it's paired up with at least 550MHz+ memory.
 
Bry said:
"ATI Radeon R420 Graphics ATI X800 Series
Manufacturing Process .13µ
Transistor Count(millions) 160
Core Speed(MHz) ~500
Memory Speed(MHz) ~500
Memory Interface 256bit
Rendering Pipelines 16
TMUs Per Pipeline 1
Peak Memory Bandwidth(GB/s) ~32.0
Pixel Fillrate(million pixels/sec) ~6,400
Texel Fillrate(million texels/sec) ~6,400
DX9 Pixel Shader Version 3.0
Vertex Shaders(version) 8 or 6 (3.0)
Memory Type GDDR3
FP / Internal precision 32
MAX Memory Size 512MB
AA Sample 8x
Native PCI Express Support Yes (R423)
Native AGP 8x Support Yes (R420)
DirectX Version Support 9.0c "

Hmmmm.... wouldn't 500MHz x 16 pipes = ~8 Gigapixels/sec peak pixel fill rate, and ditto for texels...? ~6,400 would indicate a 400MHz clock, right?
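Working the quoted figures backwards makes the mismatch obvious (a quick sketch, nothing official):

# The table claims 16 pipes, a ~500MHz core and ~6,400 Mpixels/s all at once.
pipes = 16
print(6400 / pipes)    # 400.0 -> the fillrate figure implies a 400MHz clock
print(500 * pipes)     # 8000  -> a 500MHz clock would give ~8 Gpixels/s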
 
engall said:
At GDC the X800XT was 500MHz/1000MHz, so I think the final version will be the same.

Well, if this is true, and the Albatron thing about the 6800U being 600/1000 is true (http://www.albatron.com.tw/english/news/news_detail.asp?news_id=77), then the 6800 Ultra would outclock the X800XT core by 100MHz.

But this is all speculation. I won't believe anything until I see them in stores, or at least get some confirmation from official sources.

:idea:
 
Bad_Boy said:
engall said:
At GDC the X800XT was 500MHz/1000MHz, so I think the final version will be the same.

Well, if this is true, and the Albatron thing about the 6800U being 600/1000 is true (http://www.albatron.com.tw/english/news/news_detail.asp?news_id=77), then the 6800 Ultra would outclock the X800XT core by 100MHz.

But this is all speculation. I won't believe anything until I see them in stores, or at least get some confirmation from official sources.

:idea:
The 6800U at 600/1000?
That's definitely impossible.
 
Re: Hmmm, sounds slow to me?

Joe DeFuria said:
12 pipes at 475 Mhz (X800Pro) would be within about 10% of the 6800 Ultra at 400 Mhz, in terms of pixel shading power, and about 20% slower in AA situations based on lower bandwidth of the Pro.

But we don't yet know how similar the shader cores are from different designs.
 
engall said:
AFAIK, the X800XT is much more like the Radeon 9800XT, with a similar cooler and PCB. At GDC the X800XT was 500MHz/1000MHz, so I think the final version will be the same.

Didn't Dave say something along the lines that neither Nvidia's nor ATI's would be the same as the ones seen at GDC? We have already seen that Nvidia's clocks are lower.
 
Re: Hmmm, sounds slow to me?

Joe DeFuria said:
Honestly, I think a 500MHz, 16-pipe part is a monster.... as long as it's paired up with at least 550MHz+ memory.

Yeah, it's funny: both companies are releasing cards that are twice as fast as their predecessors, or more. If those aren't monster cards, then what is?

(OK, neither of them is going to be perfect, but then that's impossible to achieve anyway.)
 
Re: Hmmm, sounds slow to me?

991060 said:
But we don't yet know how similar the shader cores are from different designs.

True....I am going with the assumption though, that the R420 shader pipeline is no worse than the R3xx pipeline. I think that's a reasonable assumption...but yes, it is an assumption and could turn out to be wrong.
 