Did ATI hold back on R520 to make room for R580?

geo said:
You have something to share with the group about X1700 launch date? Please do! ;) I haven't heard anything that would lead me to expect there is a chance we'd see it before 7600.

Well... depends on when the 7600 will be launched, but as far as I know the 1700 will also be at CeBIT...
The 1600's prices won't drop; the X1700 will sit on top of that (so $249 to $399). Methinks the X1700 will replace the 1800XL and the 1900s take the rest of the 1800s...
 
neliz said:
Well... depends on when the 7600 will be launched, but as far as I know the 1700 will also be at CeBIT...
The 1600's prices won't drop; the X1700 will sit on top of that (so $249 to $399). Methinks the X1700 will replace the 1800XL and the 1900s take the rest of the 1800s...

Well, what does "be at cebit" mean? Sure, I can imagine X1700 being there in a back room. . .but launched?

As best as I can tell, we are expecting the 7600 to launch in February, based on the revised "errr, did I say January? I just meant production, actual launch in February" CFO timeline.
 
geo said:
Well, what does "be at cebit" mean? Sure, I can imagine X1700 being there in a back room. . .but launched?

As best as I can tell, we are expecting the 7600 to launch in February, based on the revised "errr, did I say January? I just meant production, actual launch in February" CFO timeline.
Why didn't the 7600 launch in time for Chinese New Year? If it's such a big sales opportunity, and presumably with the X1600XT selling happily, doesn't NVidia feel the need to compete?

Jawed
 
Jawed said:
Why didn't the 7600 launch in time for Chinese New Year? If it's such a big sales opportunity, and presumably with the X1600XT selling happily, doesn't NVidia feel the need to compete?

Jawed

I dunno. Is a big sales opportunity for a $99 part the same as a big sales opportunity for a $250 part in that market? My impression is maybe not so much.

And there's a bit of a difference between not launching your low-end part in North America/Europe first and doing that with your mainstream part, so far as the amount of heat you're going to take from the online press and community. Clearly, it seems to me, the 7300 launch was rushed, which pretty much guarantees they only had so much stock to go around, as opposed to having had hundreds of thousands of units sitting in warehouses for months, as some have half-way suggested might be the case.

Timing considerations might have a bit to do with it too: if they are about to get their tushies a little reddened tomorrow, they might like the opportunity to come back with a little smackdown action of their own in the mainstream afterwards, to help bridge the trash-talking gap between here and the G71 launch. :smile:
 
I don't understand how they can just keep ignoring Double Z like that. They get hammered in those benchmarks, which was at least OK when they had a good lead in shader-heavy games. Now they are generally getting beaten in those as well.

What on earth has to happen for them to take this seriously and develop a feature-complete card?
 
boltneck said:
I don't understand how they can just keep ignoring Double Z like that. They get hammered in those benchmarks, which was at least OK when they had a good lead in shader-heavy games. Now they are generally getting beaten in those as well.

What on earth has to happen for them to take this seriously and develop a feature-complete card?
Um, the R520 is quite competitive with the 256MB GTX in Quake 4 and Doom 3 :???:
 
boltneck said:
I don't understand how they can just keep ignoring Double Z like that. They get hammered in those benchmarks, which was at least OK when they had a good lead in shader-heavy games. Now they are generally getting beaten in those as well.

What on earth has to happen for them to take this seriously and develop a feature-complete card?
Where do you get all your hyperbole? I see the "benchmarks", and I see the gameplay with MY card and my buddy's 256 GT and 256 GTX. Granted, my card is an 1800XT with 512MB, but I play at 6x AAA and 16x AF... they don't. And at higher res... though FEAR isn't on my game comp right now. What "feature" is missing? AA on shadows in EA games? Not getting 120fps, only 80? I could not see buying any other card right now BUT an 1800XT 512MB, unless I was a full-time Linux user or the GTX 512MB was the same price.
 
karlotta said:
Where do you get all your hyperbole? I see the "benchmarks", and I see the gameplay with MY card and my buddy's 256 GT and 256 GTX. Granted, my card is an 1800XT with 512MB, but I play at 6x AAA and 16x AF... they don't. And at higher res... though FEAR isn't on my game comp right now. What "feature" is missing? AA on shadows in EA games? Not getting 120fps, only 80? I could not see buying any other card right now BUT an 1800XT 512MB, unless I was a full-time Linux user or the GTX 512MB was the same price.
Everyone knows (except him, apparently) that the 1800XT is faster than a GTX 256MB in all but a few cases.
The 1900XT appears to be faster than a 512MB GTX in 3DMark05, 3DMark06, and Splinter Cell 3, and a wee bit faster in Quake 4, for cheaper too :D
Will be waiting for the FEAR results...
 
The GTX512 is considerably faster.

Are the GTX and XT similarly priced?

Look at FEAR, Riddick, and Source, just in the link provided above. It's pretty self-evident what I am referring to.
 
karlotta said:
Where do you get all your hyperbole? I see the "benchmarks", and I see the gameplay with MY card and my buddy's 256 GT and 256 GTX. Granted, my card is an 1800XT with 512MB, but I play at 6x AAA and 16x AF... they don't. And at higher res... though FEAR isn't on my game comp right now. What "feature" is missing? AA on shadows in EA games? Not getting 120fps, only 80? I could not see buying any other card right now BUT an 1800XT 512MB, unless I was a full-time Linux user or the GTX 512MB was the same price.

Last time I checked, "benchmarks" were the measuring stick for sales, not "what some guy gets at his friend's house".

Features like Double Z, and correct hardware for the soft shadows that affect more than one major game on the market.
 
boltneck said:
Last time I checked, "benchmarks" were the measuring stick for sales, not "what some guy gets at his friend's house".

Hmm. I suspect both sell a lot of cards.
 
boltneck said:
The GTX512 is considerably faster.

Are the GTX and XT similarly priced?

Look at FEAR, Riddick, and Source, just in the link provided above. It's pretty self-evident what I am referring to.

Does it matter if it is faster if you can't buy it?
 
boltneck said:
Last time I checked, "benchmarks" were the measuring stick for sales, not "what some guy gets at his friend's house".

Mouth to mouth advertisement is better than greasy powerpoint slides from the marketing department.

Features like Double Z, and correct hardware for the soft shadows that affect more than one major game on the market.

Shadow edges don't need AA, they need distortion... at least, according to ATi.

But that's the problem... the biggest feature on nV cards since the 6800 is its power to render shadows. Yes... you read that right: G70 is ridiculously fast when calculating and drawing a whole lot of black pixels!
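For a rough sense of why that is, here's a back-of-the-envelope sketch. The ROP count, clock, and the two-ops-per-clock figure below are assumptions for illustration, not confirmed specs; the point is just that a "double Z" path, which processes two Z/stencil-only operations per ROP per clock when colour writes are disabled, doubles throughput on exactly the workload a stencil shadow-volume pass generates:

```python
# Back-of-the-envelope sketch: all figures here are illustrative
# assumptions, not vendor-confirmed specs.

def z_fill_rate_gops(rops, clock_mhz, z_ops_per_rop_per_clock):
    """Theoretical Z/stencil throughput in billions of ops per second."""
    return rops * clock_mhz * 1e6 * z_ops_per_rop_per_clock / 1e9

# Assumed: 16 ROPs at 430 MHz. A normal colour+Z pass does 1 Z op per
# ROP per clock; a Z/stencil-only pass (colour writes off) does 2.
colour_pass = z_fill_rate_gops(16, 430, 1)
z_only_pass = z_fill_rate_gops(16, 430, 2)

print(f"colour+Z pass:  {colour_pass:.2f} G ops/s")   # 6.88
print(f"Z/stencil only: {z_only_pass:.2f} G ops/s")   # 13.76
```

The same arithmetic is why an engine built around stencil shadow volumes flatters hardware with a fast Z-only path: most of each frame is spent filling exactly those black pixels.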
 
neliz said:
Mouth to mouth advertisement is better than greasy powerpoint slides from the marketing department.
I would rather have word of mouth; I don't like giving greasy PR people CPR, much less kissing them. :)
 
thatdude90210 said:
I would rather have word of mouth, I don't like giving greasy PR people CPR, much less kissing them. :)
Argh... my non-native tongue gave me away! I will go spin my dreidel now...

...and wait for HardOCP's review of the X1900...
 
Unfortunately, a whole mess of black pixels seems really important to some developers these days. ;)

I guess I am blind, but I am also seeing the ATi hardware falling behind in shader performance as well. When a 700MHz core barely outperforms a core clocked what, 200MHz lower, there is a problem in my book.

Please don't tell me that the cores are "different designs" either. That 700MHz core has 48 pixel shaders, so even in theory it should be demolishing the GTX512.
 
boltneck said:
Unfortunately, a whole mess of black pixels seems really important to some developers these days. ;)

I guess I am blind, but I am also seeing the ATi hardware falling behind in shader performance as well. When a 700MHz core barely outperforms a core clocked what, 200MHz lower, there is a problem in my book.

Please don't tell me that the cores are "different designs" either. That 700MHz core has 48 pixel shaders, so even in theory it should be demolishing the GTX512.

Can someone please tell me again what it is called when one blindly accepts a product based upon brand name rather than FACTS??
 
boltneck said:
Please don't tell me that the cores are "different designs" either. That 700MHz core has 48 pixel shaders, so even in theory it should be demolishing the GTX512.

.... a theoretical value, which requires that there's no other bottleneck in the system, that there's enough work to do considering texture fetches/latency, and that the drivers are working reasonably well. R580 is only going to see 3x the pixel shading of R520 in a perfectly efficient world, and one that's feeding it enough work as well. Otherwise, it could easily do no better than R520, since it has the same number of TMUs.

Besides, are the benchmarks over at DriverHeaven screwed up? Because I see R520 beating the 7800 in most cases (though many already have 4xAA), the 7800 512 with AA, and even when said 512 has been overclocked.

But it's possible it's just me. That seems to be the case at times... (though I beg someone to point it out, if so)
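To put rough numbers on that point, a minimal sketch follows. The unit counts and clocks are assumptions for illustration (16 pixel shader units at 625 MHz for R520, 48 at 650 MHz for R580, 16 TMUs on both); the takeaway is that tripling the ALUs moves the theoretical shader peak, but a texture-fetch-bound workload only scales with the unchanged TMU figure:

```python
# Illustrative arithmetic only: unit counts and clocks are assumptions,
# not confirmed specs. R580 triples R520's pixel shader ALUs but keeps
# the same TMU count, so texture-bound work barely moves.

def peak_gops(units, clock_mhz):
    """Theoretical peak operations per second, in billions."""
    return units * clock_mhz * 1e6 / 1e9

r520_alu = peak_gops(16, 625)   # assumed 16 pixel shaders @ 625 MHz
r580_alu = peak_gops(48, 650)   # assumed 48 pixel shaders @ 650 MHz
r520_tex = peak_gops(16, 625)   # assumed 16 TMUs
r580_tex = peak_gops(16, 650)   # still 16 TMUs, only the clock moved

print(f"ALU ratio (R580/R520): {r580_alu / r520_alu:.2f}x")  # 3.12x
print(f"TMU ratio (R580/R520): {r580_tex / r520_tex:.2f}x")  # 1.04x
```

Which is the whole argument above in two lines: the shader peak triples on paper, while anything bottlenecked on texturing sees only the clock bump.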
 