The article that sparked this whole "G72 = R520 killer" line doesn't explicitly state anything of the sort, and in my opinion meshes pretty well with the previous G72-as-NV44-replacement rumour.
geo said:
Another hint towards 512mb 7800 'round Nov. 1. Mem clock will be interesting, as will gpu clock if they decide to bump a bit, or mucho. I would think this would be their shot at an Ultra-like board. I can't imagine they'd do a 512mb release and then another on top of it.

Well, it is unknown how ATi plans to attack the 512MB category, is it not? At first I had the impression that ATi wanted to standardize this memory capacity, but now I am not so sure. I conclude this from the fact that the XLs that were released and benchmarked were of the 256MB variety and not 512MB as hinted at before release. That is not to say that there won't be a 512MB XL, but surely ATi would have submitted 512MB XLs for benchmarking if it was planned as the standard SKU.
geo said:
Assuming the differential roughly holds at Street, I think all you need to know about whether ATI intend the XT 512mb to be the prevalent flava of that card is that there is only a $50 price difference vs 256mb. How many people at this level are going to save that $50 and give up 256mb of high-speed memory? Damn few approaching zero, I think.

That, I would agree with, but it doesn't solve the XL conundrum. A single SKU with 512MB is not going to bring about the "512MB Revolution"; that's my point. Should we expect a similar $50 differential on a 512MB XL?
According to Samsung's disclosed order manifest, apart from ATi, nVidia has also placed an order for the 1.26ns GDDR3 memory chips, so we expect to see nVidia release a product featuring 1.26ns GDDR3 in the near future, which is believed to be the weapon against the X1800XT. The Samsung K4J52324QC-BJ12, currently used on the ATi Radeon X1800XT, differs from the more commonly used GDDR3 memory we've seen because the packaging has been changed from 136-ball FBGA to lead-free 144-ball FBGA; the specs are 2M x 32Bit x 8 Bank, voltage 2.0V. "BJ12" means the speed grade is 1.26ns, for which the highest Samsung official clockspeed is 1.6GHz, much faster than the 1.6ns memory on the Geforce 7800GTX. If this memory is to be used, the PCB will have to be modified.
In addition, there appears to be a new G70 definition in the recently released Forceware 81.84 beta driver. This could well be the 1.26ns GDDR3 Geforce 7 part. If the 1.26ns GDDR3 is used along with an increase in core clockspeed, it would not be difficult to surpass the ATi Radeon X1800XT in performance.
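The figures in that quoted article can be sanity-checked with some back-of-the-envelope DDR arithmetic. This is just an illustrative sketch (the function names are mine); the 2M x 32Bit x 8 Bank organisation and the 1.26ns/1.6ns speed grades come from the quoted specs:

```python
def effective_data_rate_mhz(cycle_time_ns):
    """Effective GDDR3 data rate in MHz: DDR transfers twice per clock cycle."""
    clock_mhz = 1000.0 / cycle_time_ns  # cycle time (ns) -> clock (MHz)
    return 2.0 * clock_mhz

def chip_capacity_mbit(words_m, width_bits, banks):
    """Per-chip capacity in Mbit from an organisation like '2M x 32Bit x 8 Bank'."""
    return words_m * width_bits * banks

# 1.26ns BJ12 parts: ~1587 MHz effective, matching Samsung's 1.6GHz rating
print(round(effective_data_rate_mhz(1.26)))  # -> 1587

# 1.6ns parts as used on the Geforce 7800GTX: 1250 MHz effective ceiling
print(round(effective_data_rate_mhz(1.6)))   # -> 1250

# 2M x 32Bit x 8 Bank = 512 Mbit (64 MB) per chip; eight such chips
# on a 256-bit bus give the 512MB boards discussed above
print(chip_capacity_mbit(2, 32, 8))          # -> 512
```

So the "1.6GHz" Samsung rating is simply the DDR rate implied by the 1.26ns cycle time, and the 512MB capacity falls straight out of eight 512Mbit chips.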
geo said:
I begin to suspect that from a performance aspect, that X1800XT is going to look merely "okay" in retrospect a little down the line

Begin? Heh. And yes, R580 does seem quite kickass, although remember that there are a number of things it just doesn't improve. Things that NVIDIA will improve in its G70 refresh. So a clear superiority is unlikely. What matters now, financially-speaking, is mindshare. And margins. Anything else is just "details" nowadays. Let the fans and OEMs speak.
Uttar said:
, although remember that there are a number of things it just doesn't improve. Things that NVIDIA will improve in its G70 refresh. So a clear superiority is unlikely. What matters now, financially-speaking, is mindshare. And margins. Anything else is just "details" nowadays. Let the fans and OEMs speak.
geo said:
Ah ha! Interesting times ahead.

I did start a thread on this subject:
It seems all three, RV515, RV530 and R520 are going to look like "version 1.0", which you don't buy because v1.1, which'll be along shortly, will make them look a bit silly.

I begin to suspect that from a performance aspect, that X1800XT is going to look merely "okay" in retrospect a little down the line, unless they pull a significant increase out of the drivers in optimizing the memory bus (which certainly may happen).
And RV540, RV560. Which is the long way around to "I really want R580", but that really isn't news at this point, is it?
Jawed said:
I'd love to be optimistic about there being 20%+ latent driver performance increases in games across the board with these cards, but I just don't see it happening. [OT ramble deleted]
trinibwoy said:
I had the impression that the "bugged" libraries were something that ATi licensed from a third party for their own design. Nowhere did I see it being referred to as a TSMC problem.
nagus said:
http://www.digitimes.com/news/a20051005A7033.html
Nvidia schedules G72 GPU release in early 2006
Charles Chou, Taipei; Jessie Shen, DigiTimes.com [Wednesday 5 October 2005]
Nvidia has scheduled its G72 graphics processing unit (GPU), which will be manufactured using 90nm process technology, to be introduced in early 2006, according to sources at Taiwan graphics card makers. The G72, a 90nm version of the G70 GPU, will compete with the 90nm R-series from ATI Technologies, which will announce the launch of its long-awaited R520, RV515 and RV530 GPUs on October 5, U.S. time, indicated the sources.
The 90nm G72 will have a much smaller size than the 0.11-micron based G70 allowing for multiple GPUs to be utilized on one graphics card through SLI technology, providing an effective and attractive alternative to those who do not want to fork out money for two graphics cards, the sources claimed.
Graphics card makers are ready to ship R520- and RV515-based products, following ATI’s official release, according to sources at the makers. RV530-based graphics cards will be volume shipped at the end of this month, indicated the sources.
geo said:
Damn, I must have been in full R520 orgy mode and this went right past me that it was Digitimes (and, thus, "respect must be paid").
Oh woe. "G72". Ick.
What about that SLI thingie? Back around G70 release, you may recall, there was much hinting (and misuse of terminology before someone slams me -- not that it will stop them) at "dual-core" from NV, and "almost like 512-bit bus" kind of comments with a wink from a few in-the-know. I think even Tom's hinted in that direction (Oh, Lars... where are you now?).
Could we be on the verge of seeing an official SLI-on-one-board reference model from NV and sold as a regular model across the full line of AIB instead of just the occasional AIB-on-a-lark thing?
geo said:
I hate them calling it "G72" anyway.

That article doesn't say it's a high end part, so they may indeed be referring to G72 as we know it.
Fodder said:
That article doesn't say it's a high end part, so they may indeed be referring to G72 as we know it.
geo said:
You are being even more Jesuitical than I can be at times. It says twice it's the 90nm version of G70.
Nvidia has scheduled its G72 graphics processing unit (GPU), which will be manufactured using 90nm process technology, to be introduced in early 2006, according to sources at Taiwan graphics card makers. The G72, a 90nm version of the G70 GPU, will compete with the 90nm R-series from ATI Technologies, which will announce the launch of its long-awaited R520, RV515 and RV530 GPUs on October 5, U.S. time, indicated the sources.

I know I'm being Mr. Skeptical here, but I don't want to get any hopes up. :| Also, Digitimes has been wrong before on multiple occasions. Remember they were also propagating the 24 pp R520 theory.