NVIDIA GT200 Rumours & Speculation Thread

Though clearly the underpinnings of CUDA are very, very important to NVIDIA and I basically agree with your thoughts, it is unlikely that he was referring to just this, since the process of designing and bringing these things to market is so complex. But obviously they have some interesting stuff going on to exploit their processors in the future.

One thing that I am curious about is CUDA and their internal design and architecture tools. These tools have always been a key part of NVIDIA's success, and it seems like it is only a matter of time before they harness heterogeneous computing to make some big efficiency gains here.

Well said. And speaking of heterogeneous computing, I really like that NVIDIA is clearly spelling out how unbalanced most PC systems today are between CPU and GPU. Spending less money on the CPU and more money on the GPU gives much, much better bang for the buck in most scenarios related to visual computing (i.e. gaming, high-def video, etc.). It's amazing that only now is one of the big players in the computer/graphics industry speaking up about it: http://www.nvidia.com/object/balancedpc.html

[H]OCP has been on to this trend for a long time, and I think the success story of the Gateway laptop (the one that sold out at Best Buy within two weeks) with slower CPU and faster GPU has provided some vindication for them. I hope that Dell and HP are taking notice too.
 
Maybe because the cost of GDDR3 is still substantially lower? If NV can get as good or better performance with GDDR3 than their competition gets with GDDR4/5, then I can understand why they would want to use it. With lower-cost memory modules and still class-leading performance, they can increase their pricing margins.
 
Or maybe GDDR3 could be used in the mid-range 9900 GT product while the high-end 9900 GTX and 9900 GX2 get GDDR4.

I could understand the use of GDDR3 in all the upcoming high-end products if GDDR4 were the bleeding edge of memory, but it's not; GDDR5 is what's new. That's why I was surprised not to see GDDR4 in use.
 
I don't think GDDR4 has any advantages over GDDR3 at this point. They're about equally fast, and GDDR3 is probably cheaper and more widely available. I haven't seen any evidence of lower power consumption from GDDR4 either.
 
http://www.dailytech.com/NVIDIA+AMD+Set+to+Square+Off+Once+Again+This+Summer/article11451.htm

Original NVIDIA roadmaps put the GT200 launch in late Fall 2008. However, internal memos sent out to board partners in early March detail that the GT200 processor has already been taped out. The same document alludes to the fact that the GT200 chip is very stable, and has been ready to ship for reference designs for several weeks already.

Channel partners indicate that the 9800 GTX and GX2 will begin phasing out next month in preparation for the GT200 launch. Both companies have made promises to show demonstrations of their next-gen cards at the Computex Taipei trade show on June 3, 2008.
 
According to Jen-Hsun at the financial analyst day, the tools, methods, and simulations in use today by NVIDIA are so good that when a chip tapes out, he just "knows" that it is going to work properly.
 
Good grief, enough with the cheerleading already. What do you expect "JHH" to say? You think he's going to say, "Our next chips are gonna suck"?

-FUDie
 
Cheerleading? LOL, don't be a hater, I'm just reporting back on what was actually said during the analyst day (which you obviously didn't bother to listen to). The times of waiting months and months to fix something that is broken with a chip after tape out are clearly going to be a thing of the past as simulation tools get better and better. Duh.
 
Until they fail, obviously ;)
 
Since it is supposed to be called D10U internally, I would think it will be a GeForce 10.
GeForce 10 D880 and GeForce 10 D860 maybe... ;)

GeForce 9900 seems more likely to be a pimped-up G92 at 55nm, perhaps with GDDR5, fighting against the RV770XT and released in the third quarter.
I have doubts that the RV770XT, which is supposed to come with GDDR5, will be available in big quantities in Q2.
 
Cheerleading? LOL, don't be a hater, I'm just reporting back on what was actually said during the analyst day (which you obviously didn't bother to listen to). The times of waiting months and months to fix something that is broken with a chip after tape out are clearly going to be a thing of the past as simulation tools get better and better. Duh.

That would be very nice. But the fact is we're actually waiting longer this time for new architectures, so something else (development) is more than taking up the slack.
 
True, to the end user the only things that matter are the availability, performance, and cost of a new graphics card, not whether or not there was a smooth tape out. I guess my point was that having a stable chip after tape out is becoming less and less a stroke of luck as the simulation tools get better.
 
I guess my point was that having a stable chip after tape out is becoming less and less a stroke of luck as the simulation tools get better.

I'm going to call small-sample statistics on this. How many successes does it take to make failure unlikely or impossible?
 
I would have to say that there is no way to predict with 100% certainty that a chip won't fail, but with better tools there should be a much better chance of avoiding failure.

On a side note, I have to admit that it will be interesting to see how GT200 compares to the 9800GX2. I imagine that when all is said and done, the GX2 will struggle to keep up in certain scenarios. I'm still waiting for the day when we see a monolithic GPU with 1024MB of memory.
 