Nvidia GT300 core: Speculation

Domell: No, no... that was just an instant idea of how to make it even closer to NV30 :???:

Anyway, regional NV PR said it (that there is no GT300) many weeks ago. But the majority of his information is bullshit, so I ignored it :smile:
 
I think it's called G300 now. And no, it's not pushed back into 2010 at the moment, no matter what anyone says. Whether it will be or not is unknown right now.
 
Do you work at inq/fud/bison?

It's not true (which might or might not be right).

If the first tape-out of G3P0 was "acceptable", it should be underperforming for its size; if not, it would require a respin, for the simple reason that they have yet to release any 40nm parts and it would take a lot of sacrificed virgins to get it right straight off the bat, while G80/GT200 derivatives are a b***h to produce at 40nm.
 
I've noted that you're not impressed (like everyone else, except maybe for the rather good power consumption) by the GT21x@40nm mobile parts in the other thread, just 2 minutes after that post above :p
 
We are waiting for a 40nm performance-mainstream GPU from NVIDIA (with GT200-level performance) at a $150 initial price, with 1024MB of GDDR5 or an even higher configuration.
 
GT200 has better power consumption than G80, so if they really wanted to put a good power consumption figure out there, they'd have to drop a lot (~30W) on the total card. But then, maybe that's why they're taking their time and are on a 3rd spin, for adequate power consumption/yield/performance.
 
Must have missed the leak - where are they? ;)
 
Let's make a list, then, of how many times all the usual suspects among the rumor-mongering websites have been correct and how many times wrong. We could turn this conversation around endlessly like a perpetuum mobile and still not reach a reasonable conclusion. Oh, just wait a sec: wasn't Charlie the guy who had it on good authority from a Sony employee that the next-generation PlayStation will contain LRB? Keep a note on that one and we can laugh about it in a couple of years.

I'll tell you exactly what the consensus of the current reasoning is: since Microsoft isn't aware of a final tape-out of NV's X11 chip, it cannot have had a tape-out. Now feel free to throw around a couple of PMs and e-mails and let me know if you can find anything better at the moment.

I'm not saying or claiming anything; I'm merely saying that I don't tend to believe either side just yet.

***edit: and yes B3D is such a nice place because you don't see that kind of BS on its front page.
The point I was trying to make is that Theo's credibility in my book is not even zero, it's below that. I give Fuad some credit: even though most of his stories are regurgitated from around here or some other forum, he has some sources among the AIB partners and he did break a few things before anybody else, same with Charlie. Now Theo... let's email CJ. ;)
 
I find it amusing that Charlie ends his pieces with the moniker "SA", for those with long memories here at B3D.

Blasphemy!

Make a start and reread my former post; it might help you ask the right questions. And no, this isn't about any juvenile pissing contest over which wannabe online "journalist" is better or worse; it's more about having any definite information on the matter. You won't find anything other than a mighty pile of assumptions and guesswork from all sides; that's fine by me, but in the end it's still reading a coffee cup vs. looking into a crystal ball.
 
You may think whatever you like, but I'd recommend that you at least find out when G300 first taped out (but don't search the Inq archive, it's pointless).
Here's another hint for you: it might not be NVIDIA's fault if G300 ends up pushed back into 2010.
NVIDIA's fault might be that they went with G300 first instead of some kind of G302. We'll see.
 
If I take that to mean G302 would be a G300 shrink, then I reaaally have no idea what you're trying to say. Don't tell us that you think G300 is a 55nm part now...
 
You are just ignoring the whole basis of the link you posted and taking the discussion into another area (who is more reliable than whom); my point here is that Theo is utterly unreliable when he's asking CJ to comment on his own "news story".
 
Haha, wow. You've recycled some other users' posts in there to get to the same point. If G300 was ready in the supposed time frame, why does nVidia get affected by TSMC's restarted 40nm process while AMD stays on course?

G300 did not fly out of the gate, or we would have seen some leaks of it at Computex.
 