Official: ATI in XBox Next

Status
Not open for further replies.
So the NV30 being slower (pissing gates away on features they can't use or won't enable), having dramatically inferior AA quality and performance, being more expensive to manufacture, drawing significantly more power, and putting out enough heat to double as a Foreman Grill qualifies it as a superior design? Not to mention that the NV30 is, in a nutshell, the same layout as the NV20. Nvidia is in a bad light for lots of reasons (they seem to go out of their way to look like assholes), and people are certainly as justified in disliking Nvidia for those reasons as you are in favoring them. As far as technology is concerned, I'd say that Nvidia and ATI are on more or less equal footing; how cheaply they implement the latest technologies into their chips is what has separated them and will continue to.
 
Steve Dave Part Deux said:
So the NV30 being slower (pissing gates away on features they can't use or won't enable), having dramatically inferior AA quality and performance, being more expensive to manufacture, drawing significantly more power, and putting out enough heat to double as a Foreman Grill qualifies it as a superior design? Not to mention that the NV30 is, in a nutshell, the same layout as the NV20.

< throws up hands and runs away in distress > Can no one here distinguish between theory and praxis? Can someone tell me how many equivalent "vertex shader pipes" (I'll even use ATI language) the NV30 has?

PS. You do realize that many processors, such as Stanford's Image Processor and PixelFusion, suffered from the "qualities" you listed above. Needless to say, they're still more 'advanced' designs.
 
Vince said:
Fox5 said:
So why can't they just focus almost all resources on a pc chip, and then just make the xbox 2 a pc in a box, and let microsoft worry about the rest?

Sony, IBM, Toshiba. $4 billion.


I understand Cell could be quite powerful, but why bother making two separate full-featured chipsets when you can focus all your efforts on one, make it the best it can be, and then just shove it into the Xbox 2? It may not come out as cheap or small as Microsoft would like, but it would certainly meet the power needed (or come closer to meeting it than designing two separate cores would). Microsoft could just make the Xbox 2 a PC, load it up with DirectX and Windows, and worry about improving performance later.

BTW, it's always possible that Cell may not work as expected and could be a dismal failure. What if they can't get it finished in time for the PS3, and the PS3 runs on advanced PS2 hardware, or is delayed for a few years until Cell is ready, with the PSP as a temporary stand-in?
 
Here, I'll even go out on a limb and explain what I'm thinking. Microsoft views Sony as a disruptive force in the forthcoming connected home, thus giving birth to XBox. We know from comments by people like Andy Grove and Gabe Newell that the above is true: Microsoft fears the living room being connected by electronics companies utilizing competing OSes, such as the Panasonic/Sony Linux derivative or whatever. Microsoft thinks XBox will enter them into this marketplace using the same trojan horse as Sony: the game console. So, on some level, XBox is pretty important, and it's obvious that Microsoft wants to kill off PlayStation. To do this next generation you need to beat two things:


PlayStation's/Sony's hype.

Cell Architecture.

You won't outright beat Sony's hype; it's a joke to even ponder. So you need to beat Cell, which is basically a silicon embodiment of the above pervasive-computing paradigm. I'm a big supporter of Cell, this is true, but then learn from me (my perception, understanding, thoughts) and apply it to this situation with your ideologies as a counter-balance. And as a "Cell believer" I was 'concerned' about only one obstacle this entire time: nVidia. Which I'll get into now:

Regardless of how Cell turns out, one thing can be inferred about the architecture, and more specifically the IC used in PS3. STI is a venerable lithography and process powerhouse, perhaps even surpassing Intel as second to none. To beat Cell, you need to capitalize on lithography and push it to its very limits, for I don't believe simple architectural differences or routine optimizations will cut it.

We're nearing a point, as outlined in Suzuoki's Cell patent for SCE (and as can be seen in the NV3x architecture), where graphics processing is becoming computationally limited and its performance is bounded by logic, thus pushing the advancement burden back onto Moore's Law rather than bandwidth or other such barriers. The future is an advancement following NV3x's direction, or more like Cell: almost full computational flexibility through the pipeline, except where the task is excessively iterative and dedicated logic is the way to go.

Causing a 'borderline performance revolution' with this type of architecture relies on bleeding-edge lithography, and it requires the design team to push the process to the edge and beyond, into the realm of poor yields, with the understanding that future lithography will bring the yields and costs under control. It requires massive investment like STI is making ($8 billion in total), and it requires technologies like SOI/SS, low-k dielectrics, 65/45nm and smaller lithography, and other such advancements that are pushed hard.

When I saw IBM and nVidia team up and basically gain access to STI's advancements, combined with some comments I heard a while back from a little bird, I thought it was over. nVidia has the balls to push and stick with it. When 3dfx was in the corner touching itself with 0.25um, nVidia was on copper at 180nm and utterly destroying 3dfx in every way: performance, features, per-IC cost, yields... it goes on and on.

And I see the same now. While nVidia is testing with 130nm low-k dielectrics, ATI is off pissing in the wind on a 150nm process. Sure, nVidia had problems this time, but that's the exception. What's ATI going to do when nVidia is utilizing a derivative of STI/AMD's 10S or 11S process (11S is Cell's 65nm process, slated for 2H 2004/1H 2005 production) at IBM? Have you fanpeople tell us that SOI isn't necessary? That the thermal benefits or the 30% performance increase seen on POWER4 aren't that big a deal? That TSMC's sub-par roadmap and execution below 100nm are adequate? Don't even get me started on UMC; are they serious about going it alone for 90nm and below while everyone else is concentrating their R&D? HA! Give me a break.

Today is the first day I can say that Sony will be alright, that if I was Okamoto or Ken, I'd be happy as a pig in shit.


that was one helluva killer post :oops:
 
Fox5 said:
Who could enter the high end 3d graphics market?
Intel could bring back their chipsets....
3dlabs is doing something.
Matrox almost looked like they wanted to make a comeback.
Imgtech is content not to do much.
Bitboys seems to have quieted down, but I figure they'll announce several new products in the future and never release any.
Some former 3dfx employees might be hanging around a bar somewhere...
Come this winter, S3's gonna do something ;)
 
I'd like to see "The most theoretically advanced 3D graphics ever seen in a console" or "A design so powerful on paper, we couldn't fit it anywhere else" as bullet points at the unveiling of the next Nvidia chip. If Nvidia were to give a presentation like that, I would buy their top-end card on the first day of its availability. We're talking about who can deliver. A good design is something you can put on a shelf that does what the writing on the box says it does. I can see where you're coming from, though.
 
ATI not to ditch Nintendo

ATI already produces chips for Nintendo Co. Ltd.'s (7974.OS) GameCube under a royalty agreement. Bergman said he did not expect the Microsoft deal to affect its Nintendo relationship and stressed it's not unusual for chip companies to supply products to competing firms.

"We've had a good strong long history with Nintendo and we don't expect any impact...we're comfortable with our relationship with Nintendo and I think we're great partners," he said.


The ATI executive declined to comment on whether the Microsoft agreement included conditions that could be triggered in the event of a takeover bid for ATI.

Securities filings showed Microsoft's deal with Nvidia gave Microsoft the right of first and last refusal in the event of a takeover bid for the company.
 
http://www.extremetech.com/article2/0,3973,1220430,00.asp

Don't expect the graphics capabilities of future Nintendo and Microsoft products to be exactly the same, however, the ATI spokesman said. "Yes, we have different design teams working on them, with different requirements and different timetables," the spokesman said.
That pretty much confirms ATI is working on the next Nintendo console for those who doubted it.

The part about different timetables is interesting, although it could refer to ATI's internal production schedule (different teams, different cores, different levels of completion).
 
Interesting. Now we just need to find out what Nvidia is doing. Maybe nothing other than PC gfx cards. I'm still thinking that Sony will do the rasterizing and IQ part themselves. They know what they did wrong last time; maybe the PSP will even show that they are on par with Nvidia on the IQ front now.

Fredi
 
McFly said:
Interesting. Now we just need to find out what Nvidia is doing. Maybe nothing other than PC gfx cards.

well, they just bought MediaQ, so the next step for them is the low-power integrated market (cell phones, PDAs, etc)

I'm still thinking that Sony will do the rasterizing and IQ part themselves. They know what they did wrong last time; maybe the PSP will even show that they are on par with Nvidia on the IQ front now.

maybe they're on par with Nvidia, but are they on par with ATI? :D :D :D :D :D :D
 
Vince, the distinction you make between the "advancedness" of the NV30's design and its somewhat sub-par performance really is a bit beyond my grasp. We're not talking about processor families, where you have to be very careful in introducing new features because of their potential influence on all further designs from that family.

To me, the main difference between the two designs was that ATI chose a tabula-rasa approach to DX9 and went with a pure 24-bit FP architecture, while Nvidia went with a design primarily geared towards performing DX8-class content fast (12-bit fixed-point units everywhere), adding additional datapaths for "FP pixels" on the pixel shader side (which probably represents the lion's share of each IC's die size as well as of the engineering effort).

Nvidia has traditionally used its new-generation chips as feature introducers (NV5: 32-bit colour; NV10: T&L, programmable register combiners...) and relied on later generations to make those features really usable for real-time content (as in bringing them up to acceptable speed). That was fine as long as competitors lagged them by a full year and released those features after NV's refinement parts. This is their real problem: if the NV30 had been released first, nearly all development would have concentrated on its architecture, and ATI would have been in a situation similar to Nvidia's now (releasing a card later that doesn't bring anything fundamentally better to customers), since no one would have cared about the NV30's bad FP performance; it isn't really required for the software you can buy nowadays anyway...

Just to come full circle, I'd say ATI and Nvidia are pretty much on par with the "IP" they have. Nvidia's bet was just way off this round, and now they're paying...
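The fixed-point-versus-FP tradeoff described above can be sketched numerically. This is a rough illustration only, not the actual NV30 or R300 datapaths; the choice of 11 fractional bits for the 12-bit fixed-point format is an assumption made for the example:

```python
def quantize(x: float, frac_bits: int) -> float:
    """Round x to the nearest value representable with `frac_bits` fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

# A 12-bit fixed-point register (assume ~11 fractional bits) has a uniform
# step of 2**-11, so mid-range colour values survive quantization well...
half = quantize(0.5, 11)     # exactly representable: 0.5
third = quantize(1 / 3, 11)  # off by less than half a step

# ...but small intermediate values (dim lighting terms, overbright math
# scaled down) fall below the grid and flush to zero, which is where a
# floating-point format, whose precision scales with magnitude, wins.
dim = quantize(1e-4, 11)     # rounds to 0.0

print(half, third, dim)
```

The point isn't that fixed point is useless (it was plenty for DX8-class colour math); it's that long FP shader chains accumulate exactly this kind of error.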
 
Vince said:
What you just did is revisionist history.

What you're doing is looking at the x-box chip in a vacuum.

X-box cemented nVidia as the DX8 defacto standard, infused nvidia with lots of cash, and fueled R&D heavily.

Like I said. X-Box is one significant factor out of several significant factors that lead to nVidia's dominance in the DX8 era. But feel free to misrepresent my opinion as "x-box is the only reason why nVidia was successful". Misrepresentation is what people do when they don't have a legitimate response.

With all due respect Joe, I proceeded to explain, in pretty basic economic terms and with historical precedent, why I took this stance, and proposed two pretty simplistic scenarios that demonstrate it (keeping it simple and to the point the entire way).

With all due respect Vince, even nVidia was vying for the contract. It's obvious that nVidia saw the value in the deal. This doesn't mean that the winner will ultimately be successful. It means that it is a significant opportunity, and it offers more than your "vacuum look" of just selling a "handful" of console chips while draining away resources from the "real money makers."

You can only do so much, labor is a finite resource. I already addressed this and didn't imagine that it would be resurrected.

Obviously it's a finite resource.

And the number of chips you can sell in the PC market is also finite. The number of chips you can sell in the PC + Console market is also finite, but more than the PC sector alone.

Where's the R400 Joe?

Who needs the R400 at this juncture, Vince?

This is the benefit of being the technology leader. ATI is firmly in that position at the high end, and likely will remain there through the R360/NV38 launch.

Well, I'll talk to you in 2005 bud. We'll see what's up.... as I already said, time's on my side.

How is time on your side again?

PS. Mfa is right, ditch your stock before PlayStation3.

I'm just happy I ditched my nVidia stock months ago.
 
Vince said:
Um, lets see what you said:
whql said:
What ATI achieved with 0.15u is probably a far better feat - first to 100M transistors, and with speeds that exceed the limits of what NVIDIA told people 0.15u can do!

Again, did I say "in the world"? No, you did. Considering the discussion concerns ATI, NV, and 3D graphics, I'd have thought you might have been able to grasp this, but evidently not.

The sky is blue, Elephants are big. Any other off-topic, useless things you want to throw in here?
You're the one who threw the analogy in, Vince; it's not my fault if the analogy doesn't fit.

It also is catering to the legacy ideals of "X pipes * X TCUs" which is useless - it serves us no good.
Yes it does, because as we move into a more shader-enabled age, multitexturing will become less useful and more wasteful. Single-texture pipelines (with one or many shader units) will increasingly become the norm. Again, if NV40 does move to an 8x1 design, it will also prove that the talk of "how we discuss legacy units" was just PR rubbish again - if it was unimportant, then they would stick with a 4x2 (or Xx2) design.
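The 8x1-versus-4x2 argument above comes down to simple fillrate arithmetic. A minimal sketch (the 400 MHz clock is a made-up number for illustration, not any real part's spec):

```python
def pixel_rate(pipes: int, clock_mhz: float) -> float:
    """Peak Mpixels/s: each pipe can write one pixel per clock."""
    return pipes * clock_mhz

def texel_rate(pipes: int, tmus_per_pipe: int, clock_mhz: float) -> float:
    """Peak Mtexels/s: each TMU can fetch one texture sample per clock."""
    return pipes * tmus_per_pipe * clock_mhz

CLOCK = 400.0  # MHz, hypothetical

# Both layouts carry 8 TMUs total, so multitextured throughput is identical...
assert texel_rate(8, 1, CLOCK) == texel_rate(4, 2, CLOCK) == 3200.0

# ...but on single-textured (or shader-heavy) pixels the 4x2 design's second
# TMU per pipe sits idle, and the 8x1 layout writes twice the pixels per clock.
assert pixel_rate(8, CLOCK) == 2 * pixel_rate(4, CLOCK) == 3200.0
```

This is why the post argues that as single-texture, shader-driven rendering becomes the norm, the Xx2 organization increasingly wastes its second TMU.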

Besides, the NV3x will continue on, of course it'll be all FP (isn't the NV35?)
According to the review here, it seems that NV35 is pretty much the same as NV30.

actual architecture may very well eventually become a hybrid architecture, which was begun in the NV3x vertex shader architecture.
What are you talking about? Hybrid architecture? A hybrid of what and what? The VS is just a VS unit - there's nothing "hybrid" about it.

It has less meaning because there's no reason to have such a static architecture consisting of TCU's in this fixed manner.
And yet the NV30 does have TCUs in a fixed manner - you've just successfully argued for the obsolescence of the NV30. Bravo!

Eventually architectures will be like a pseudo-NV30 front-end that allows for resource sharing by tasks between fragment and vertex.

There is no resource sharing between the VS and PS in NV30; what are you talking about? Even NVIDIA doesn't describe this as taking place in their pipeline diagrams:

http://suif.stanford.edu/~courses/cs343/l10.pdf

But, this doesn't mean that ATI can just design an open ended processor that can be used at XXX Fab or YYY Fab at a moments notice...
Do you understand anything about IP licensing? Depending on what MS asks for, all ATI might do is hand over a bunch of IP to MS (or their fab partner) and let them do the layout and design. Go and talk to PowerVR - they've got lots of licensees for a single product; do you think they are all made at the same fab?

The processor will be designed for a specific fab/line with specific libraries utilized... what are you thinking? I'm confused.

Apparently so. Don't think processors, think IP. It's going to be up to MS to decide the nuts and bolts of how that IP is put into a physical process that may or may not involve ATI - this is where there are big questions still to be answered, and we won't get those answers until more of the detail on XBox2 drops into place.

ATI is intimately bound to the eventual fab during the design, unless Microsoft is doing the synthesis and back-end stuff themselves, but that's highly doubtful.

No, MS might choose someone else to do it, as Nintendo does with Flipper. NEC or ST are examples of companies that might do this - and depending on the deal being offered, Intel might be persuaded.

Can someone tell me how many equivalent "vertex shader pipes" (I'll even use ATI language) the NV30 has?
The question is, what does it matter? So far ATI's VS has proven itself to be feature-rich and more powerful (with considerably less clockspeed), so who cares whether it's an array or not. For that matter, there was an NVIDIA document that stated they used 3 parallel VS units, so who can say what they really have? Given how far their pixel pipes have proven to be out of whack with what they told us they were, I wouldn't want to make any guesses as to the truth of their VS organisation.
 
Great post about the importance of lithography in the near future Vince (I missed it earlier, fucking power outage :?)

I'm also an advocate of nVBox2, but it seems that's now a wash. IMO the best thing MS can do for itself now is to form a strong partnership with Intel for the board design, the chip layout, and most importantly the fabbing. Good luck getting that to happen though; I'm sure in 2005 Intel will be enjoying the $180 margins (for their own CPUs) on their 65nm/90nm lines o_O

Best-case scenario would have been NVIDIA/IBM/Intel, but hopefully ATI/Intel will be a close second.
 
zurich, IMO this isn't the final deal. Assuming that the ATI offer is an R5xx derivative, I really don't think MS is interested. MS wants something that matches or exceeds the PS3 and its Cell computing, which I doubt can be achieved with a 90nm part. Perhaps ATI is going to make a 65nm R6xx derivative or whatever, but in that case one would have to seriously consider the enormous development burden on ATI and whether they can handle two massive console GPU projects and still make something in the PC graphics market.
 
MS wants something that matches or exceeds the PS3 and its Cell computing
The part about different timetables is interesting, although it could refer to ATI's internal production schedule (different teams, different cores, different levels of completion).

My guess is MS was told 2005 = we match Cell, 2006 = we b#tch-slap Cell... and they went for the latter, while humble Nintendo went for the former... If my hunch is correct... next gen. ;)
 
There's a possibility that Nvidia can still get the contract. I remember that MS gave 3dfx a chance to provide their own GPU (one that exceeded the competition) within a specific timetable if they wanted the contract.

Edit: I think I confused this with 3DFX and PVR....
 
No, I think you're right about MS offering 3dfx a chance! See, originally MS was going with GigaPixel's GP4 architecture for Xbox. After MS turned to Nvidia, I think they offered 3dfx a shot too, or perhaps they had an ongoing relationship dating back to 1999 when MS was considering graphics providers. I don't know if they also offered PowerVR a chance, but that would have made sense, because PowerVR, MS and Sega were all together on the Dreamcast.
 