Predict: The Next Generation Console Tech

Of course AMD made money on that sale, there is no doubt about that. The question is why they would settle for a lot less when they obviously have no trouble selling their chips for more than any console manufacturer would want to pay for them.

You believe they will make a lot more on those chips sold at retail. That isn't exactly true at all.
 
You believe they will make a lot more on those chips sold at retail. That isn't exactly true at all.
Source?

I go by the price of AMD chips at retail vs. the $37 Cell of 2009. I have no doubt the 360 CPU cost the same or less before it was integrated with the GPU.
 
Source?

I go by the price of AMD chips at retail vs. the $37 Cell of 2009. I have no doubt the 360 CPU cost the same or less before it was integrated with the GPU.

Why are you comparing the Cell of 2009 with AMD chips of 2011?

Why don't you instead compare an AMD chip made in 2005/6, like Cell was, to what that same chip cost AMD in 2009?

Remember, Cell started out as a 90nm chip and is now on 45nm.

http://arstechnica.com/old/content/2006/02/6216.ars

According to them, in 2006 the Xenon CPU was $106 and the GPU was $141.

So MS, at least, was willing to devote roughly $250 ($106 + $141 = $247) to the CPU and GPU alone.

The price of an AMD chip you see at retail, once again, carries several layers of markup over what AMD/IBM/whoever charge MS and Sony for their chips.

For a retail AMD chip you have the CPU cost, the packaging cost, the heatsink/fan cost and the shipping cost. Then you have the retailer's portion.

These all add up on top of what Sony pays for a Cell chip or MS pays for a Xenon.


You also forget that CPU prices in the market change drastically and quickly.


So let's think for a second. Let's say MS goes with a 4-module/8-thread ("4/8") Bulldozer. Over the 6-10 years AMD will make this chip for MS, it will move to new process nodes, hopefully many times (let's hope we can keep going smaller, or find new ways of making chips faster, cheaper and bigger), and each time it costs AMD less to make the chip.

With any deal, MS would pay the cost to manufacture plus an agreed-upon fee per chip. This could be small, $10 or so per chip, maybe more. It could also start higher and decrease over time, depending on how the companies want to play it.

Both companies make out here: with each process shrink the chip costs MS less to buy, but AMD is making the same amount of money per chip, and with each shrink the same number of chips takes up less fab time.
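Here's a rough back-of-the-envelope sketch of that cost-plus structure in Python. The $10-ish fee is the figure from the post above; the starting manufacturing cost and the yearly cost decline are made-up illustrative numbers, not anything from an actual contract:

```python
# Toy model of a cost-plus console chip contract. All figures illustrative.

FEE_PER_CHIP = 10.0            # fixed fee AMD collects on every chip (from the post)
manufacturing_cost = 70.0      # assumed cost to build one chip at launch
COST_DECLINE_PER_YEAR = 0.80   # assume cost drops ~20% a year via shrinks/yield

for year in range(2012, 2017):
    price_to_ms = manufacturing_cost + FEE_PER_CHIP
    print(f"{year}: build cost ${manufacturing_cost:5.2f}, "
          f"MS pays ${price_to_ms:5.2f}, AMD nets ${FEE_PER_CHIP:.2f}")
    manufacturing_cost *= COST_DECLINE_PER_YEAR
```

Under those assumptions, MS's per-chip price falls every year as the chip gets cheaper to build, while AMD's take per chip stays fixed at the fee.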

Now, in the short term AMD may make more by selling these Bulldozers on the open market. I won't disagree with that. However:

The 4/8 Bulldozer comes out this year, and I think we all agree we won't see new consoles this year. So let's say the new system comes out next year. That means the 4/8 Bulldozer CPU has already been out for a year. Let's say this year we see 3GHz-3.5GHz speeds, and MS goes with 3.2GHz for the console in 2012.

So in 2011 AMD is already pricing these at different levels. 2012 comes around, Intel has new chips, and now AMD has to drop prices again. The ASP of the Bulldozer chips has already changed before the MS system launches. Yields have gone up and AMD's cost to make them has gone down.

Now enter MS. MS needs millions of chips to launch, AMD agrees to make them, and as AMD makes them it keeps tweaking the process: yields go up and costs go down. Because of that, the cost of each Bulldozer sold to the PC market drops too, so AMD is making more per Bulldozer chip.

But then 2013 comes around and those 4/8 Bulldozer chips are mid-range rather than high-end, so they have an even lower ASP. AMD moves to a new process node and starts making 8/16 Bulldozers; once again the ASP of the 4/8 chips goes down, and the 8/16 chips are now the big sellers. But AMD is still making the same 4/8 chips for MS, and it can fit more 4/8 chips per wafer than on the previous node. So while the 4/8 Bulldozers in the PC market are considered low-end chips, AMD is still making the same profit per Bulldozer sold to MS.

AMD's cost of supplying MS has dropped: the node shrink lets them fit more 4/8 Bulldozers on a wafer, which brings down the cost per chip. MS is still paying that fixed fee on top of the cost of the chip, and since more 4/8s can be made per wafer, AMD is devoting less fab time to making MS's chips.

On MS's side, the cost of the 4/8s has gone down: the fee they pay AMD per chip is the same, but the cost of the chip itself has decreased thanks to better yields and the smaller process node.
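The "more chips per wafer" part of this can be sketched with rough die-per-wafer arithmetic. The wafer cost and die sizes below are assumptions picked only to show the shape of the effect (they are not real Bulldozer figures), and the formula is the usual rough approximation that ignores yield and defects:

```python
import math

WAFER_DIAMETER_MM = 300
WAFER_COST = 5000.0        # assumed cost per processed wafer

def dies_per_wafer(die_area_mm2):
    """Common rough approximation: wafer area / die area, minus edge loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Hypothetical die sizes for the same 4/8 design at successive nodes.
for node, die_area in [("32nm", 300), ("22nm", 160), ("14nm", 90)]:
    n = dies_per_wafer(die_area)
    print(f"{node}: ~{n} dies/wafer, ~${WAFER_COST / n:.2f} per die before yield")
```

Each shrink roughly doubles the candidate dies per wafer and roughly halves the cost per die, which is exactly the effect the post describes.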


Now skip to 2016. 4/8 Bulldozers are old news and nobody cares about them; AMD doesn't even sell Bulldozers to the PC market anymore, having a whole new CPU design out there. But by now they have dropped another two or three process nodes, those 4/8 chips are dirt cheap to make, and AMD is still able to sell MS millions of 4/8 Bulldozers and collect that fee on every one.

In 2016, producing these chips would take very, very little fab capacity from AMD, and they would still be netting X amount per Bulldozer from the console sector, long after the chip stopped making them money anywhere else.

And a 2012 console would only be in its fourth year in 2016. It could keep selling until 2020-2022 and keep bringing in revenue for AMD.

Do you see where the upside for AMD is in this situation?



The only way this wouldn't work in AMD's favor is if they don't have the fab capacity to make the chips, which I don't think is the case.
 
I think we should forget about those iSuppli estimates as well; I'm not sure why they keep creeping into things. iSuppli is the VG Charts of the cost-of-goods world (or worse); I'd trust the price/chip predictions of several members of this forum well before I would trust those numbers from iSuppli.

Their strength lies in teardowns, where the components used can be cost-estimated based on volume availability and pricing within the sales channel for those components: RAM modules, capacitors/mainboards, perhaps an iteration of an ARM processor or an FPGA. When it comes to proprietary goods and the actual production costs of ICs and such, though, I have no faith in their numbers being anything other than stabs in the dark. Sony sure as hell didn't go up to them and say their cost per Cell was $37 in 2009, that's for sure, and likewise with MS.

As for AMD, their overall margins improve when capacity is soaked up, whether or not the margins are high on a per-chip basis for a particular SKU. If either Intel or AMD feel that come 22nm they would have access to the necessary volumes, whether on that process or an older one, I don't see why they wouldn't offer the console players chips with minimal margin built in. That doesn't mean that's the path that would be taken, of course.
 
With Sony going with SGX for the PSP2, do you think they'll switch the PS4 to a future SGX too instead of going with NV again? How does the high-end PVR offering compare to AMD and NV?
 
I'm thinking you meant Intel released its first OOO processor in '95.
It was not the first OOO.
x86 made it significantly harder to design an OOO chip.

The PPro was the first processor where all instructions are executed out of order. IBM had implemented OOO execution in floating-point units since the mid-sixties, but those were much more limited in scope.

According to Wikipedia, HAL released the SPARC64 at the same time as Intel released the PPro. However, it was hardly production-ready. The only one I ever used was downclocked to 90 MHz.

The K5 was based on an unreleased implementation of AMD's 29K series. It could probably have beaten Intel to being the first OOO processor.

Regarding x86 being a barrier to OOO execution: the only thing I can think of is partial register updates, which did indeed clobber Intel's PPro when executing 16-bit code.

Cheers
 
Can you prove any of this, or is it just talk? You talked about DCUO doing pretty badly at the moment, yet DCUO is the best-selling game for SOE. You also said SOE "didn't ship any" copies of DCUO for PC. Did you forget that it's also available on Steam? It's not like it would be a surprise that console versions of games sell more; that's pretty standard, actually.

DCUO has had the worst launch numbers of any recent P2P MMO. AoC/Warhammer/Aion all had much better launch numbers than DCUO. The issue was that there were very few pre-orders for DCUO, and as a result retailers didn't bother to order many copies for either PS3 or PC.

Why 6 months, or did you just pull that time period out of thin air?

By 6 months, everyone on month-to-month, 3-month, or 6-month billing will have hit their next decision point on cancelling. If people are talking about server merges, or there have been server merges by that point, it's heading into the downward spiral.
 
I don't think anybody has a GPU that can compete with ATI and Nvidia anymore, or even come close. The last few attempts have been disasters, including Intel's Larrabee. And that is Intel; nobody is bigger in chips than Intel!

ATI themselves have stated that they have a thousand engineers working on their chips. It's nearly impossible to compete.

Not only that, but Nvidia/ATI have untold thousands (millions? billions?) of man-hours of heritage to build on incrementally in their GPUs. Not to mention the ridiculous patent system, which probably gives them a near stranglehold before even considering any other factors.

If Sony goes with a company besides one of those two (and because it's Sony and they are kind of crazy, it is possible; I'm half expecting it) they will be at a major power disadvantage.

There was a rumor or something like it a while back (from so long before any actual PS4 as to be relatively meaningless, of course) out of watch.impress that Sony wanted to go back to Toshiba for the PS4 GPU. If they don't use the big two, that would be my guess; Japanese companies prefer to use Japanese technology.
 
I don't think anybody has a GPU that can compete with ATI and Nvidia anymore, or even come close. The last few attempts have been disasters, including Intel's Larrabee. And that is Intel; nobody is bigger in chips than Intel!

ATI themselves have stated that they have a thousand engineers working on their chips. It's nearly impossible to compete.

The thing is, ATI and NV GPUs are very power hungry. If something like the SGX543MP4 can offer similar performance to RSX while using much less power, wouldn't it be worth looking at how far the tech can scale up while maintaining that power advantage? Say an SGX543MP16 or a future iteration. I know it wouldn't compete with NV's Fermi yet, but I think Fermi just uses too much power for a small console form factor. The PS3 was already on the large-ish side; any bigger and they might as well ship it in a form factor similar to a receiver.
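One way to frame that scaling question, purely as a back-of-the-envelope exercise: assume the MP4 is roughly RSX-class at a few watts and that adding cores scales imperfectly. Every number here is a placeholder assumption, not a measured figure for any PowerVR part:

```python
# Placeholder perf/power scaling exercise; every number is an assumption.
mp4_relative_perf = 1.0      # treat "SGX543MP4 ~ RSX-class" as the baseline
mp4_power_w = 4.0            # assumed power for the MP4 cluster
scaling_efficiency = 0.8     # assume each core-count doubling adds only ~80%

cores, perf, power = 4, mp4_relative_perf, mp4_power_w
while cores < 16:
    cores *= 2
    perf *= 2 * scaling_efficiency   # imperfect scaling assumption
    power *= 2                       # power assumed to scale with core count
    print(f"MP{cores}: ~{perf:.1f}x the baseline at ~{power:.0f} W (assumed)")
```

Even under those pessimistic assumptions a 16-core cluster would sit far below desktop GPU power draw, which is the point being made here; whether memory bandwidth actually lets it scale that way is a separate question.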

Not only that, but Nvidia/ATI have untold thousands (millions? billions?) of man-hours of heritage to build on incrementally in their GPUs. Not to mention the ridiculous patent system, which probably gives them a near stranglehold before even considering any other factors.

SGX is there in the mobile sector, so they must have patents of their own.

If Sony goes with a company besides one of those two (and because it's Sony and they are kind of crazy, it is possible; I'm half expecting it) they will be at a major power disadvantage.

There was a rumor or something like it a while back (from so long before any actual PS4 as to be relatively meaningless, of course) out of watch.impress that Sony wanted to go back to Toshiba for the PS4 GPU. If they don't use the big two, that would be my guess; Japanese companies prefer to use Japanese technology.

I just don't believe the GPU has to be the most powerful, because the size of the console will limit everything anyway. The more efficient things are, the better. A mobile GPU seems like a better fit, provided it can be scaled up.

As for Sony going back to Toshiba, I doubt it; they would have done that with the PSP2 if they wanted to, but didn't.
 
The Intel HD 3000 with 12 SPs is also a pretty good GPU, on par with the lower-end stuff from NV/AMD. A version with 48-60 SPs would be pretty competitive on performance per watt, and Intel might have the motivation to sell it cheap to establish themselves in that market, because the common opinion is still that Intel IGP = crap.
 
DCUO has had the worst launch numbers of any recent P2P MMO. AoC/Warhammer/Aion all had much better launch numbers than DCUO. The issue was that there were very few pre-orders for DCUO, and as a result retailers didn't bother to order many copies for either PS3 or PC.

What's your source for this?
 
I remember the original rumors about the PS3 that had Sony using multiple Cells instead of a GPU. As we've seen, the Cell is a monster when it comes to graphics processing, so it's not beyond the realm of possibility that their next console could perhaps have 4 or more Cells, and the developer chooses what each CPU concentrates on. So you could have extremely good AI/physics with normal graphics, or fantastic graphics with normal AI/physics.

Unfortunately, the Cell ship has sailed. It was supposed to be at 32 SPUs at this point, but with IBM and Toshiba pulling out, it's not getting the development resources it needs to compete.
 
I don't think anybody has a GPU that can compete with ATI and Nvidia anymore, or even come close. The last few attempts have been disasters, including Intel's Larrabee. And that is Intel; nobody is bigger in chips than Intel!

ATI themselves have stated that they have a thousand engineers working on their chips. It's nearly impossible to compete.

Not only that, but Nvidia/ATI have untold thousands (millions? billions?) of man-hours of heritage to build on incrementally in their GPUs. Not to mention the ridiculous patent system, which probably gives them a near stranglehold before even considering any other factors.
I kind of agree with that; to some extent, next-gen consoles could be products of the graphics chip manufacturers only. Nvidia (if nothing goes wrong) could provide the entire system, i.e. whatever they are calling "Denver"; for AMD, I would more willingly call some Bobcat cores plus a strong GPU an ATI system (assuming the brand still exists) than an AMD product. In both cases the CPUs, whether x86 or ARM, are commodity parts, not top-of-the-line products. I feel like on the CPU front it's becoming much the same: I fear nobody can touch Intel right now for high-performance processors, and I have little hope in Bulldozer, or in AMD overall, in this regard.

AMD (ATI) and Nvidia rule the game in high-performance GPUs. GPU manufacturers from the embedded space could compete at some point, but people should not make it sound trivial: ATI and Nvidia have both faced severe production issues with some of their products, and that's what you get when you aim really high. Nothing prevents the manufacturers from the embedded space from facing the same problems if they were asked to deliver an order of magnitude more performance than they actually do.
The same is true for ARM; they may have a bad surprise with the A15 or following architectures, and they would not be the first... See what happened to AMD multiple times: the product doesn't clock as intended, consumes more than expected, etc.

In short, manufacturers from the embedded space have plenty of opportunities to mess up if they want to compete with the manufacturers offering higher performance in either the CPU or GPU space (Intel, AMD, Nvidia, IBM, etc.).
 
I remember the original rumors about the PS3 that had Sony using multiple Cells instead of a GPU. As we've seen, the Cell is a monster when it comes to graphics processing, so it's not beyond the realm of possibility that their next console could perhaps have 4 or more Cells, and the developer chooses what each CPU concentrates on. So you could have extremely good AI/physics with normal graphics, or fantastic graphics with normal AI/physics.

Unfortunately, the Cell ship has sailed. It was supposed to be at 32 SPUs at this point, but with IBM and Toshiba pulling out, it's not getting the development resources it needs to compete.

4 Cells in tandem wouldn't come close to a modern mid-range PC GPU, never mind be suitable for a next-gen console.
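Some rough peak-FLOPS arithmetic puts numbers on that. These are commonly cited theoretical single-precision peaks, not measured performance, and they still flatter Cell because they ignore the fixed-function texture, raster and ROP hardware a GPU also brings:

```python
# Rough single-precision peak comparison; ballpark theoretical figures only.
SPE_GFLOPS = 25.6                # one SPE at 3.2 GHz (4-wide FMA per clock)
cell_gflops = 8 * SPE_GFLOPS     # ~205 GFLOPS from the SPEs of one full Cell
four_cells = 4 * cell_gflops     # ~820 GFLOPS
midrange_2011_gpu = 1500.0       # ~1.5 TFLOPS, e.g. a Radeon HD 6850 class card

print(f"4x Cell (SPEs only): ~{four_cells:.0f} GFLOPS")
print(f"2011 mid-range GPU:  ~{midrange_2011_gpu:.0f} GFLOPS")
```

Even on raw throughput four Cells fall short of a 2011 mid-range card, and that is before the SPEs have to emulate texture filtering, rasterisation and blending in software.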
 
As my intelligence and knowledge about next generation consoles is unmatched, please feel free to ask me anything and I'll do my best to answer if I can.

Oh, thanks! :p

Do you know if the ports and other general console attachments such as USB, optical out, component and HDMI are expensive enough to worry about if we're trying to speculate on cost and next generation consoles? I remember Shifty was a bit miffed the NGP didn't have TV out.

Are the warranty repair costs on mechanical components such as DVD-ROM drives and HDDs expensive on a per-console basis? I was thinking in the context of optical vs. flash-based media.
 
I remember the original rumors about the PS3 that had Sony using multiple Cells instead of a GPU. As we've seen, the Cell is a monster when it comes to graphics processing, so it's not beyond the realm of possibility that their next console could perhaps have 4 or more Cells, and the developer chooses what each CPU concentrates on. So you could have extremely good AI/physics with normal graphics, or fantastic graphics with normal AI/physics.

Some fixed-function graphics hardware is almost certain.

Intel:
Larrabee includes texture filter logic because this operation cannot be efficiently performed in software on the cores. Our analysis shows that software texture filtering on our cores would take 12x to 40x longer than our fixed function logic, depending on whether decompression is required. There are four basic reasons:
• Texture filtering still most commonly uses 8-bit color components, which can be filtered more efficiently in dedicated logic than in the 32-bit wide VPU lanes.
• Efficiently selecting unaligned 2x2 quads to filter requires a specialized kind of pipelined gather logic.
• Loading texture data into the VPU for filtering requires an impractical amount of register file bandwidth.
• On-the-fly texture decompression is dramatically more efficient in dedicated hardware than in CPU code.
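To see why those points bite, here's a minimal, naive software bilinear filter for a single 8-bit channel (a simplified sketch of the general idea: no mipmaps, no wrap modes, no compressed formats, no care for negative coordinates). Even this toy version has to gather an unaligned 2x2 quad, widen the 8-bit texels for the arithmetic, and spend general-purpose instructions on work a texture unit does in dedicated logic:

```python
def bilinear_sample(texture, width, height, u, v):
    """Naive software bilinear filter over 8-bit texels (one channel).

    texture: flat list of 8-bit values, row-major, width*height long.
    u, v:    texture coordinates in [0, 1).
    """
    x = u * width - 0.5
    y = v * height - 0.5
    x0, y0 = int(x) % width, int(y) % height        # gather an unaligned 2x2 quad
    x1, y1 = (x0 + 1) % width, (y0 + 1) % height
    fx, fy = x - int(x), y - int(y)                 # fractional blend weights

    # 8-bit texels get widened to full floats here, where a texture unit
    # would filter them in narrow dedicated datapaths.
    t00 = texture[y0 * width + x0]
    t10 = texture[y0 * width + x1]
    t01 = texture[y1 * width + x0]
    t11 = texture[y1 * width + x1]

    top = t00 * (1 - fx) + t10 * fx
    bottom = t01 * (1 - fx) + t11 * fx
    return top * (1 - fy) + bottom * fy

# Example: sample the centre of a 2x2 checkerboard -> ~127.5
print(bilinear_sample([0, 255, 255, 0], 2, 2, 0.5, 0.5))
```

And this is per sample, before anisotropic filtering, block decompression or multiple channels; that is the gap the fixed-function units close.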
 
@bkilian: The rumors of multiple Cells were just that, rumors, born more out of confusion over the facts than anything concrete. Those original rumors stemmed from Sony's patent, and even there there was dedicated graphics silicon; 'Cell' as envisioned had a GPU component as part of its heterogeneous architecture.

Anyway, beyond that, even prior to NVidia there was the direction that same design team was moving in with Cell and the Toshiba GPU.
 