Salvaged Nextgen APU's

James Car

Banned
Is there a technical reason I'm not thinking of that would block MS/Sony from using next-gen APUs that might not cut the mustard for a PS4/Xbone, but would be perfectly fine in a laptop/tablet/desktop?

With the rumors of downclocks, I was wondering whether that would really be necessary if those same chips could be binned for other uses.

Both MS and Sony are now in the x86 hardware biz, so it seems to me they shouldn't have any problems using these left-over chips in this market.

Thoughts?
 
Well, the trick is finding a product with low enough volume to use salvaged chips, but one that, several years down the line, when I assume manufacturing is a lot better, isn't popular enough to justify committing $100 APUs to. For example, the Apple TV was using salvaged A5 chips with one core disabled, but since October 2012 they've sold half of the device's 13 million lifetime sales, which probably played a huge part in building a new piece of silicon with only one core on die.

Plus, Sony's die requires GDDR5 memory, and I don't see them willingly diverting any of their 4Gbit chips, which leaves you with 4GB of total system memory if used in a desktop. Kinda meh if you ask me.

Microsoft doesn't make any laptops, so I'm not sure what their APUs would be used for. Micro-servers, maybe?

And how many defective chips are we talking about anyway? Hundreds of thousands? Millions?
 
I'm pretty sure tablets are out of the question with power consumption too high by a factor of at least 20.

There may be custom firmware or microcode for the CPUs, and the GPUs would be partly non-standard, which would require specialized driver development on the part of Sony or Microsoft or the use of their proprietary code.

I doubt there's anything impossible about repurposing the chips, but would the console makers want that?
 
Well, the trick is finding a product with low enough volume to use salvaged chips, but one that, several years down the line, when I assume manufacturing is a lot better, isn't popular enough to justify committing $100 APUs to. For example, the Apple TV was using salvaged A5 chips with one core disabled, but since October 2012 they've sold half of the device's 13 million lifetime sales, which probably played a huge part in building a new piece of silicon with only one core on die.

Plus, Sony's die requires GDDR5 memory, and I don't see them willingly diverting any of their 4Gbit chips, which leaves you with 4GB of total system memory if used in a laptop. Kinda meh if you ask me.

Microsoft doesn't make any laptops, so I'm not sure what their APUs would be used for. Micro-servers, maybe?

And how many defective chips are we talking about anyway? Hundreds of thousands? Millions?

If you have a salvaging plan in place then this wouldn't be true. Not that I think they do.

Edit: Following up on the AMD thought, though: they might have a salvaging plan in place if they're the ones responsible for yields, and thus a multi-DRAM memory controller, which they do have IP for.
 
I'm pretty sure tablets are out of the question with power consumption too high by a factor of at least 20.

There may be custom firmware or microcode for the CPUs, and the GPUs would be partly non-standard, which would require specialized driver development on the part of Sony or Microsoft or the use of their proprietary code.

I doubt there's anything impossible about repurposing the chips, but would the console makers want that?

Well, they did with Cell, but they were then so restrictive about it that it went nowhere. I'm more interested in whether AMD has a plan in place, if this is all being done on a contract basis and they're responsible for the yields. I could see them using a 6-Jaguar/16-CU APU for laptops.
 
I'm pretty sure tablets are out of the question with power consumption too high by a factor of at least 20.

There may be custom firmware or microcode for the CPUs, and the GPUs would be partly non-standard, which would require specialized driver development on the part of Sony or Microsoft or the use of their proprietary code.

I doubt there's anything impossible about repurposing the chips, but would the console makers want that?

Power consumption is based on clockspeed, which is why this binning concept was thought up in the first place.

I'm thinking a 500/1000 split, with perhaps 8 CUs and 6 cores, for the Xbone APU.

The same CPU is already planned for netbooks/tablets, and Pitcairn was also retrofitted for mobile.

The Sony side is a bit more difficult, but I'm thinking a premium portable gaming device using 8GB of GDDR5 could be sold into a price bracket that would be worthwhile to consider.

I'm thinking of the $1000-1500 bracket, which would be niche, but again, this is salvaging chips that couldn't cut the mustard and would be throw-aways if not for these outlets.
 
Well, they did with Cell, but they were then so restrictive about it that it went nowhere. I'm more interested in whether AMD has a plan in place, if this is all being done on a contract basis and they're responsible for the yields. I could see them using a 6-Jaguar/16-CU APU for laptops.

Speaking of this, didn't AMD hint that they would be offering powerful APUs at the end of the year based on PS4 tech?
 
Power consumption is based on clockspeed, which is why this binning concept was thought up in the first place.

I'm thinking a 500/1000 split, with perhaps 8 CUs and 6 cores, for the Xbone APU.

The same CPU is already planned for netbooks/tablets, and Pitcairn was also retrofitted for mobile.

The Sony side is a bit more difficult, but I'm thinking a premium portable gaming device using 8GB of GDDR5 could be sold into a price bracket that would be worthwhile to consider.

I'm thinking of the $1000-1500 bracket, which would be niche, but again, this is salvaging chips that couldn't cut the mustard and would be throw-aways if not for these outlets.

If the PS4 has a 100W TDP, then you would need to cripple the device to get it to work as a portable. Remember also that you're already dealing with the crap of the haul and not the cream of the crop.

Speaking of this, didn't AMD hint that they would be offering powerful APUs at the end of the year based on PS4 tech?

What PS4 tech? We already have APUs with the same tech that both the Xbox One and PS4 use. I believe it's Kabini, which was just released a few weeks ago:

http://www.anandtech.com/show/6981/...ality-of-mainstream-pcs-with-its-latest-apu/3
 
If the PS4 has a 100W TDP, then you would need to cripple the device to get it to work as a portable. Remember also that you're already dealing with the crap of the haul and not the cream of the crop.

What PS4 tech? We already have APUs with the same tech that both the Xbox One and PS4 use. I believe it's Kabini, which was just released a few weeks ago:

http://www.anandtech.com/show/6981/...ality-of-mainstream-pcs-with-its-latest-apu/3

None of the GPUs seem as strong as what's in the PS4/Xbox One though.
 
Power consumption is based on clockspeed, which is why this binning concept was thought up in the first place.
Clocks are one component of power consumption. These are very large chips with a high-power design target, and the operating range is normally restricted to an order of magnitude from bottom to top.
I can imagine slimming things down from say 20x to 10x too high.
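That relation can be sketched with the usual dynamic-power approximation P ≈ C·V²·f: dropping the clock alone scales power only linearly, voltage has a limited range, and leakage doesn't drop with frequency at all. Every number below is an illustrative assumption, not a measured figure for either console chip.

```python
# Sketch of why clock reduction alone can't cut power by 20x.
# Dynamic power scales roughly as C * V^2 * f; static (leakage)
# power is a fixed floor that frequency scaling never touches.

def total_power(dynamic_at_full, static, f_scale, v_scale):
    """Estimate total power after scaling frequency and voltage from nominal."""
    return dynamic_at_full * (v_scale ** 2) * f_scale + static

# Hypothetical 100 W chip: 80 W dynamic, 20 W leakage at nominal settings.
full = total_power(80.0, 20.0, 1.0, 1.0)          # 100 W
halved_clock = total_power(80.0, 20.0, 0.5, 1.0)  # 60 W: only a 1.67x drop
dvfs = total_power(80.0, 20.0, 0.5, 0.8)          # ~45.6 W with a voltage drop too

print(full, halved_clock, round(dvfs, 1))
```

Even with a generous voltage reduction on top of halved clocks, this hypothetical chip only gets down to roughly half its nominal power, nowhere near tablet territory.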

The same cpu is already planned for netbooks/tablets and pitcairn was also retrofitted for mobile as well.
Temash is the chip to look at for what is just about the upper limit of the tablet space: a down-clocked Kabini with two cores disabled and a TDP under 4W.
I'd be curious whether the big chips can reliably hit that with any level of activity going on. Sony seems to think not, with its background processor being in charge of background tasks in the connected-idle state.

I'm thinking of the $1000-1500 bracket, which would be niche, but again, this is salvaging chips that couldn't cut the mustard and would be throw-aways if not for these outlets.
This assumes the volume will be sufficient to make up for the costs of designing and building these niche products, built from hopefully a small fraction of the total chip production.
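For a back-of-envelope sense of the volumes in question, the classic Poisson yield model can be sketched. The die area, defect density, and console volume below are all assumptions for illustration, and not every defective die would actually be salvageable.

```python
import math

def yield_poisson(area_cm2, defects_per_cm2):
    """Classic Poisson yield model: fraction of dice with zero defects."""
    return math.exp(-area_cm2 * defects_per_cm2)

# Illustrative assumptions only: a ~350 mm^2 console APU and a defect
# density around 0.4 defects/cm^2 on a maturing process.
area = 3.5   # cm^2
d0 = 0.4     # defects per cm^2 (assumed)
good = yield_poisson(area, d0)

# If 20 million consoles are built, roughly how many defective dice
# come out of the same wafer starts?
consoles = 20_000_000
wafer_starts = consoles / good
defective = wafer_starts - consoles
print(f"yield ~{good:.0%}, defective dice ~{defective / 1e6:.1f} million")
```

Under these made-up numbers the defective pile is in the tens of millions, so "millions" is the plausible end of the question above; but the answer swings enormously with the assumed defect density.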
 
If the PS4 has a 100W TDP, then you would need to cripple the device to get it to work as a portable. Remember also that you're already dealing with the crap of the haul and not the cream of the crop.

What PS4 tech? We already have APUs with the same tech that both the Xbox One and PS4 use. I believe it's Kabini, which was just released a few weeks ago:

http://www.anandtech.com/show/6981/...ality-of-mainstream-pcs-with-its-latest-apu/3


500MHz GPU + 1000MHz CPU, with turbo modes for heavy single-threaded apps.

As I said, Jaguar is already a netbook/tablet CPU, and Pitcairn (8970M) is already squeezed down into laptops.

I don't think TDP would be a reason for defective chips not to be used in mobile/SFF x86 designs.
 
None of the GPUs seem as strong as what's in the PS4/Xbox One though.
AMD has two product lines: the Kabini line, which is for very low power usage (the chip shown uses 15.2W), and its larger desktop brother based on Bulldozer. The latter, however, has to make a profit for AMD in the $150 range. We will see faster ones released all year long leading up to the PS4 debut, and next year we will see faster ones still.

AMD isn't going to put a huge GPU in with its CPUs, because they can sell large GPUs at a profit. They can sell a $150 CPU + a $100 motherboard + a $150 GPU and make a profit on each one. An Xbox One or PS4 APU on the open market would destroy AMD's profit margins for quite some time, and it would have to sell for a lot of money to make up for the profit on each of the pieces AMD is selling today.
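A back-of-envelope sketch of that margin argument: the part prices come from the post above, while the margin rate and the hypothetical APU price are assumptions for illustration.

```python
# Sketch of the cannibalization argument: three discrete parts at an
# assumed common margin rate vs. one big APU sold at a typical APU price.
margin_rate = 0.35   # assumed gross margin on each part (illustrative)

discrete = {"cpu": 150, "motherboard": 100, "gpu": 150}
discrete_margin = sum(price * margin_rate for price in discrete.values())

apu_price = 250      # hypothetical retail price for a console-class APU
apu_margin = apu_price * margin_rate

print(round(discrete_margin, 2), round(apu_margin, 2))
```

At the same margin rate, the $250 APU brings in far fewer margin dollars than the three discrete parts combined, so under these assumptions it would need to sell near the combined $400 just to break even on margin, which is the point being made above.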


500MHz GPU + 1000MHz CPU, with turbo modes for heavy single-threaded apps.

As I said, Jaguar is already a netbook/tablet CPU, and Pitcairn (8970M) is already squeezed down into laptops.

I don't think TDP would be a reason for defective chips not to be used in mobile/SFF x86 designs.

You're assuming that would get you down to the proper TDP.

The A4-5000 is a quad-core Jaguar at 1.5GHz, and its die size is around 107mm², vs. what, 400mm² for the Xbox One and most likely something similar for the PS4? Those are big, power-hungry chips.

You're simply not going to get them down to 15W. Maybe in the future when they go to 22nm or 1Xnm.
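A rough illustration of why sheer die area works against a ~15W target, assuming, purely for illustration, that leakage power scales linearly with area on the same process; the leakage figure for the small die is also an assumption.

```python
# If leakage (static) power scales roughly with die area, a console-class
# die starts with a large power floor before clocks even enter into it.
# All figures below are illustrative assumptions.

kabini_area = 107.0   # mm^2 (A4-5000-class die)
big_area = 400.0      # mm^2 (console-class die, assumed)

kabini_static = 3.0   # W of leakage assumed for the small die
# Assume leakage scales linearly with area on the same process:
big_static = kabini_static * (big_area / kabini_area)

print(round(big_static, 1))
```

Under these assumptions, leakage alone eats ~11W of a 15W budget before the chip does any work, which is the gist of the "big, power-hungry chips" objection.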
 
Wasn't the Toshiba SpursEngine made of salvaged Cell chips?

If so, that wouldn't be a first time in the industry.
 
Couldn't it be just marketing talk?

I doubt any marketing dude would say, "Hey, we picked up some defective chips from the PS3, soldered them onto really small add-in boards, laser-cut most of the chip, underclocked them, and we're selling it to you for as much as an entire PS3."

10-20W sounds like the consumption of a half-cut Cell underclocked to half its original speeds.
 
Couldn't it be just marketing talk?

I doubt any marketing dude would say, "Hey, we picked up some defective chips from the PS3, soldered them onto really small add-in boards, laser-cut most of the chip, underclocked them, and we're selling it to you for as much as an entire PS3."

10-20W sounds like the consumption of a half-cut Cell underclocked to half its original speeds.

Don't know. According to the wiki schematic, the SpursEngine has built-in decoders and other extras.
 