Was Cell any good? *spawn

Haven't read anything but the last couple of posts, but I do think Cell was fairly forward-looking myself. The idea of many-core processing, and more importantly heterogeneous processing, essentially took shape in this architecture well before others - at least insofar as Cell was as mainstream an implementation of those ideas as had been attempted to date. And when you look back to its inception and the time horizon over which the development played out... I mean, I've always been in the "fan" column, but the ideas and concepts made sense, and still do. I would argue that AMD's current path is "Cell-esque," and IBM certainly has incorporated some of the concepts into their deep computing efforts.

Perhaps the execution was marred, but the premise of multiple individual cores, specialized processing units on-die (harking back to the BE patent), and lower power consumption were ideas ahead of their time. Granted, low power was not an explicit goal of the original project, as this was all taking place during the Netburst era (which was part of the problem, given the process nodes available at the time).

I'm sure I'll be expounding on these views of mine in a couple of posts, but on the subject of multiple market segments specifically: though Cell was not the candidate, I think that the success and penetration (as well as the many segments served) of a company like ARM show that the basic premise behind a "take over the world" architecture can be feasible. If Sony had had an integrated cross-divisional development culture and a partner less mercurial than IBM, maybe things would have ended slightly differently.

On a more aesthetic note, it's hard for me to get excited for an all-AMD PS4. Processing power is of course one thing, and games another - it should have both of these as best as could be expected of any system. But *without* the uniqueness associated with it, it makes me wonder what the future of a former hardware company like Sony is when their innards are now essentially straight OEM and their grasp of cross-product software is still not where it should be relative to Apple and MS.

Oh well, we'll see I guess!
 
All of this talk of G80 was simply in response to the assertion that the folks at IBM/Sony had "no idea" that GPUs and CPUs were already on a trajectory to take over the space that Cell was aimed at. That notion is either 1) nonsense or 2) indicative of some pretty serious issues from the decision-makers.
So what should the decision-making from around 2000 have been, as regards both designing a new uber-powerful console and wanting a flexible, programmable architecture that'd fit into CE devices?
 
Haven't read anything but the last couple of posts, but I do think Cell was fairly forward-looking myself. The idea of many-core processing, and more importantly heterogeneous processing, essentially took shape in this architecture well before others - at least insofar as Cell was as mainstream an implementation of those ideas as had been attempted to date. And when you look back to its inception and the time horizon over which the development played out... I mean, I've always been in the "fan" column, but the ideas and concepts made sense, and still do. I would argue that AMD's current path is "Cell-esque," and IBM certainly has incorporated some of the concepts into their deep computing efforts.

It wasn't forward-looking at all. DMA-only certainly isn't forward-looking. Many-core was already widely published and researched at that point, and all the research pointed away from the direction Cell went. The idea of big-little had also already been researched, and once again Cell went in the wrong direction from the research: big-little only works when you have largely the same ISA on a coherent interconnect. The ideas didn't make sense and still don't make sense. The lack of even non-coherent memory access is and was a bad idea.
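
(Since "DMA-only" keeps coming up: an SPE has no loads or stores to main memory at all, coherent or otherwise; everything is staged through its 256 KB local store by the MFC. A rough sketch of the resulting idiom, using the mfc_get/mfc_put intrinsics from IBM's Cell SDK - the chunk size and function here are just illustrative:

    #include <spu_mfcio.h>  /* Cell SDK MFC intrinsics on the SPU side */

    #define CHUNK 4096      /* illustrative; DMA size must be a multiple of 16 bytes, max 16 KB */

    static float buf[CHUNK / sizeof(float)] __attribute__((aligned(128)));

    /* Double a chunk of a main-memory array. The SPU cannot simply load
       from the effective address: the data must be DMA'd into local
       store, processed there, and DMA'd back out. */
    void process_chunk(unsigned long long ea)
    {
        const unsigned int tag = 1;

        mfc_get(buf, ea, CHUNK, tag, 0, 0);    /* main memory -> local store */
        mfc_write_tag_mask(1 << tag);
        mfc_read_tag_status_all();             /* stall until the transfer completes */

        for (unsigned int i = 0; i < CHUNK / sizeof(float); i++)
            buf[i] *= 2.0f;                    /* the actual work */

        mfc_put(buf, ea, CHUNK, tag, 0, 0);    /* local store -> main memory */
        mfc_write_tag_mask(1 << tag);
        mfc_read_tag_status_all();
    }

On a conventional coherent core, the whole function is just the loop.)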

AMD's current path is hardly "Cell-esque," and IBM hasn't incorporated those concepts into their deep computing efforts.


I'm sure I'll be expounding on these views of mine in a couple of posts, but on the subject of multiple market segments specifically: though Cell was not the candidate, I think that the success and penetration (as well as the many segments served) of a company like ARM show that the basic premise behind a "take over the world" architecture can be feasible. If Sony had had an integrated cross-divisional development culture and a partner less mercurial than IBM, maybe things would have ended slightly differently.

ARM's strategy has been a simple, small RISC processor with full memory access; Cell is in no way similar.

On a more aesthetic note, it's hard for me to get excited for an all-AMD PS4. Processing power is of course one thing, and games another - it should have both of these as best as could be expected of any system. But *without* the uniqueness associated with it, it makes me wonder what the future of a former hardware company like Sony is when their innards are now essentially straight OEM and their grasp of cross-product software is still not where it should be relative to Apple and MS.

The reason Sony is "a former hardware company" is that their hardware failed. It failed because it failed to take history and research into account. And uniqueness isn't a priori a beneficial aspect; it is only beneficial when it actually grants some benefit.
 
So what should the decision-making from around 2000 have been, as regards both designing a new uber-powerful console and wanting a flexible, programmable architecture that'd fit into CE devices?

That they are not congruent markets. That they have significantly different needs and wants. That the advantages for both markets lie not in the underlying hardware but in the combination of software, tools, and design. That trying to do one design for both will likely result in no viable design for either (which is in fact what happened; the only thing that even saved the PS3 was Sony bailing at the last second and buying a GPU design to use).
 
That they are not congruent markets. That they have significantly different needs and wants.
That's perhaps true, but I think we all also recognise, or at least desire, a single unified architecture to render everything far simpler. It may not be a good business choice, but then Kutaragi wasn't known for his good business choices as much as his desire to create technology that he felt was a good design. If Cell had made it into TVs and PVRs and such, with a single processor architecture and a single target for the same software to run on millions and millions of varied devices, that'd be great for the industry. We're getting the same unified-device results now via software abstraction, so apps will run on varied hardware to achieve the same results, but with added complexity and development woes. If you write an Android app for a Nexus 7, it may not work on another Android device or TV or Android PVR. If instead every Android PVR and TV and whatnot had exactly the same processor, then they'd work flawlessly with complete code portability. That'd be a great environment for development, even if a completely unrealistic one.

That Cell couldn't do that because of cost and price issues is a shame, but the processor itself is up to the task, I think. I can agree that the choice to try and achieve that was probably a bad one. Given that Sony made that choice, though, the outcome is a reasonable accomplishment IMO.
 
I don't think carlB is trying to say that Cell and ARM are similar.

I think he's trying to compare the "take over the world, throw Cell in everything, network processor creates networked processing, it's a stupid idea because it's from Sony and they're stupid" idea with what has currently happened with ARM processors and how they're being thrown into practically every type of system.

Edit - neither of us is talking about price; we're talking about the idea of throwing a single architecture across multiple devices, just the idea.

Nobody thought something like that was possible when Ken and Sony talked about it, and people laughed at the idea, but ARM is doing it - just not the networked processing. People thought such a thing was impossible back in '03/'04 and talked about the barriers that major players like Intel and AMD would put up, but we've seen that that's not true.

How come people thought Sony was crazy back in '03/'04? Was it because they knew how much Cell would cost, or because they thought the idea was crazy? If it was about costs, then did these same people predict in '03/'04 that someone would rise to threaten AMD and Intel by putting their architecture in every type of system? Or did these same people assume that AMD, IBM, and Intel would maintain their stronghold?

Also, what if ARM had introduced an architecture back in '04/'05 that included elements allowing for networked/distributed processing - would it have failed, or would it have just continued to evolve on top of what we see today?

Like I said above, I'm only talking about the idea, and how it seemed like no one accepted it back in '03/'04.
 
idea with what has currently happened with ARM processors and how they're being thrown into practically every type of system

ARM cores are thrown in everywhere because they are really, really cheap. A soft-IP ARM Cortex-M4 is 0.04mm^2 on 40nm, probably only 0.02mm^2 on a 28nm process. The silicon cost of having one is measured in hundredths of cents (and ARM has been known to quote very, very low licensing prices for new applications) - so why not toss a few into whatever device you are building, if having a real CPU you can properly program makes your physical design simpler? They are so cheap that if you save one solder pad it might be worth it.
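
(That 28nm figure is just ideal area scaling with the square of the linear feature size: 0.04 mm^2 x (28/40)^2 = 0.04 x 0.49 ≈ 0.02 mm^2. Real shrinks rarely hit the full ideal, so read it as a ballpark.)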
 
Edit - neither of us is talking about price; we're talking about the idea of throwing a single architecture across multiple devices, just the idea.

Nobody thought something like that was possible when Ken and Sony talked about it, and people laughed at the idea, but ARM is doing it - just not the networked processing. People thought such a thing was impossible back in '03/'04 and talked about the barriers that major players like Intel and AMD would put up, but we've seen that that's not true.

Um, many people thought something like that was possible - they even did it. They were called ARM. What they didn't do was create a Frankenstein of an architecture to do it. ARM has, and had back then, a significant market in CE.

How come people thought Sony was crazy back in '03/'04? Was it because they knew how much Cell would cost, or because they thought the idea was crazy? If it was about costs, then did these same people predict in '03/'04 that someone would rise to threaten AMD and Intel by putting their architecture in every type of system? Or did these same people assume that AMD, IBM, and Intel would maintain their stronghold?

They thought Sony was crazy because their solution was crazy. It makes a whole lot more sense to have a PPU with some basic offload for MPEG2/4, or even a decent vector engine, than to take the area of ~4 PPUs and fill it with 1 PPU plus a lot of non-coherent sub-CPUs.

Also, what if ARM had introduced an architecture back in '04/'05 that included elements allowing for networked/distributed processing - would it have failed, or would it have just continued to evolve on top of what we see today?

It would have failed. The micro-architecture of CELL is in a wasteland of solutions: it is a poor CPU, it is poor fixed-function hardware, and it is a poor GPU.

If you want a basic design that can be tiled to reach a variety of markets, you do what everyone but Sony did: design a basic CPU core with robust, wide SIMD capability and optionally add acceleration, either through fixed-function blocks or targeted instructions. ARM's success came in large part not primarily through their CPU designs but through AMBA and its various successors, which allowed people to easily add fixed functionality and customization.
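
(For contrast with the DMA sketch earlier in the thread: on the kind of "basic CPU core with robust, wide SIMD capability" described here, the equivalent loop simply loads from coherent memory - no staging step, no local-store size ceiling. A hypothetical sketch using SSE intrinsics; the function name and arguments are made up:

    #include <xmmintrin.h>  /* SSE intrinsics: _mm_loadu_ps, _mm_mul_ps, ... */

    /* y[i] += a * x[i], 4 floats at a time. x and y can live anywhere in
       cacheable memory and be any size - the cache hierarchy, not the
       programmer, handles the staging. */
    void saxpy_simd(float a, const float *x, float *y, int n)
    {
        __m128 va = _mm_set1_ps(a);
        for (int i = 0; i + 4 <= n; i += 4) {
            __m128 vx = _mm_loadu_ps(&x[i]);
            __m128 vy = _mm_loadu_ps(&y[i]);
            _mm_storeu_ps(&y[i], _mm_add_ps(_mm_mul_ps(va, vx), vy));
        }
        for (int i = n & ~3; i < n; i++)   /* scalar tail */
            y[i] += a * x[i];
    }

The fixed-function or targeted-instruction acceleration then bolts on around a core like this, rather than replacing its memory model.)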
 
Yeah, I know ARM did it; that's what my entire post is about.

Another question I have is this: when people questioned Sony's vision of throwing a single architecture in everything (discounting networked processing), how many people at that same time pointed at ARM?

In '03/'04, was anyone willing to say "Sony's vision is possible, just look at this ARM company" - or did people just laugh at the idea?

I just feel like everyone laughed not just at Sony but also at the "throw it in everything" idea, especially in regard to desktops, and that people were willing to believe that no one would challenge AMD and especially Intel.
 
It wasn't forward-looking at all. DMA-only certainly isn't forward-looking. Many-core was already widely published and researched at that point, and all the research pointed away from the direction Cell went. The idea of big-little had also already been researched, and once again Cell went in the wrong direction from the research: big-little only works when you have largely the same ISA on a coherent interconnect. The ideas didn't make sense and still don't make sense. The lack of even non-coherent memory access is and was a bad idea.

When I say forward-looking, I'm speaking less to the specific implementation and more to the recognition of (and attempt to address) the "future" of processing. Whatever the results, the team that set out on this path accurately forecast the growth of processing demands in embedded devices, TVs, and all manner of gadgets in the "digital" world. At the minimum, it needs to be viewed through the lens of a five-year design process with three self-interested parties pushing and pulling in several directions, where the nature of the anchor product prevented mid-course correction or redirection. But whether it was the ideal representation of said vision or not, it was nevertheless the first "mainstream" chip designed expressly to tackle that paradigm.

And I really would emphasize that Cell was hardly a failure across all areas... at least insofar as its inherent strengths went. The chip was fast - extremely so. That thing could and can crunch numbers. If GPUs hadn't taken the change in direction that they did (which honestly not many computing powers in the industry anticipated), becoming increasingly programmable to the point where they are now the poor man's go-to supercomputing solution, I think Cell absolutely would have had a very nice niche to play in. Even at launch, it still outperformed the then-contemporary GPGPU implementations.

In my mind, 2008 was the key death-knell year for the architecture as an arc with a future - it was the year IBM was finally coming out with integrated Cell-based solutions for the finance and oil industries, and the same year the financial crisis brought finance to its knees and the commodities market imploded. Come the nascent recovery in late 2009, those efforts had been abandoned along with the market niche they targeted.

AMD's current path is hardly "Cell-esque," and IBM hasn't incorporated those concepts into their deep computing efforts.

I remember specifically, a couple of years ago, reading an interview with the head of deep computing where he called out Cell specifically as providing important contributions to IBM's efforts in interconnect and bus technologies. Now, granted, I don't have these things on speed dial as I once might have, but I'll see if I can track it down. He also mentioned a processor project which was to be a many-core spiritual successor (at least in IBM's eyes) to the concept of the SPEs. Granted, I haven't heard anything recently about this chip - though admittedly I've been somewhat out of the loop lately - but potentially the same dynamics that caused Larrabee to stall may be at play here.

ARM's strategy has been a simple, small RISC processor with full memory access; Cell is in no way similar.

No argument there whatsoever, but at one time there were factions pulling for Cell to take on just such a design. On one hand Cell is a set of goals - many-core, modular, low-power, scalable; on the other it is a specific memory model, interconnect, ISA, and architecture. Both philosophies were part of the final design, but at the same time they were free-floating from one another; the latter was just the specific form chosen to achieve the former in this case.

If it had been a collection of multiple MIPS/RISC processors, as some of the original engineers had desired, then we might say the design would have been better suited for scaling and the embedded space. Likewise, in the other direction, if the PPE had been a more robust out-of-order core, then the architecture might have been more successful on the server front, where as it stood IBM had to mate it to Opteron blades to serve that role.

The reason Sony is "a former hardware company" is that their hardware failed. It failed because it failed to take history and research into account. And uniqueness isn't a priori a beneficial aspect; it is only beneficial when it actually grants some benefit.

My take on it will always be one of vested interests and bureaucracy over at Sony. TV, cameras, audio, gaming... all setting design arcs completely independently of one another.
 
Yeah, you might like John Carmack's experience in trying to get in touch with the HMD guys as an example of how far Sony still has to go in that respect.
 
When I say forward-looking, I'm speaking less to the specific implementation and more to the recognition of (and attempt to address) the "future" of processing.

Welcome to a lot of BS that doesn't mean anything, then. Every CPU design team does this for every CPU they design. That's basically the whole point of designing it instead of just selling what you already have. Getting grandiose about forecasting what attributes a product needs is ridiculous.

Whatever the results, the team that set out on this path accurately forecast the growth of processing demands in embedded devices, TVs, and all manner of gadgets in the "digital" world.

Their level of accuracy is at best highly debatable.

At the minimum, it needs to be viewed through the lens of a five-year design process with three self-interested parties pushing and pulling in several directions, where the nature of the anchor product prevented mid-course correction or redirection. But whether it was the ideal representation of said vision or not, it was nevertheless the first "mainstream" chip designed expressly to tackle that paradigm.

No, no it wasn't.

And I really would emphasize that Cell was hardly a failure across all areas... at least insofar as its inherent strengths went. The chip was fast - extremely so. That thing could and can crunch numbers.

No, not really. Yes, the chip has some nice peak figures, but it can't reach them on anything useful. Peak doesn't matter and it never has.

If GPUs hadn't taken the change in direction that they did (which honestly not many computing powers in the industry anticipated), becoming increasingly programmable to the point where they are now the poor man's go-to supercomputing solution, I think Cell absolutely would have had a very nice niche to play in. Even at launch, it still outperformed the then-contemporary GPGPU implementations.

Pretty much everyone with a clue knew GPUs were going to get more versatile and more programmable. After all, that was the well-established trend by that point.

And Cell's only niche was being slightly more than useless in niches no one cares about.

In my mind, 2008 was the key death-knell year for the architecture as an arc with a future - it was the year IBM was finally coming out with integrated Cell-based solutions for the finance and oil industries, and the same year the financial crisis brought finance to its knees and the commodities market imploded. Come the nascent recovery in late 2009, those efforts had been abandoned along with the market niche they targeted.

Cell was dead well before 2008. It was a dead man walking pretty much from commercial release. While both the 360 and PS3 came out on the wrong side of a technological shift, the PS3 was much further on the wrong side.



I remember specifically, a couple of years ago, reading an interview with the head of deep computing where he called out Cell specifically as providing important contributions to IBM's efforts in interconnect and bus technologies.

Highly unlikely to be true and certainly not shown in any actual product or briefs of future products from IBM.

No argument there whatsoever, but at one time there were factions pulling for Cell to take on just such a design. On one hand Cell is a set of goals - many-core, modular, low-power, scalable; on the other it is a specific memory model, interconnect, ISA, and architecture. Both philosophies were part of the final design, but at the same time they were free-floating from one another; the latter was just the specific form chosen to achieve the former in this case.

You are looking at Cell drunk and with rose-colored glasses. Cell wasn't a good many-core, its modularity is highly questionable, low power isn't really its forte, scalability was never proven (and the data points to no), its memory model was completely broken, etc.


My take on it will always be one of vested interests and bureaucracy over at Sony. TV, cameras, audio, gaming... all setting design arcs completely independently of one another.

As they well should. Listen, if the various parts can reuse stuff from the others, then great. But they should be focused on their products, not some grand insanity dreamed up to save the world. A lot of the failures at Sony can be directly attributed to trying to come up with grand master plans and then forcing products to adapt to that plan, rather than coming up with actual product requirements, finding points of commonality, and leveraging joint design where possible.
 
If GPUs hadn't taken the change in direction that they did (which honestly not many computing powers in the industry anticipated)
No, that's nonsense. *Everyone* knew it was coming. I worked as part of a group doing some of the early GPGPU stuff where you had to virtualize/emulate almost everything, and there never was any doubt that GPUs would soon support those features natively. This notion that Sony/IBM somehow foresaw the future of how to do computing but missed that GPUs and CPUs were both already on that trajectory is the most bizarre marketing switch-around I've heard.

Anyways I don't think anyone is changing their minds at this point, but it has been enlightening to hear the different viewpoints at least.
 
rant rant rant

Yeah, we're going to have to disagree here. Cell is very good at achieving its maximum potential, and it definitely was a valid attempt at solving the bandwidth issues for multi-core processors. I'm hearing a lot of crap talk about how other processors are so much better, when in reality they don't solve the many-core issue / bandwidth / pipeline / cache stuff all that well either. The few that do attempt to solve this problem all resort to a solution similar to Cell's, and everyone on the programming side seems to agree it would be nicer if the cache could be allotted to each individual core.

In the many articles I've read, unlike almost any other (multi-core) processor, Cell was shown to be able to actually achieve pretty much its theoretical maximum for quite a few tasks. And its versatility - its position in between the traditional CPU and the GPU - has meant that in the PS3 it can be flexibly assigned to either, which is something highly praised in GPU designs since Xenos; it still has some specific strengths for tasks in the middle, and yet that is supposed to be a weakness here. Claim all the authority you want, but I'm not buying it. If I want to buy a single CPU today that has significant multi-core power, even a quad-core i7 is significantly more expensive (and, I don't doubt, much more power-hungry) than any current-gen console, including the most complete PS3 package you can find. And GPUs that are as good as the Cell for GPGPU-style tasks may well exist, but finding one that is as powerful at the same wattage the Cell is currently using isn't as easy as you might think. Today. In 2012.

GPUs have many other things going for them too, but above everything else it's where the mass development is taking place that gives them a big advantage these days. If Sony skips Cell next-gen, it will be for the same reason they picked mostly off-the-shelf parts for the Vita: already being mass-produced means cheaper and faster availability, and lots of programmers already working with that tech means it will be used more efficiently earlier in the generation.

Their biggest mistake with Cell wasn't anything in its design per se; it was the assumption that, being the leading console platform, they could get away with leaving much of the problem-solving work to third-party developers. What they should have done is have IBM and their internal and first-party studios do their utmost at researching software solutions and creating a fleshed-out SDK for the system. But while this is still not a strength for Sony today, in 2005 that talent was pretty much absent altogether.

As Andrew said though, some of us seem to have made their minds up a while ago, and that's fine. It's a shame though that the tone sometimes has to be so aggressive.
 
No, not really. Yes, the chip has some nice peak figures, but it can't reach them on anything useful. Peak doesn't matter and it never has.

I take issue again with the "non-useful" attribution; for many "useful" signal-processing tasks, encryption/decryption, modeling, and encoding/decoding, the SPEs were at the top of the game for a while. No, it's not game consoles and embedded devices, but it was the densest group of protein folders, several homemade HPC compute clusters in universities across the country, and the fastest supercomputer in the nation for a couple of years.

Highly unlikely to be true and certainly not shown in any actual product or briefs of future products from IBM.
I'll search it out... of course its value will only be retrospective at this point.

You are looking at Cell drunk and with rose-colored glasses.

I am pointing out the positives of an architecture that I always found interesting. The advantages - which I know is a word you won't grant me - were real, for their part. I am not saying the death of Cell should have been avoided, only that in some scenarios it could have been.

As they well should. Listen, if the various parts can reuse stuff from the others, then great. But they should be focused on their products, not some grand insanity dreamed up to save the world. A lot of the failures at Sony can be directly attributed to trying to come up with grand master plans and then forcing products to adapt to that plan, rather than coming up with actual product requirements, finding points of commonality, and leveraging joint design where possible.

My comment wasn't Cell-related specifically with the "master plan" stuff; it relates more generally to the siloed nature of the firm through the 2000s. Independence of design teams is one thing, but structural inefficiencies and backbiting are another. I think it can be recognized that Sony has been plagued by these issues for some time, and it's completely unrelated to Cell or Playstation in isolation.
 
No, that's nonsense. *Everyone* knew it was coming. I worked as part of a group doing some of the early GPGPU stuff where you had to virtualize/emulate almost everything, and there never was any doubt that GPUs would soon support those features natively. This notion that Sony/IBM somehow foresaw the future of how to do computing but missed that GPUs and CPUs were both already on that trajectory is the most bizarre marketing switch-around I've heard.

Anyways I don't think anyone is changing their minds at this point, but it has been enlightening to hear the different viewpoints at least.

Well, that's fair - I guess I didn't start hearing about it myself until well into Cell's development, but I suppose the engineers working on Cell certainly would have. That said, it still launched as the better "general compute" solution relative to its GPGPU contemporaries in 2006, on both a FLOPS-per-mm^2 basis and in absolute terms.

(Marketing switch-around...?)
 
FWIW, I've never considered Cell forward-looking in design.
I still think it was designed as much as an evolution of the PS2 with its VUs as it was any attempt at a change in paradigm.

That's how I see it too; SPUs are basically VUs on steroids.

I guess Sony (Kutaragi) looked at how developers were using the VUs for non-graphics-related tasks, loved the compute density (paper mega-bollocks) and decided to run with it.

There never was a master plan.

In late 2005/early 2006, three different ways of programming CELL were presented. It took a couple of years before the compute-shader/micro-batch style everybody now seems to use settled out of the mess.

The only thing CELL has going for it is single-precision floating-point performance. CELL does not, repeat does not, attack the memory wall.

If you have a problem that can execute out of local store and use all the SP FP resources available, CELL wins. However, getting the same problem running at peak FP performance on a regular CPU is trivial; the advantage of CELL is then just the difference in peak FP performance. Conversely, getting bog-standard code running on the SPUs is in many cases impossible. The SPUs are used for geometry processing and anti-aliasing in the PS3, which covers some of RSX's shortfalls, but what that really means is that the SPUs are really expensive geometry-processing/anti-aliasing hardware.
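
(To make "executing out of local store at peak" concrete: the SPU's peak figure assumes one 4-wide single-precision fused multiply-add issuing every cycle against data already resident in local store. A hypothetical sketch using the spu_madd intrinsic from the Cell SDK's spu_intrinsics.h; the array size is made up and must already fit in the 256 KB local store:

    #include <spu_intrinsics.h>  /* SPU SIMD intrinsics: vector float, spu_madd, ... */

    #define N 1024  /* floats; illustrative - the whole working set lives in local store */

    /* y = a*x + y over local-store-resident vectors. Each spu_madd does
       4 multiplies and 4 adds (8 flops per issue), which is the kind of
       sustained rate the quoted peak numbers assume. */
    void saxpy_ls(vector float a, const vector float *x, vector float *y)
    {
        for (int i = 0; i < N / 4; i++)
            y[i] = spu_madd(a, x[i], y[i]);
    }

Move x or y out to main memory and you are back to DMA double-buffering; the peak only survives while the working set fits in 256 KB.)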

Cheers
 
Yeah, we're going to have to disagree here. Cell is very good at achieving its maximum potential, and it definitely was a valid attempt at solving the bandwidth issues for multi-core processors. I'm hearing a lot of crap talk about how other processors are so much better, when in reality they don't solve the many-core issue / bandwidth / pipeline / cache stuff all that well either. The few that do attempt to solve this problem all resort to a solution similar to Cell's, and everyone on the programming side seems to agree it would be nicer if the cache could be allotted to each individual core.

It doesn't solve any bandwidth issues for multi-core; it basically punted on them.

In the many articles I've read, unlike almost any other (multi-core) processor, Cell was shown to be able to actually achieve pretty much its theoretical maximum for quite a few tasks. And its versatility - its position in between the traditional CPU and the GPU - has meant that in the PS3 it can be flexibly assigned to either, which is something highly praised in GPU designs since Xenos; it still has some specific strengths for tasks in the middle, and yet that is supposed to be a weakness here.

Xenos gets praise because of its much, much higher compute density. Cell sits in the middle, with all the restrictions of GPGPU but basically none of the advantages.

Claim all the authority you want, but I'm not buying it. If I want to buy a single CPU today that has significant multi-core power, even a quad-core i7 is significantly more expensive (and, I don't doubt, much more power-hungry) than any current-gen console, including the most complete PS3 package you can find. And GPUs that are as good as the Cell for GPGPU-style tasks may well exist, but finding one that is as powerful at the same wattage the Cell is currently using isn't as easy as you might think. Today. In 2012.

Ridiculous. All the current-gen GPUs deliver GPGPU at significantly higher perf/watt than the shrunk Cells. As for i7s, you are confusing cost and price.

As Andrew said though, some of us seem to have made their minds up a while ago, and that's fine. It's a shame though that the tone sometimes has to be so aggressive.

My tone is only aggressive when people are wearing very rose-colored glasses. Cell was an albatross around the neck of the PS3 and still is. The only thing the SPUs can do is make up some of the shortfall of the RSX - an RSX design that only exists because CELL was the wrong solution and Sony had to do an about-face because of it. If Sony had gone with a G80-based GPU design from the beginning, they would have been able to shift resources around and had a much better console.
 
My tone is only aggressive when people are wearing very rose-colored glasses. Cell was an albatross around the neck of the PS3 and still is. The only thing the SPUs can do is make up some of the shortfall of the RSX - an RSX design that only exists because CELL was the wrong solution and Sony had to do an about-face because of it. If Sony had gone with a G80-based GPU design from the beginning, they would have been able to shift resources around and had a much better console.


If they had gone with a G80 design initially, chances are it would have been paired with CELL as the CPU.
 
Honestly, without my hardware knowledge even reaching Aaron's ankles, I don't believe it's even a matter of a "G80-based" GPU.
Granted, with enough money and time (and maybe NV not smelling that Sony was in a position of weakness during the negotiations), I'm confident Nvidia could have done something better, more tailored for a closed system such as a console - something that may not have included unified shaders but would still have been significantly better.
 