Sony having to develop a GPU on its own, or a CPU that can replace one, is a tall order, especially given the timeline.
That would be tricky, yes. If they go that route, they would have had to start years ago.
Larrabee handles this with a combination of hardware threads and software threads.
From what I've seen of the discussions on this topic, merely increasing the LS and using purely software threading in each SPE prolly wouldn't be enough. So I agree it would be a major alteration...
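To make the "software threading" point a bit more concrete: on an SPE, the usual way to hide memory latency is to hand-schedule DMA so that the next chunk of data streams into the local store while the current chunk is being processed (double buffering). The sketch below is only an illustration of that pattern under my own assumptions, not real Cell SDK code; dma_get_async() and dma_wait() are hypothetical stand-ins for the actual MFC intrinsics and are faked with memcpy so the example compiles anywhere. It also hints at why LS size matters for this style of code: the LS has to hold the program plus at least two in-flight buffers.

/* Hypothetical sketch of SPE-style double buffering: process buffer A
 * while the "DMA engine" fills buffer B with the next chunk, so memory
 * latency is hidden behind compute. dma_get_async()/dma_wait() are
 * stand-ins for the real MFC intrinsics and are faked with memcpy. */
#include <stddef.h>
#include <string.h>
#include <stdio.h>

#define CHUNK 4096                     /* elements per DMA chunk           */
#define LS_BUFFERS 2                   /* double buffering needs two slots */

static float ls[LS_BUFFERS][CHUNK];    /* stands in for the 256 KB local store */

/* Hypothetical stand-in for an asynchronous DMA-get into local store. */
static void dma_get_async(float *ls_dst, const float *main_mem_src, size_t n)
{
    memcpy(ls_dst, main_mem_src, n * sizeof(float)); /* pretend this runs in the background */
}

/* Hypothetical stand-in for waiting on the matching DMA tag. */
static void dma_wait(int buffer_index)
{
    (void)buffer_index;                /* real code would poll the MFC tag status here */
}

static float process_chunk(const float *chunk, size_t n)
{
    float sum = 0.0f;
    for (size_t i = 0; i < n; ++i)
        sum += chunk[i] * chunk[i];
    return sum;
}

float sum_of_squares(const float *main_mem, size_t total)
{
    size_t chunks = total / CHUNK;
    float result = 0.0f;
    int cur = 0;

    dma_get_async(ls[cur], main_mem, CHUNK);          /* prime the pipeline */
    for (size_t c = 0; c < chunks; ++c) {
        int nxt = cur ^ 1;
        if (c + 1 < chunks)                           /* start fetching chunk c+1 ... */
            dma_get_async(ls[nxt], main_mem + (c + 1) * CHUNK, CHUNK);
        dma_wait(cur);                                /* ... while we finish chunk c  */
        result += process_chunk(ls[cur], CHUNK);
        cur = nxt;
    }
    return result;
}

int main(void)
{
    static float data[8 * CHUNK];
    for (size_t i = 0; i < sizeof(data) / sizeof(data[0]); ++i)
        data[i] = 1.0f;
    printf("%f\n", sum_of_squares(data, sizeof(data) / sizeof(data[0])));
    return 0;
}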
When Sony goes for any third-party chip as important as a CPU or GPU, they have the problem of developing software/APIs/libraries/support for it. I suppose this becomes more a question of balance: can a third party improve this software and bring it to market faster than Sony working alone?
Jawed
They'd basically be hoping for a market psychology reboot. It can work, but you need a strong hook. Wii did that with motion control this time around. What will Sony have? Cell, Blu-ray, a new GPU etc. are all just enthusiast and hardcore talk. That's not the hook for the masses. And frankly you don't know if the hook now will still be alive and well in 2011. What if the PS4 becomes "Wii-like" but the market has had enough of that and is looking for something else?
Market psychology is the most difficult thing to predict and only becomes known when it's too late.
The key for Sony will be a few things:
1. Stick to gaming. That should be the identity of the PS4. The hardcore will be there for you already, but don't waste the precious pre-release marketing window by being all over the place like they did with the PS3. Once the market perception is set, it takes a LOT of effort to change.
2. Price of entry. I don't think Sony will make the mistake of a $600 console again. They just won't.
3. Better development tools. Don't let your exotic tech and lack of support tools be the reason your version of the game lags behind routinely. Even when parity is achieved, market perception overrides actual results.
4. Talk less shit and deliver more. Quit the boasting and grandstanding. Come out and show people that the PS4 will make for FUN GAMES. Don't sit there and brag about power numbers. Like I said, the hardcore will be there for you. You've already established the PlayStation brand in the minds of the hardcore, and the casuals don't care for all the tech bragging. Show the games and deliver them ON TIME.
They need to get back to the fundamentals of making a gaming system for a great gaming experience and convince the people that they've done so.
Yes I did, and I still think LRB should be used for all the sorts of tasks it is particularly good at. While it can probably execute scalar and general-purpose code fairly efficiently, an OOOE CPU would probably be a better choice.
From a dev POV, you're thinking that a few multi-core OOOE CPUs are more suitable than a massive in-order multi-core (with my limited coding experience I can understand that), but what do you think of a hypothetical LRB-like CPU tailored for a console?
Only if you use LRB as the CPU and another architecture as the GPU, which again wouldn't make much sense.
And for LRB, do we run into the same dual-API problems you mentioned for CELL+ and a next-gen GPU, or is the x86 heritage more friendly?
I vote for the trash can. While it's true that they have put a lot of money into CELL, changing architecture wouldn't suddenly invalidate all the work they have done.
The problem is... where does all the research on Cell go? The supercomputer market exclusively? The trash can? It seems such a waste. Perhaps Sony could "go nuts" and add both more SPEs and more functionality to each SPE so it could go up against regular GPUs or LRB, but it would be risky and could go terribly wrong.
Everything is possible, but they will take this route only if they want to drive costs down a lot and they also decide performance is not really that important.
What about an integrated CPU and GPU? A bit like Cell, but with several OOOE cores instead of a PPE, and a GPU instead of the SPEs, for physics plus graphics. I think I read about AMD going in this direction, probably here at B3D...
They have been trying to come up with forward-looking architectures for more than 10 years now, and it simply hasn't worked. At each new generation they adopted something new that bore only a vague resemblance to past technology. Perhaps this time things will change and they will use CELL again (or its evolution), but unless CELL 2.0 is some miracle sent by god (in which case it would be so different that they shouldn't call it CELL anymore), I see them getting stuck in a dead end. And it doesn't matter how much I enjoy working on SPUs; the vast majority of engineers I know don't, and Sony this time wants to make absolutely sure that they don't get screwed by third parties' subpar 360 ports, delays or worse.
Is there any scenario where going their own way on the GPU design and further developing Cell makes sense for Sony, or is it just throwing more money and resources at a dead end?
You are assuming that they will use completely custom parts, but I don't see the reason for doing that. Putting a 4-core Nehalem (which will be dirt cheap in 3 years) in there along with a next-gen GPU doesn't require the amount of research, money and time you need to develop a brand new architecture (CELL).
Anyway, going back to the original rumour: if Sony is asking game developers for input now, I find it pretty hard to believe that they will have something ready for the market by holiday 2011, considering how long it takes to go from start of development to first silicon, from first silicon to mass production of the IC, and then on to mass production of the complete console. 2012, maybe.
That being the questionable point... Or, using the 360 example: Waternoose was an enhanced CPU that IBM was already working on, and the changes and everything took less than 18 months, correct?
That's too vastly OT
Do you guys really think dev tools and subpar ports are why PS3 is trailing in market share?
What about price and later release?
If PS3 had the bigger market share, thanks to an earlier release and lower price, wouldn't more multiplatform development use the PS3 as the lead platform?
Now, if you're saying tools are causing delays in big games like GT5, MGS4, etc., then that's a more viable argument.
But doesn't development follow the money, more or less? Wouldn't the complaints about the tools and architecture be more muted if PS3 were the dominant platform?
Or are the tools and the architecture the reasons for the PS3's trailing position?
Hehe. So you want to move back to cache, and then try to fix the latency problem you've just introduced with hardware threading?
IMHO, LS should be replaced by a nice cache with the ability to lock 256+ KB ...
It would make it easier to introduce HW threading for SPUs (2-4 or more HW threads), as latency is a problem for SPUs.
Latency is not a problem for SPE. LS is the foundation of CELL. There is just no point in SPEs without LS.
I believe ease of usage is the problem being fixed. Improving latency is an after-effect, though a very welcome one.
HW threading introduces latency? I'm not sure I understand how that is. Would you please explain that to me?
Threads are used to hide latency or at least they can be.
If you get the exact same performance and deterministic behavior why then does cache not make sense?
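For what it's worth, the arithmetic behind "threads hide latency" is simple enough to sketch. On a stall-on-miss core, a rough rule of thumb is that you need about ceil((C + L) / C) hardware threads to keep the core busy, where C is the cycles of useful work a thread does between memory accesses and L is the miss latency. The toy program below just evaluates that rule; the numbers are invented purely for illustration. The point is that 400 cycles of latency over 200 cycles of work already asks for three threads per core, which is the cost the LS/DMA model avoids by prefetching explicitly.

/* Back-of-envelope model of latency hiding with hardware threads: if each
 * thread does C cycles of compute and then stalls for L cycles on memory,
 * a core needs roughly ceil((C + L) / C) threads to stay busy. The numbers
 * below are made up purely for illustration. */
#include <stdio.h>

static unsigned threads_to_hide_latency(unsigned compute_cycles,
                                        unsigned memory_latency_cycles)
{
    /* ceil((C + L) / C) without floating point */
    return (compute_cycles + memory_latency_cycles + compute_cycles - 1)
           / compute_cycles;
}

int main(void)
{
    /* Illustrative only: 200 cycles of work per 400-cycle main-memory access. */
    unsigned c = 200, l = 400;
    printf("threads needed: %u\n", threads_to_hide_latency(c, l)); /* -> 3 */
    return 0;
}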
I understood wco81's question to be somewhat rhetorical. If the programming issues aren't what's causing PS3's lacklustre sales, why would Sony care to make things easier for the devs next gen? They could provide the same tools on Cell, launch a console at $250 say, and then have it sell. The programmers' lives aren't really a factor - just look at PS2's success! Only if developers refuse to develop for your platform because it's too hard does it make designing hardware for their ease important.
That's too vastly OT
That has already been discussed to death in this topic:
Hardware is overrated anyway
I wonder how long it will take PS3 developers to master the PS3 as well as they mastered the PS2 in the past... Maybe longer than the PS3's lifespan?
Memory latency. One of the SPU's primary goals was to tackle this.
More threads means more memory accesses, which means latency becomes a bigger issue.
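And that flip side can be put in rough numbers too: every extra hardware thread that is busy covering a stall is another outstanding memory request, so aggregate bandwidth demand grows with thread count. The figures in the sketch below are invented for illustration only (128-byte transfers, 200 cycles of work per transfer, a 3.2 GHz clock, 8 SPE-like cores), but they show how quickly multi-threaded latency hiding starts pressing against a memory interface in the region of the PS3's roughly 25.6 GB/s XDR bandwidth, at which point queueing makes the latency you were trying to hide even worse.

/* The flip side of latency hiding: each active hardware thread is another
 * outstanding memory request, so aggregate bandwidth demand scales with the
 * number of threads. All figures are invented solely to illustrate the
 * arithmetic. */
#include <stdio.h>

int main(void)
{
    double clock_ghz      = 3.2;    /* hypothetical core clock              */
    double bytes_per_miss = 128.0;  /* hypothetical line / DMA granularity  */
    double compute_cycles = 200.0;  /* cycles of work between misses        */
    int    cores          = 8;      /* SPE-like cores                       */

    for (int threads = 1; threads <= 4; ++threads) {
        /* each thread asks for one line every compute_cycles cycles */
        double gb_per_s = cores * threads * bytes_per_miss
                          * (clock_ghz * 1e9 / compute_cycles) / 1e9;
        printf("%d thread(s)/core -> ~%.1f GB/s demanded\n", threads, gb_per_s);
    }
    return 0;
}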
What makes you think it has not already been "mastered" to the extent the PS2 was mastered?
I'm pretty sure one could make the argument that far more combined human brain power has already been spent on figuring out Cell's capabilities than the PS2's EE ever got.