Predict: The Next Generation Console Tech

Status
Not open for further replies.
I'm certain Toshiba can't do anything these days that can remotely compete with ATI/Nvidia anymore, and are we really thinking PowerVR can?

Well, you never know what Toshiba can do. They've been active in leveraging the Cell SPUs. So if we're ever going to see a Cell GPU like Larrabee, it will most likely come from them.

As for PowerVR, their last competitive part was Naomi 2, which held its own against what was available at the time. They might not come out on top in terms of performance, but I think they could at least come up with something competitive.

If Sony isn't going to go with Nvidia, ATI, or at least Intel, they're probably going to suffer a significant graphics deficit. And they might even with Intel. So, yeah.

PlayStations have always had a graphics deficit; I think the GPU just needs to be competitive and meet all the requirements (like emulating the PS2 and scaling that output).

The important factor next gen is going to be power consumption. I think that's the limit, especially now that we've seen that smaller-form-factor consoles are an easier sell.
 
I don't know. The Nintendo deal may have existed beforehand. Also, the chip was much less powerful. I'm sure MS would build into the contract that AMD can't provide its rivals with similar or better-performing parts.


Maybe Sony wants to go really big on the CPU, with a few PPEs, maybe a 4x 128 configuration or something, and have Cell do all the shader work?
If they want to be obliterated in performance, that's an idea ;)
Honestly, having the SPUs handle graphics is crazy and would take a lot of effort to make them up to the task (adding texture units, multithreading => a bunch more registers).
V3 said:
Intel is unknown at the moment and Larrabee isn't a good match for Cell.
So how can you tell? Say a Larrabee core is twice as big as an SPU; I'm pretty sure a four-core Larrabee would crush Cell in any kind of graphics-related task.
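For a sense of the numbers being thrown around, here's a back-of-envelope peak single-precision FLOPS sketch. The SPU figures are the commonly cited ones; the Larrabee clock and the four-core configuration are pure assumptions, since no such part ever shipped:

```python
# Paper-only peak single-precision GFLOPS comparison, illustration only.

SPU_CLOCK_GHZ = 3.2          # PS3 Cell clock
SPU_FLOPS_PER_CYCLE = 8      # 4-wide SIMD with fused multiply-add
USABLE_SPUS = 7              # 8 on die, 7 enabled in the PS3

cell_spu_gflops = USABLE_SPUS * SPU_CLOCK_GHZ * SPU_FLOPS_PER_CYCLE

LRB_CLOCK_GHZ = 2.0          # assumed; Intel never disclosed a shipping clock
LRB_FLOPS_PER_CYCLE = 32     # 16-wide vector unit with fused multiply-add
LRB_CORES = 4                # hypothetical configuration from the post above

larrabee_gflops = LRB_CORES * LRB_CLOCK_GHZ * LRB_FLOPS_PER_CYCLE

print(f"Cell SPUs:       {cell_spu_gflops:.1f} GFLOPS")   # ~179.2
print(f"4-core Larrabee: {larrabee_gflops:.1f} GFLOPS")   # 256.0
```

Of course, peak FLOPS says nothing about texture sampling or the rest of the graphics pipeline, which is the real sticking point either way.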
 
So how can you tell? Say a Larrabee core is twice as big as an SPU; I'm pretty sure a four-core Larrabee would crush Cell in any kind of graphics-related task.

Well, like I said, it's a big unknown; Larrabee's performance is unknown. On paper, a 32-core Larrabee should give the 58xx good competition, but in the real world, who knows.

However, it would certainly be interesting to have a ray-tracing benchmark between Cell and Larrabee. Well, they need a working Larrabee first.

By not a good match for Cell, what I mean is that introducing another, different core (x86) for developers to deal with is not a good idea.

Also, putting in something like Larrabee and only using DirectX or PSGL for development would be a waste. So unless they want to make developers write their own renderers, as far as I can see Larrabee is an overkill solution.
 
Well, like I said, it's a big unknown; Larrabee's performance is unknown. On paper, a 32-core Larrabee should give the 58xx good competition, but in the real world, who knows.
They have run simulations and it wasn't that bad; the big issue is the hardware implementation: will they reach their clock speed projections?
However, it would certainly be interesting to have a ray-tracing benchmark between Cell and Larrabee. Well, they need a working Larrabee first.
They have working Larrabee chips, but they don't work that well :LOL:
And Larrabee is not aimed at ray tracing; Larrabee will be sold as a GPU, hence supporting OpenGL and DirectX, so it will act as a rasterizer. Some info here.
By not a good match for Cell, what I mean is that introducing another, different core (x86) for developers to deal with is not a good idea.
I don't get what you mean; actually, we could say that devs deal with two architectures on the 360 (the three Xenon cores and the GPU) and three on the PS3 (PPU, SPU, GPU). How does having Larrabee as the GPU change anything in this regard? Things could get better with an x86 fusion design, or if a "Larrabee only" design is reasonable performance-wise.
Also, putting in something like Larrabee and only using DirectX or PSGL for development would be a waste. So unless they want to make developers write their own renderers, as far as I can see Larrabee is an overkill solution.
Intel will deliver various software layers: DirectX, OpenGL, and a more close-to-the-metal API. It's no different from any other GPU; you just can do more if you want.
I don't think Larrabee is overkill, but I'm more concerned about its overall performance/mm².
 
Also, choosing Larrabee might allow Sony another interesting option: Intel might allow a version of Cell replacing the PPC core with an Intel core, only for Sony's use.
 
Also, choosing Larrabee might allow Sony another interesting option: Intel might allow a version of Cell replacing the PPC core with an Intel core, only for Sony's use.

If that's even possible (licensing issues and compatibility with the SPUs aside), surely that would outright murder any chance of backwards compatibility with PS3?!?

I think BC will be a big issue for both HD teams this time around, much more so than last gen, especially since the online services are the principal means of retaining one's fanbase moving from this gen to the next (i.e. each user owning varying amounts of DLC, trophies/gamerscore, etc.).

Sony can't survive another mass apostasy like with PS2 to PS3, so they and MS will put as much into using Live and PSN to shackle down their installed base as possible.

To this end I think that both sides have a little more breathing room when it comes to performance next-gen, as the more each PS3 user has invested into his/her PS3 the harder it will be for him/her to decide to switch sides next gen simply because of "teh graphix"...

If Sony does end up with a GPU that gives them an overall performance deficit, so long as it's not too significant (and I don't think it will be) and they continue building their online platform this gen, then it won't matter much if the Xbox720 does slightly better graphix next time around.

All imho of course ;)
 
In other news, IBM has ceased Cell development:

http://www.playstationuniversity.com/ibm-cancels-cell-processor-development-1295/

It has been confirmed that IBM will be pulling out of Cell development, with their current PowerXCell 8i to be the company's last entry in the technology.
...
IBM’s most recent development in the Cell Processor’s design is the PowerXCell 8i, which is featured in the second most powerful supercomputer in the world: Roadrunner. However, this is where Cell will end for IBM. The company’s Vice President David Turek told German website Heise Online that the planned 32 SPE Cell processor will not be made.

However, Turek did explain that features of the Cell would continue to be moulded into other processor designs. With the future looking like it’s taking a GPU route, a hybrid technology is the direction IBM will be developing.

So what does this mean for Sony and the PlayStation 4? It has long been understood that the platform holder wants to use the Cell processor once again, allowing them to take advantage of this generation’s research and to make the transition into the next much smoother.

Will this change with IBM pulling out of their own development? Not necessarily. Sony can still hire IBM to create a Cell Processor for their next console, without IBM being involved in their own internal development outside of the PS4. So don’t start counting Sony’s chickens just yet.

The bolded part is the most interesting; maybe a Cell-based GPGPU for the PS4?
 
In other news, IBM has ceased Cell development:

http://www.playstationuniversity.com/ibm-cancels-cell-processor-development-1295/



The bolded part is the most interesting; maybe a Cell-based GPGPU for the PS4?
It could be a lot of things; they could be speaking of things like SPURS, for example. But it has been clear to me for a while that Cell was a dead architecture.

In regard to the post about BC, and taking into account when the next MS and Sony systems will ship and the state of BC this gen, well, I would favor a clean start for both manufacturers.
 
The accounts of the development process indicate IBM's idea of an ideal Cell was much closer to what Microsoft got in Xenon. The SPEs are something from the other side of the family that spawned it.

I am not as clear on what Sony expected from Nvidia from RSX, and I would like some explanation behind the thought process that Sony can blame Nvidia for not being able to emulate a very esoteric design in the PS2.
The disclosed time frames seemed to allow for a repurposed off-the-shelf design, not a clean-sheet design that would fit around the weirdness of the previous generation.
Sony would have known that.
 
They have run simulations and it wasn't that bad; the big issue is the hardware implementation: will they reach their clock speed projections?

Their simulations were very limited, and I assume very troublesome to run, so they may not be too reliable.

They have working Larrabee chips, but they don't work that well :LOL:

Well, AMD is gaining ground in GPUs, and that's a threat to Intel. So I'm sure they'll get it working soon.

And Larrabee is not aimed at ray tracing; Larrabee will be sold as a GPU, hence supporting OpenGL and DirectX, so it will act as a rasterizer.

I know; it's just that Cell is not designed for rasterization, so a ray-tracing comparison is more interesting for judging performance.

I don't get what you mean; actually, we could say that devs deal with two architectures on the 360 (the three Xenon cores and the GPU) and three on the PS3 (PPU, SPU, GPU). How does having Larrabee as the GPU change anything in this regard? Things could get better with an x86 fusion design, or if a "Larrabee only" design is reasonable performance-wise.

Currently (on PS3), RSX is less capable compared to the SPUs or PPU. This will change next gen, when GPUs are getting more capable. Basically, it'll be like PPU and SPU, instead of PPU and RSX or SPU and RSX. Developers like Valve have already complained about dealing with the PPU and SPUs, so having to deal with PPU, SPU, and x86 would be troublesome for developers.

This is true not only for Larrabee but for any next-gen GPU, like NV's Fermi for example. Having the PPU and SPUs plus a next-gen GPU is troublesome. That's why Cell is kind of a dead end going forward; the SPUs may become redundant with a next-gen GPU, if Sony goes that route.

Intel will deliver various software layers: DirectX, OpenGL, and a more close-to-the-metal API. It's no different from any other GPU; you just can do more if you want.
I don't think Larrabee is overkill, but I'm more concerned about its overall performance/mm².

Larrabee is not your typical GPU; it's x86 cores masquerading as a GPU. Given full access, it can do a lot more compared to your typical DirectX 11 GPU.

It’s large: the 32-core version, which is on par with the 58xx on paper, is estimated to be about twice the size. The advantage is that you can write your own innovative rendering pipeline. If you’re not going to use that, it’s overkill, since you can get the same performance cheaper.
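The "on par on paper, twice the size" argument can be sanity-checked with rough peak-FLOPS arithmetic. The HD 5870 figures below are published specs; every Larrabee figure is an assumption (Intel never disclosed a shipping clock or die size, so the die area here just takes the "about twice the size" estimate at face value):

```python
# Rough GFLOPS/mm^2 comparison, single precision, paper numbers only.

hd5870_gflops = 1600 * 2 * 0.85   # 1600 ALUs x FMA x 850 MHz ~= 2720 GFLOPS
hd5870_mm2 = 334                  # published die size

lrb_gflops = 32 * 32 * 2.0        # 32 cores x 32 flops/cycle x 2 GHz (assumed)
lrb_mm2 = 2 * hd5870_mm2          # "about twice the size", per the estimate

print(f"HD 5870:  {hd5870_gflops / hd5870_mm2:.1f} GFLOPS/mm^2")   # ~8.1
print(f"Larrabee: {lrb_gflops / lrb_mm2:.1f} GFLOPS/mm^2")          # ~3.1
```

Which is exactly the concern: on raw paper throughput per unit area, the fixed-function competition looks far more efficient unless the programmability actually gets exploited.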
 
The accounts of the development process indicate IBM's idea of an ideal Cell was much closer to what Microsoft got in Xenon. The SPEs are something from the other side of the family that spawned it.

I am not as clear on what Sony expected from Nvidia from RSX, and I would like some explanation behind the thought process that Sony can blame Nvidia for not being able to emulate a very esoteric design in the PS2.
The disclosed time frames seemed to allow for a repurposed off-the-shelf design, not a clean-sheet design that would fit around the weirdness of the previous generation.
Sony would have known that.

Since Sony did launch their initial PS3 units containing PS2 hardware, I think it's obvious they must have initially thought they could forgo software BC on RSX by including PS2 hardware in every PS3. I just don't think they realised until much later how much added cost including PS2 hardware in the PS3 would imply.

IMHO, Sony dropping the GS and EE, fumbling around with partial software BC, and then finally dropping BC altogether was much more a response to the harsh reality of unit costs inflated by the PS2 hardware than their original intention with regard to PS2 BC.
 
:oops:
Welcome back Paul! Are you still in contact with Vince? Bring him back too! Maybe Dave will let him post again... :D

If Sony adopts Larrabee, and makes a point of Intel's process leadership and the advantages of a standard processor with wide vector extensions, I demand Vince come back :devilish: Not that I haven't ever been dogmatic and dogmatically wrong. :oops:
 
Since Sony did launch their initial PS3 units containing PS2 hardware, I think it's obvious they must have initially thought they could forgo software BC on RSX by including PS2 hardware in every PS3. I just don't think they realised until much later how much added cost including PS2 hardware in the PS3 would imply.

IMHO, Sony dropping the GS and EE, fumbling around with partial software BC, and then finally dropping BC altogether was much more a response to the harsh reality of unit costs inflated by the PS2 hardware than their original intention with regard to PS2 BC.

No, the inclusion of PS2 hardware was never in the original plan; when the PS3 was announced, BC was meant to be done through software. They only announced the hardware solution when they couldn't get the software solution done on time. Obviously, after the later partial software solution, they now know RSX can't do GS emulation up to their standard.

RSX was plan B anyway. Some people mentioned that the original plan was to use a Toshiba GPU, the successor to the PS2's GS. I remember the Cell dev team was surprised when RSX was announced as the pairing for Cell. Sony may have won Blu-ray, but they screwed up their PlayStation business when they went with RSX.
 
No, the inclusion of PS2 hardware was never in the original plan; when the PS3 was announced, BC was meant to be done through software. They only announced the hardware solution when they couldn't get the software solution done on time. Obviously, after the later partial software solution, they now know RSX can't do GS emulation up to their standard.

RSX was plan B anyway. Some people mentioned that the original plan was to use a Toshiba GPU, the successor to the PS2's GS. I remember the Cell dev team was surprised when RSX was announced as the pairing for Cell. Sony may have won Blu-ray, but they screwed up their PlayStation business when they went with RSX.

Oh come on, Nvidia gave them a get-out-of-jail-free pass. RSX is certainly much more competitive and developer-friendly than any Toshiba solution was ever going to be. What could anyone have realistically expected given the time frames? It may not outperform Xenos, but it sure as hell would have outperformed anything any other manufacturer (ATI excepted) could have come up with in the same time frame.

Great, Toshiba might have managed to produce a part that was BC with the GS, but I guarantee you it wouldn't have performed on par with the latest ATI/Nvidia offerings, and developers would have kicked up a shitstorm.
 
Tahir: For nostalgia's sake of just how crazy this place was back in the PS3/Cell rumor days I don't think I can ever change my sig! Memories...

Phil: I'm not in contact with Vince but this place is simply not the same without him haha.. I have a feeling his eyes will catch this thread in time, if they aren't already.

Anyway, time for some thoughts.

A Toshiba Cell-based PS3 GPU would have blown the piss out of Xenos with regard to the numbers game; raw power would have flown high. However, I can't imagine the feature list being too great on the part, and a sort of software rendering would have been the name of the game. A total laissez-faire approach: here's our beast, now make it roar.. oh, we forgot to tell you, you have to feed and bathe it too!

When nVIDIA was announced to be working with Sony, everyone got excited. The conclusion was that they were working on the pixel engine and features of a Cell-based GPU. What we got in the end, though, was an off-the-shelf rip-off; such a shame to stack that with something as custom as the BE.

As for the PS4, to think that it will not use a Cell 2.0 is simply insane. For Sony to invest as much as it did in this scalable architecture, all the tools, time, and money, and then simply throw all that in the garbage would be an epic failure beyond words. Nothing else would even make sense; what could they even use that would benefit them better overall than Cell? Nothing.
 
Were there ever any more details released (leaked) on the Toshiba GPU? Given the insane cost of the PS3, I'm picturing a PS3 with Cell and its 256MB of ram paired with a Toshiba GPU (without vertex shaders or equivalent) with an embedded framebuffer. Much cheaper, and full backwards compatibility.

I don't think Sony had any option but to go with a late switch to RSX, but looking at benchmarks now, boy has the GeForce 7 series aged quickly and badly.

http://www.pcgameshardware.com/aid,...nchmarks-of-the-latest-Call-of-Duty/Practice/

You can see why MS turned it down for the 360, quite apart from them hating Nvidia after the first Xbox. They would have to have been insane to choose it over what they got. I spent a lot of money (by my standards!) on a 7900 GTX in 2006, and by 2007 I largely felt I'd wasted my money.
 
Oh come on, Nvidia gave them a get-out-of-jail-free pass. RSX is certainly much more competitive and developer-friendly than any Toshiba solution was ever going to be. What could anyone have realistically expected given the time frames? It may not outperform Xenos, but it sure as hell would have outperformed anything any other manufacturer (ATI excepted) could have come up with in the same time frame.

Nvidia totally screwed Sony. They took one of their parts, which they had long since shelved because of both defects and performance issues, and proceeded to sell it to Sony. RSX was never meant to be sold to anyone; it was a long-shelved piece of hardware. But they realized they could make a pile of money this way and protect their PC business at the same time, since the video hardware they sold Sony was obsolete from day one. Sony was desperate at that point, being both out of time and out of money after trying to go with their own solution, so Nvidia took advantage of the situation and profited big time.
 
I have it on very good authority that - as per 99.99% of TEXAN* posts - the IMG link-up with Sony is complete fantasy. For anything other than PSP2, that is.

Heh, I was basically bored and had nothing better to do, so I decided to find out some information about Texan. This guy is friggin' unbelievable. I'm 100% sure that he runs the FGNonline site, and he has posted this same shit over and over again in numerous forums, sometimes with multiple accounts, since at least 2001. It's unbelievable; sometimes he quotes himself from other forums, and all sorts of really disgusting stuff. Apparently he lives in the UK. He's banned on some sites, and at least on the official Sega forums he was told not to mention FGNonline, so he started spamming the site with private messages to other users. The list goes on and on :)

Edit: here, for example, he claims to be editor-in-chief at Edge magazine... Honestly, he is everywhere doing weird stuff.
http://forums.modojo.com/showthread.php?t=160068
 