Predict: The Next Generation Console Tech

Status
Not open for further replies.
I was also one of the very few to expect a top-of-the-line laptop GPU for the PS4; the idea was attacked furiously with the argument that "top-of-the-line laptop GPUs can't be used for consoles because they cost too much and they come in limited quantities".

But it's not a top-of-the-line laptop GPU, because it's not binned for low power. It's more equivalent to a slightly cut-down lower-end enthusiast GPU from mid-2012.
 
Read previous 3/4 pages please.

Also, seems Richard is saying MS aren't going for all out power and are hoping Kinect 2.0 will be a difference maker like someone sitting not a million miles from my location said a few days ago :LOL::LOL:
 
I doubt that's the case, but if it is then they can't see the forest for the trees. The first year is all about hardcore gamers and great visuals, not turning on my TV with a voice command.

Seriously, I really don't understand what the hell they are thinking of doing. Their situation doesn't seem exactly flattering, but I doubt they were caught off guard by Sony's specs; that was to be expected.
 
End of 2013 will be the absolute worst time to release these consoles. HMC is just around the corner, dammit. Someone grow some balls and delay their boxes.

http://www.hpcwire.com/hpcwire/2013-01-17/micron_readies_hybrid_memory_cube_for_debut.html

Companies are eager to get their hands on this product and Micron is working with an aggressive roadmap to meet that demand, Graham told HPCwire. The Gen1 demo, on display at SC12, was real silicon, and engineering samples for the Gen2 device are due out this summer. If all goes as planned, the Hybrid Memory Cube will be in full production at the end of this year or early 2014. In fact, contracts are already in place for the 2014 timeframe.

According to the Micron rep, they've also seen interest from companies outside the consortium. Most notably absent from the member roster are AMD and Intel, but that by no means implies a lack of involvement. Intel, for its part, demonstrated a prototype HMC device during the fall Intel Developer Forum in September 2011, deeming it the fastest and most efficient DRAM ever built. As Graham put it, these companies have opted not to be involved in the open standard in order to develop their own way of using the technology.
 
From a memory and bandwidth standpoint, both of them look solid. GPU- and CPU-wise, both are underwhelming. If Durango doesn't have a very exotic GPU (which I doubt it has, probably HD7770-based) then it's even worse than underwhelming. And unfortunately, GPU performance per watt is not going to change for the better the way memory will.
 

True, it's all about a balance. Massive 0.5 TB/s bandwidth may not do much good if it's paired up with mid level GPUs. Although I'm sure some devs would say there's never enough bandwidth, just like some would say there's never enough RAM.

I think Sony's CTO has said it before: they'll always be waiting for that 'best tech', but it's never quite there, so they need to make do with what they can for right now. But these frankly boring rumors from both MS and Sony don't spark anything for me.

Scott_Arm said:
3GB for a non-mobile hybrid computer would be reasonable. If it is not running Windows 8, then I don't know why they'd need that much memory. If this rumour is true, that's the only thing that makes sense. They can share services and content (including apps) between all of their platforms (mobile, desktop, tablet, Xbox). Honestly, a home theater PC that plays next-gen Xbox games doesn't sound too bad. I'm more interested in pure gaming, so it really depends on how much better the games turn out on PS4 (assuming PS4 rumours are also true).

Yeah, I kind of buy these 3GB rumors. I'm sure they'd like to keep the Windows experience across desktops and Xboxes as close as they can. And you don't run a full-fledged OS on under 3GB.
 
3GB and two cores for the OS sounds absurd. You could probably run 3 instances of Windows RT on that XD

Orbis, at least by those specs, isn't looking that hot either; sure, there are console optimizations, but it seems it won't take long at all for PCs to brute-force past that advantage XD

Yep. Exactly what I'm thinking. Don't you think a GTX 690 would put out more FLOPS than even an optimized game on the 7970M?
 
That is, if Xbox gets "traction" in the beginning. If it doesn't, then it's going to be tough. Basically, if every game looks and plays better on the competitor's machine and your biggest focus in the gaming dept. is 3rd party, then you clearly chose the wrong tactics.

I don't think the disparities will be too great, one way or the other. I could be wrong of course...

Microsoft's focus is going to be finding a way to use Xbox to springboard a larger push against Apple across other devices, and Sony's to use the PS4 as a synergistic device to flesh out its PSN environment as a truly cross-device storefront with market appeal. Living room and all that.

For better or worse, we need to understand that Apple looms like a shadow over this whole thing for both companies in the context of their broader objectives, and resources will be dedicated accordingly.

With regard to traction, I think both console makers are banking at least to some extent on loyal fanbases to ensure things get going - and I'd say that's probably a safe enough bet to make, assuming reasonable launch prices and decent titles.
 
On the subject of the thread, the most interesting thing about these specs for me is the inclusion of what seem like 3 separate processing blocks all on the same APU. A GPU, an 8-core CPU and what sounds like a general-purpose SIMD unit makes for quite a potent APU! I guess the confirmation of GDDR5 means we're talking about a standard 256-bit memory interface, which lends further credence to the 192GB/s rumours. If true, that's certainly a healthy chunk of bandwidth for what is a fairly modest CPU/GPU combination.

The extra unit dedicated to GPGPU sounds very interesting too. If it is adding another 200GFLOPS to the CPU which I understand itself is pushing around 100GFLOPS then that's a lot of SIMD power. Could be useful for emulating Cell?
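For what it's worth, the 192GB/s figure drops straight out of a 256-bit GDDR5 bus. A quick back-of-envelope check (the 6Gbps per-pin rate is my assumption; the rumour only gives the bus width and the total):

```python
# Peak memory bandwidth = bus width in bytes x effective per-pin data rate.
bus_width_bits = 256       # rumoured GDDR5 interface width
data_rate_gbps = 6.0       # assumed effective GDDR5 rate per pin (Gbit/s)

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(bandwidth_gb_s)  # -> 192.0
```

So the 192GB/s rumour is at least internally consistent with a plain 256-bit interface running GDDR5 at a mainstream clock, with no exotic memory tech required.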

As in emulating Cell without changing the games? Unlikely, unless they use a 6-core SPURS together with other Cell parts (e.g., memory and DMA controller).

Then again, for multiplatform titles, it is probably more economical (from Sony's perspective) to have third parties port their multiplatform PC games to PS4 instead of using the PS3 version.

I am more lamenting the first party titles. I wonder what it would look like if they removed all the SPU assists and just ran the rendering on the new GPU. The new SIMD engine can help with physics, audio, animation, AI, video and other media work. How much effort would it take to port The Last of Us over? ^_^
 
There's a metric ton of noise going on in this thread. So can we get back on topic of "predict next gen console tech" and a whole lot less on the "business decisions" or even "which is more powerful".

Kthxbi!
 
As for core gaming becoming stale, I'd argue that it's not really because the market has moved on. It's because core games haven't retained these people. ^_^ Wii showed them an alternate way to have (more) fun.

I'd argue it's a combination: both hardcore gamers moving to more casual games, but more so a whole new segment of gamers emerging in the form of casual/social gamers. As well as the people that are interested in the entertainment/media services, which aren't at all gaming related.

And iOS introduced cheap and social games. Can core games be more fun than these ? Perhaps ?

Hell yes! Of course they're far more fun. Actually, I'll qualify that by saying they're a far better experience. Fun could be defined as addictive, and in that sense Farmville probably competes with any AAA console game today! But in 10 years, are you more likely to look back fondly on Farmville or Skyrim? Regardless, I'll take 1 Skyrim over 1000 Farmvilles. Unfortunately, I don't think the rest of the population feels the same.
 
Can the PS4 "special sauce" be something like the SPEs from Cell?

Sorry if it's a dumb question.

It's not a dumb question at all. It just seems very unlikely, unless Sony has found some manner via which to exploit them constantly and for multiple purposes, rather than simply for BC and for tasks that could be accomplished similarly for less (the audio DSP theory).
 
Something like "HSA"?
 
I remember that a while ago somebody talked about using "virtual machines" on the Durango devkit, and here's the HSAIL description:

HSAIL is a virtual byte code and virtual machine designed for parallel compute on heterogeneous devices that was introduced as part of the HSA rollout at AFDS. It is an intermediate representation that is dynamically compiled at run-time for the device.

And there is a patent from Sony that looks suspicious:

1. A computer graphics apparatus, comprising:
a) a central processing unit (CPU), wherein the CPU is configured to produce graphics input in a format having an architecture-neutral display list for a sequence of frames;
b) a memory coupled to the central processing unit;
c) first and second graphics processing units (GPU) coupled to the central processing unit, wherein the first GPU is architecturally dissimilar from the second GPU; and
d) a just-in-time compiler coupled to the CPU and the first and second GPU configured to translate instructions in the architecture-neutral display list into an architecture-specific format for an active GPU of the first and second GPU, wherein the just-in-time compiler is configured to perform a context switch between the active GPU and the inactive GPU, wherein the active GPU becomes inactive and the inactive GPU becomes active to process a next frame of the sequence of frames, and turn off the one of the first and second GPU that is inactive after the context switch.
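The patent's flow is easier to see as code: an architecture-neutral display list gets JIT-translated for whichever GPU is currently active, and the two GPUs swap roles between frames. This is purely my sketch of the claim; all the names here are hypothetical, not anything from Sony:

```python
# Hypothetical sketch of the patent claim: JIT translation of an
# architecture-neutral display list, with the active/inactive GPUs
# swapping roles (a "context switch") between frames.

def jit_translate(display_list, gpu_arch):
    # Stand-in for per-architecture code generation: tag each command
    # with the target architecture it was compiled for.
    return [f"{gpu_arch}:{cmd}" for cmd in display_list]

def render_frames(frames, gpus=("gpu_a", "gpu_b")):
    active, inactive = gpus
    trace = []
    for display_list in frames:
        # Translate the neutral display list for the active GPU only.
        trace.append((active, jit_translate(display_list, active)))
        # Context switch: the active GPU becomes inactive (and would be
        # powered off, per the claim) before the next frame.
        active, inactive = inactive, active
    return trace

trace = render_frames([["clear", "draw"], ["draw"]])
# Frame 0 runs on gpu_a, frame 1 on gpu_b.
```

The interesting bit is that the display list stays architecture-neutral until the last moment, which is exactly the kind of thing an HSAIL-style intermediate representation would enable.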
 
They can run an updated autonomous security kernel in a sealed SPU ! D^8
(That means they'd need a 7 core SPURS, even less likely haha)

One of the rumors mentioned that the PS4 audio DSP can decode about 200 concurrent MP3 streams. I can't find an equivalent measurement for the SPUs, except that Toshiba showed a full Cell decoding 48 MPEG2 1080p streams.

If Sony CTO's last interview holds any weight *at all*, I am very curious about the "configurable logic" he mentioned, and how they intend to tackle the security issues given that they were thinking of some form of PS4 Linux (again).
 
Something like "HSA"?

In my mind it would honestly have to be something more targeted and specific than simple compute resource availability to warrant inclusion, in part due to the extra silicon and PCB expense considerations. Not to mention unless there were a way to functionally recreate a Cell environment using a larger SPURSEngine variant and the native PS4 ICs, then you're looking at a next-gen Cell with new memory controller to take on the job. I just don't think so, though it would warm my heart.

I'm out of ideas on the subject. If they do include what is essentially a new Cell chip in the console and have dreamed up functional roles for it to play, I will be floored. Why give 200 MP3 streams as an example, though? It just seems strange, especially if it is a DSP. (And why claim otherwise if it's not?) The transistor budget... it just doesn't make sense in my mind.

Hey - I'm ready to be surprised. Sony: surprise me.
 

CPU+APU+dedicated GPU seems really likely now and goes well with that patent.
 
200 GFLOPS CU. :oops:

CU or CPU? Too many acronyms lol. From the sound of it, the CPU will be packing more than 200 GFLOPS if you include the dedicated GPGPU hardware. If that's 2 CUs running at 800MHz, as has been suggested (but as far as I know is completely unconfirmed), then that's 200 GFLOPS alone. Then the CPU, which is a little over 100 GFLOPS, giving a total of 302.4 GFLOPS of 'CPU' power.

That's some serious CPU SIMD, nearly 50% higher than Cell and even higher than a top-end quad-core Sandy Bridge (although a fair bit short of a quad-core Haswell).
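For anyone wondering where those numbers come from, peak GFLOPS is just units x clock x FLOPS-per-clock. The per-clock throughputs below are my assumptions (Jaguar-class cores with 128-bit SIMD, GCN-style CUs with 64 ALUs doing fused multiply-adds); the clocks are the rumoured figures:

```python
# Peak single-precision GFLOPS = units x clock (GHz) x FLOPS/clock/unit.
def gflops(units, clock_ghz, flops_per_clock_per_unit):
    return units * clock_ghz * flops_per_clock_per_unit

cpu = gflops(8, 1.6, 8)     # 8 cores x 1.6GHz x 8 SP FLOPS/clock = 102.4
cus = gflops(2, 0.8, 128)   # 2 CUs x 800MHz x 128 SP FLOPS/clock = 204.8

print(cpu, cus, cpu + cus)
```

Under these assumptions the 2 CUs are actually 204.8 GFLOPS; the 302.4 total in the post comes from rounding the CU figure down to 200 before adding the CPU's 102.4.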
 