Predict: The Next Generation Console Tech

Status
Not open for further replies.
It sounds to me like MS might be designing their next console around Kinect 2.0.

This would explain lots of RAM and a powerful CPU, while neglecting the GPU.

Bkilian said RAM is very important for the Kinect stuff, "by far the most precious resource on the box" he called it.

You can read the incriminating post here http://forum.beyond3d.com/showpost.php?p=1645931&postcount=409
Heh, nice try. But what I said doesn't have anything to do with future products :). Almost any CPU heavy function can be accelerated by prodigious use of RAM. In the 360 we tried to balance the usage for the skeletal system, we want it to be fast, we want it to not use too many resources, so they cut down the number of joints tracked. And memory is definitely the most precious resource on the box. Developers spend a _lot_ of time trying to stuff ever more complex and larger textures into a tiny space. They get cranky when you try to take any of it away from them. They'd probably give us an entire CPU core before they gave us 32MB of ram :).
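bkilian's point that "almost any CPU heavy function can be accelerated by prodigious use of RAM" is memoization at heart. Here is a minimal Python sketch; the `score` function is a made-up stand-in, not anything Kinect actually runs:

```python
import functools

# Hypothetical CPU-heavy function, standing in for per-frame work;
# the real Kinect pipeline is not public, so this is purely illustrative.
def score(pose_id: int) -> int:
    total = 0
    for i in range(200_000):
        total = (total + pose_id * i) % 1_000_003
    return total

# Trading RAM for CPU: keep every result so repeated inputs cost a lookup.
cached_score = functools.lru_cache(maxsize=None)(score)

cold = [cached_score(p) for p in range(50)]   # pays the full CPU cost
warm = [cached_score(p) for p in range(50)]   # served straight from RAM

assert cold == warm
```

The cache trades memory for time, which is exactly why developers get cranky when you take RAM away from them.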

Interesting!
Would speech recognition, skeletal processing, etc. benefit more from a heavily multithreaded design, or from fewer but faster threads? Could GPU compute be used to accelerate those processes?
I'm not sure. Skeletal looks to be a highly parallel problem, speech too. I believe the current implementations on the 360 are single threaded though.
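The "highly parallel" observation can be sketched: each joint fit is independent of the others, so the work maps naturally onto threads, cores, or GPU lanes. The joint list and `fit_joint` below are hypothetical stand-ins, not the actual 360 implementation:

```python
from concurrent.futures import ThreadPoolExecutor

# Reduced joint set, as described in the quoted post about the 360 skeleton.
JOINTS = ["head", "neck", "l_shoulder", "r_shoulder", "l_elbow", "r_elbow",
          "l_hand", "r_hand", "spine", "l_hip", "r_hip", "l_knee", "r_knee",
          "l_foot", "r_foot"]

def fit_joint(joint: str, depth_frame: list) -> tuple:
    # Hypothetical per-joint fit: each joint is scored independently of the
    # others, which is what makes the problem "highly parallel".
    return joint, sum(depth_frame) % (len(joint) + 1)

frame = list(range(64))  # stand-in for a depth frame

# Each joint can go to its own worker (a thread here; a core or GPU lane
# on real hardware), because no joint's result depends on another's.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(lambda j: fit_joint(j, frame), JOINTS))

assert len(results) == 15
```

A GPU-compute version would dispatch the same independent per-joint work across SIMD lanes instead of threads.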
 
Llano's architecture has had R&D behind it since K8 or K7.
It reaped the benefits of generations of work put into the same basic pipeline.
Its problems are that its pipeline worked best for the challenges of 130nm and 90nm, and it has been an increasing struggle to make it scale at lower process nodes.

Even if on a redo AMD didn't make the same choices as were made for Bulldozer, it wouldn't make another Llano either.


Some of those things are very fundamental to the design. Start changing them and you might not get a BD, but it wouldn't be a K10.


Bulldozer was probably a serious misstep.
However, Llano was a reworked K10.
Just because the new thing they tried wasn't all that great doesn't mean a warmed-over old pipeline could give them what they needed.
OK, that makes things clearer.
I see (at last) what you meant :)
So if I follow your line of thinking, what I'm advocating for would be more of a "split" BD module.

As a BD "core" has fewer resources than the previous architecture, the front end may be over-sized for feeding a single core. The nice thing is that SMT should be possible, as the front end already sees two threads.
But then there are the shared SIMD units... they would have to support four threads, making things even more complex...

That's indeed worse than what I understood: AMD can't go back; they are stuck with the BD pipeline, which is longer and overall narrower than their prior design and Intel's offerings.

The cache hierarchy is f***d up as it is; the only sane option to make it performant is to somehow "break" the module and basically share only the FP/SIMD units.

As the core is narrower, they would need to downsize the front end/decoder. The nice thing is that SMT should be doable.

Then AMD has to work on a brand new cache hierarchy: a faster and more associative L1.
They also need to rework the L2 and L3; if both Intel and IBM came to the same conclusions, there must be a good reason behind them.
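The associativity point can be illustrated with a toy LRU cache model: when two arrays happen to alias to the same sets, a direct-mapped cache thrashes while a 2-way cache of the same total capacity barely misses. Addresses here are abstract cache-line numbers, and the sizes are arbitrary:

```python
from collections import deque

def misses(addresses, num_sets, ways):
    # Simple LRU set-associative cache model: each set holds `ways` tags,
    # and appending to a full deque evicts the least recently used entry.
    sets = [deque(maxlen=ways) for _ in range(num_sets)]
    count = 0
    for addr in addresses:
        s, tag = addr % num_sets, addr // num_sets
        if tag in sets[s]:
            sets[s].remove(tag)   # hit: refresh as most recently used
        else:
            count += 1            # miss
        sets[s].append(tag)
    return count

# Alternate between two arrays whose lines land on the same sets
# (a classic conflict-miss pattern).
pattern = []
for _ in range(100):
    for i in range(8):
        pattern += [i, i + 64]

direct = misses(pattern, num_sets=64, ways=1)  # direct-mapped: thrashes
assoc  = misses(pattern, num_sets=32, ways=2)  # same 64-line capacity, 2-way

assert direct > assoc
```

With the same capacity, the direct-mapped cache misses on every single access here, while the 2-way cache only takes compulsory misses; that is the kind of conflict behaviour higher associativity buys off.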

Then there is the FP/SIMD unit, which would need to be reworked if SMT were to make it into the design.

Well that's 5 years of hard work... :LOL:/:???:
It's indeed really bad; Intel provides +100% the performance in a lot of cases, and Haswell is coming.
The amount of work that needs to be put into BD to make it fly seems enormous.
 
To add to the discussion: over a month ago a poster said this in a different PS4 thread.

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265

sweetvar26 said:
Alright guys, got some information for you guys.

The PS4 AMD project is called Thebe. Previously it was based on Themisto- and Callisto-based chips, but that has been revised. They moved on to a chip called Jaguar, replacing Steamroller. They moved to TSMC's 28nm process from the 32nm that Steamroller is on.

The whole thing is basically an APU solution; they made the changes considering the 10-year product life cycle and to keep initial product costs at a minimum.

As of now it is called the Thebe-Jaguar project, or TH-J.

I put the important part in bold.
 
I don't know what to think if that's true. Jaguar is meant for low-power APUs, so overall performance could be lower if it's a 1-for-1 exchange.

They could also do something like 4 Steamrollers exchanged for 8 Jaguar cores. In a dedicated console, that could be a better solution.
 
Bg, can you please point out those "weird" things you've been hearing on Durango's part? It seems like we know more about the PS4 than the next Xbox.
 
The Orochi die is 319mm² because of an inefficient automatic layout, plus 8MB of slow cache, which is a lot less useful in non-server workloads.
An individual Bulldozer/Piledriver module is only around 31mm2.
I'm pretty sure people criticizing the die size of Orochi because of inefficient automatic layout don't know what they're talking about.
 
From what I know so far, Durango is... "weird". I need more details than what I have to confirm whether it's officially "weird" or just a misunderstanding due to vague info.

That's interesting. By 'weird', do you mean that Durango has the more exotic architecture of the two this time around?

Or does 'weird' mean that the general tone of rumours that Durango is more powerful (and iherre's claim that Durango = high-range PC, PS4 = mid-range, and Wii U = low-range) isn't true, and there's no clear-cut winner at the moment between the two consoles?
 
I'm pretty sure people criticizing the die size of Orochi because of inefficient automatic layout don't know what they're talking about.

Isn't it from all the HT links, memory interfaces, and the L3 crossbars? Lots of that goes away in a consumer-targeted SoC.

If Bulldozer had hit its targets, people wouldn't be rubbishing it in quite the same way. There's a lot of "good stuff" in Bulldozer, and Trinity is seeing good improvements without addressing any of the architectural problems people see. On top of this, it seems like AMD actually has a road map of how they are going to develop this idea, unlike with K8.

I think Steamroller/HSA must have some pretty compelling stuff in it to get the consoles to move away from Power.
 
To add to the discussion over a month ago a poster said this in a different PS4 thread.

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265

I put the important part in bold.
Steamroller is also targeting 28nm, so a hop to that node wouldn't make a difference when choosing between Jaguar and Steamroller.

One other wrinkle is that only Steamroller APUs mention HSA application support, which would be a necessity if Sony intends to use the GPU for general FP applications. That's not a hard restriction if a semi-custom design is in progress.
 
To steer the discussion away from GPUs and CPUs...

Does anybody have an idea how the Wii U streams the video output to the Wii U GamePad?
From leaked information, the Wii U GamePad currently seems to have Wi-Fi and NFC, as well as Bluetooth. Nothing else seems to be specified.

NFC wouldn't work in any capacity for these purposes, so that's out of the window. Bluetooth would seem to have bandwidth problems, so it seems to me that the most direct way of doing the video feed would be through Wi-Fi.
Control inputs take up very little bandwidth, so I'll leave them at that.
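The bandwidth argument can be checked back-of-the-envelope. Assuming the commonly reported 854x480@60 GamePad feed, a ~100:1 video codec, ~3 Mbps for Bluetooth 2.0+EDR, and ~100 Mbps of realistic 802.11n throughput (all of these are rough assumptions, not confirmed specs):

```python
# Back-of-the-envelope bandwidth check for streaming the GamePad video feed.
width, height, bpp, fps = 854, 480, 24, 60

raw_bps = width * height * bpp * fps   # ~590 Mbps uncompressed
compressed_bps = raw_bps // 100        # ~100:1, H.264-class codec assumption

bluetooth_bps = 3_000_000              # Bluetooth 2.0 + EDR, ~3 Mbps
wifi_n_bps = 100_000_000               # realistic 802.11n throughput

assert compressed_bps > bluetooth_bps  # Bluetooth can't carry the feed
assert compressed_bps < wifi_n_bps     # Wi-Fi carries it comfortably
```

Even heavily compressed, the feed lands around 6 Mbps, which rules out Bluetooth but leaves plenty of 802.11n headroom.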

I haven't seen too much on this subject in the forums, and it would be interesting to see the feasibility of SmartGlass on Xbox, or PSV + PS3, in the future.

Currently PSV + PS3 is the closest non-Nintendo setup that has shown similar capabilities, but there seem to be reports of input lag in Remote Play. I'm not currently at home so I couldn't test this out for myself, but I'll do it later in the day.

Given the relatively low latencies of Wi-Fi, it couldn't (or shouldn't) be the connection that we should be worrying about.
CPU-intensive encoding issues, perhaps? And how does the Wii U get around that?
 
Jaguar would be an insanely bad CPU for a console; it is meant to be an upgrade to the low-power Bobcat core, not a high-power console CPU.

Bobcat does not clock very high and has very slow floating-point capability, and I see no reason for Jaguar to be any different.
 
To add to the discussion over a month ago a poster said this in a different PS4 thread.

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265



I put the important part in bold.

Hmm, yeah, I remember that guy but had forgotten about him. Anyway, I combed his posts from that time, and he also claims (I guess) that the next Xbox is using Jaguar cores, here: http://www.neogaf.com/forum/showpost.php?p=37365190&postcount=1269

Which goes against the "powerful CPU" next-Xbox rumors.

Then I think he's saying there are two Jaguar cores in the PS4, clocked at 1.6GHz? That seems really low-end if so.

It's hard for me to even be sure which system he's talking about there.

For what it's worth, he also said the Xbox project was "higher priority" at AMD, and some things like that; he seemed to hint at the Xbox being more powerful. But the fact is he doesn't really understand this stuff, and he's relaying his roommate's work.

Here are a few of his other posts on it, basically the ones with anything juicy:

http://www.neogaf.com/forum/showpost.php?p=37352658&postcount=1240

http://www.neogaf.com/forum/showpost.php?p=37354821&postcount=1249

Disregarding the Xbox stuff, I guess the main thrust is that it's possible the PS4 now uses Jaguar rather than Steamroller cores.

Edit: I do remember being skeptical of sweetvar26 at the time, because I remember reading an article about the great secrecy AMD kept between the Wii and 360 projects (which weren't even really competing) back in the day: separate campuses and no communication allowed between them, strictly enforced. So it struck me as implausible that the same guy (his roommate) would work on both the PS4 and 720 (which are directly competing).
 
Jaguar is apparently advanced, and it is also being used in the next Xbox project, called Kryptos.

So it looks like there are 2 Jaguar CPCs (core plus cache) with 4P/2MB, clocked at 1.6GHz (~1.25V).

I'm thinking about the big.LITTLE architecture. Maybe the Jaguar cores are OS-reserved/dedicated to security. Bobcat is pretty tiny, and Jaguar cores will be even tinier -> 2 cores + cache at 28nm should be around 15mm^2.
 
I'm thinking about the big.LITTLE architecture. Maybe the Jaguar cores are OS-reserved/dedicated to security. Bobcat is pretty tiny, and Jaguar cores will be even tinier -> 2 cores + cache at 28nm should be around 15mm^2.

You are reading it wrong. 4P refers to 4 cores, so this would have two units of 4 cores, each connected to its own 2MB pool of cache, for a total of 8 cores and 4MB of cache.

... which would be pretty pathetic. One Bobcat core at 1.6GHz is roughly 3-4 times faster than one Xenon thread at 3.2GHz in integer work, so unless Jaguar is a real upset, that's realistically less than 5 times the power of Xenon, across 8 threads.
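Taking the poster's 3-4x per-thread figure at face value, the "less than 5 times Xenon" arithmetic works out like this:

```python
# Rough sanity check of the "less than 5x Xenon" estimate above.
# The 3-4x per-thread figure is the poster's own estimate, taken as given.
xenon_threads = 3 * 2            # Xenon: 3 cores, 2 hardware threads each
jaguar_cores = 8

per_core_vs_xenon_thread = 3.5   # midpoint of the quoted 3-4x range
total_ratio = jaguar_cores * per_core_vs_xenon_thread / xenon_threads

assert total_ratio < 5           # aggregate integer throughput vs Xenon
```

At the midpoint assumption the aggregate comes out near 4.7x, consistent with the "less than 5 times" framing.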

It would be pretty low power and low cost -- I wonder which I would prefer if PS4 and XBnext had roughly the same silicon budget, where Sony spends it on a beefier CPU and MS goes for a better GPU.

Crossplatform games would almost certainly look better on the system with the better GPU, because it's harder to scale visuals up or down to match the CPU.
 
Another thing I hear vague implications of (it came up in one of sweetvar26's posts too) is that the 720 GPU is more "custom".

I wonder if that's just EDRAM or what.
 
*There would be 8 Xenon2 cores
*Those cores would be reworked Xenon cores:
-An OoO integer pipeline + other improvements and a bit more resources (ALU, AGU, LS units)

This is just wrong, and I have to correct it. If the core is designed by IBM (which it would be), gets OoOE, and was designed in the past 5 years or so, it would almost certainly be built around register renaming and a PRF (physical register file). And "built around" really is the correct wording: when you build a PRF CPU, the first thing to go down in the layout is the register file, and everything else is built around it. OoOE is not a checklist feature; you cannot just add it to a CPU and call it a day. So the hypothetical OoOE Xenon2 would not in any way, shape, or form be a derivative of Xenon; it would be a whole new CPU.
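To illustrate what "built around register renaming and a PRF" means, here is a toy rename stage in Python. The sizes and instruction encoding are arbitrary, nothing like a real IBM design:

```python
# Minimal sketch of PRF-style register renaming: architectural registers are
# mapped onto a larger physical register file, so repeated writes to the same
# architectural register (WAW/WAR hazards) get independent physical homes.
NUM_ARCH, NUM_PHYS = 4, 8

rename_table = list(range(NUM_ARCH))          # arch reg -> phys reg
free_list = list(range(NUM_ARCH, NUM_PHYS))   # unallocated physical regs

def rename(instr):
    # instr: (dest_arch, *src_arch). Sources read the current mapping;
    # the destination is given a fresh physical register.
    dest, *srcs = instr
    phys_srcs = [rename_table[s] for s in srcs]
    phys_dest = free_list.pop(0)
    rename_table[dest] = phys_dest
    return phys_dest, phys_srcs

# r0 = r1 + r2 ; r0 = r3 + r3
# After renaming, the second write no longer waits on the first.
d1, s1 = rename((0, 1, 2))
d2, s2 = rename((0, 3, 3))
assert d1 != d2   # the WAW hazard on r0 is gone
```

Because every structure downstream (scheduler, bypass network, retirement) keys off those physical registers, you cannot bolt this onto an in-order core like Xenon after the fact; the layout grows outward from the register file.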

Which would make sense, given that the cores in Xenon are another evolutionary dead end; AFAIK, no work (other than shrinks) has been done on their line since 2006 or so. IBM or AMD, it's going to be something new.
 
Does anybody have an idea how the Wiiu streams the video output to the Wiiu gamepad?
Currently from leaked information the Wii u gamepad seems to have Wifi and NFC, as well as bluetooth. Nothing else seems to be specified.
I imagine those are the standards implemented for user connection to the pad. There could be an unspecified tech used for the video streaming, which wouldn't need to be divulged any more than other internals. That said, Wi-Fi is good enough for content streaming. Maybe the Wii U includes a Wi-Fi N hub in the box to ensure everyone gets a decent connection and isn't fighting other network users over their old Wi-Fi G network?
 
http://forum.beyond3d.com/showthread.php?t=62007

 
5GHz Wi-Fi would be perfect, because it's still in an unlicensed spectrum band, a nicely wide one with very few users. It can be used for two things mainly: Wi-Fi access inside a room, or point-to-point exterior connections with line of sight. The signal is at a high enough frequency that it doesn't go through obstacles well, and is somewhat directional.

The Wii U would have two Wi-Fi controllers: one at 5GHz for the tablet(s), and one at 2.4GHz to connect to the home network and the internet.
 
5GHz Wi-Fi would be perfect, because it's still in an unlicensed spectrum band, a nicely wide one with very few users. It can be used for two things mainly: Wi-Fi access inside a room, or point-to-point exterior connections with line of sight. The signal is at a high enough frequency that it doesn't go through obstacles well, and is somewhat directional.

The Wii U would have two Wi-Fi controllers: one at 5GHz for the tablet(s), and one at 2.4GHz to connect to the home network and the internet.

I'm not sure that would work, considering that different countries have different spectrum requirements and licensing that everybody selling products in a country needs to be aware of.

Just because a spectrum band is open for use in the US doesn't mean it's open for use in Japan, Europe, etc., and vice versa.
Therefore I don't think they'll use anything outside the "globally accepted" spectrum.

For example, the 802.11a band is prohibited from use in Taiwan because it's actually military spectrum (at least that's what I recall).

Now, Nintendo could just change the hardware for each country, but that would be a logistical disaster.
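The "globally accepted spectrum" argument amounts to intersecting per-region allowed bands: one worldwide SKU can only use what is legal everywhere it ships. The table below is illustrative only, not real regulatory data; the Taiwan entry follows the post's 802.11a recollection:

```python
# Illustrative (NOT authoritative) per-region allowed-band table.
allowed_bands = {
    "US":     {"2.4GHz", "5GHz"},
    "EU":     {"2.4GHz", "5GHz"},
    "Japan":  {"2.4GHz", "5GHz"},
    "Taiwan": {"2.4GHz"},          # per the post, 802.11a spectrum restricted
}

# A single worldwide SKU can only rely on bands legal in every region.
shippable = set.intersection(*allowed_bands.values())

assert shippable == {"2.4GHz"}
```

One restrictive region is enough to knock a band out of the intersection, which is why console makers default to the 2.4GHz band everyone accepts.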
 