Predict: The Next Generation Console Tech

Unless Intel does the fabbing, I'm not convinced it's going to happen next gen.
TSMC and GF seem to be 1-2 nodes behind Intel for the shift from planar, which could add another 4-6 years.

AFAIK the other fabs will switch to tri-gate after 20nm, so that will be too late for the launch of the next-gen consoles. :(
 
It does show Intel has a clear roadmap out to 10nm in 2015, though. Which I guess is encouraging.



Maybe next-gen consoles can begin on 32nm (or TSMC's equivalent; I think we need to be conservative here, given TSMC's lag, the nature of console production, etc.), and they will still have 3 shrinks left. Or if on 20nm, perhaps 2 shrinks, which should still offer good cost savings.
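To put a rough number on why those remaining shrinks matter for cost, here's an illustrative sketch. The 300 mm² starting die size and the "each full node halves area" rule are assumptions for illustration only, not figures for any real console chip:

```python
# Illustrative only: project relative die area (and, very roughly, per-chip
# cost) over successive full-node shrinks, assuming each shrink halves area.
def die_area_after_shrinks(initial_area_mm2, shrinks, scale_per_shrink=0.5):
    """Projected die area after a number of full-node shrinks."""
    return initial_area_mm2 * (scale_per_shrink ** shrinks)

# A hypothetical 300 mm^2 launch chip with 3 shrinks left in its lifetime:
for n in range(4):
    print(f"after {n} shrinks: {die_area_after_shrinks(300, n):.1f} mm^2")
```

Starting on a mature node with three shrinks ahead would shrink that hypothetical die to under an eighth of its launch area, which is where the cost savings over a console's lifetime come from.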
 
I hope next-gen consoles use processors similar to the new line of 22nm processors Intel, and other companies, are creating for the future. The Ivy Bridge: 3D chips (physically 3D, not flat transistors anymore) using a technology called Tri-Gate.

http://www.bbc.co.uk/news/technology-13283882
Same as the other, unlikely; the amazing thing is that Intel's advantage in terms of process is growing over time.
It ships this year :oops: Possibly before TSMC/GF's 32/28 nm node.
By 2014 Intel will be able to deliver 14nm lithography, and I'm not sure GF and TSMC will be in a position to ship at 22nm (which could be problematic depending on what N puts in its Wii 2).

Actually, keeping the timeline in mind, and if Intel is willing to offer a rebate on its products, Larrabee 3/4 may be a pretty valid contender for either Sony or MS.

I'm really scared for AMD; Bulldozer won't be up to the task by a healthy margin. It sounds crazy, but Intel may be in a situation to clock its CPUs in turbo mode above 5GHz :devilish: :oops:

EDIT: by the way, thanks Cyan for bringing this news to my attention. It's been, imho, the biggest news in regard to hardware I've read in a while. That's huge (not console related, but really huge).
 
Although there is no technical information, the "matching those consoles' abilities to render and output graphics" is pretty much self-explanatory to me. It means Nintendo isn't aiming to surpass the render capabilities of PS3 and X360 (which is a shame as it would be dirt-cheap nowadays). Nintendo is aiming to match the current-gen's render capabilities.
Why don't you read the full sentence? "matching those consoles' abilities to render and output graphics in high-definition" is what the article says. The sentence is perfectly clear. Your selective quoting doesn't even make sense - if you only read that part, it seems to state that Nintendo consoles so far didn't render and output graphics at all.
 
It's also flat-out wrong. Recently one UK town hit the world's top hundred fastest broadband connected cities, with an amazing connection speed of 6.2 Mbps. 6.2. Megabits. Akamai's State of the Internet reports (as of Q4 2010) show only South Korea getting above 10 Mbps on average.
Everyone else is below, around 6-8 Mbps for the faster nations.
I think you're mistakenly mixing everything up in something that shouldn't be mixed.

How do they make those statistics? Do they count only the houses with internet or do they count the houses without internet as having a 0Mbps connection? Even then, should those statistics matter to this discussion, as they make no filtering whatsoever regarding the gamer population?

The developed countries are also the countries with the largest aging populations. Why would most >65-year-old couples want internet in their house? Or why would the typical >45-year-old childless salaryman/woman couple want anything more than the slowest+cheapest possible connection?
And how many of those people are ever going to buy a gaming console to play online games?

Maybe the only way to have a clear(er) perception of the "internet connections that gamers use" would be to only count the houses with a 7th-gen console and/or a gaming-oriented PC. And even then you'd have holes in the statistics, like all the people who don't care for online gaming and still just use the web for some occasional browsing and e-mail, e.g. parents who bought the console for small children.

The only info I can gather from that 6Mbps UK town is that it has a relatively young population, perhaps with low unemployment rates.

And Speedtest shows, of those geeks into testing their fast broadband, the typical is all of 10 Mbps. OFCOM reckons the UK average is 5.2 Mbps, whereas Akamai pegs us at 3.8. Nothing anywhere points to superfast broadband being at all common. It'll be a reasonably long time before 10 Mbps is even commonplace (UK government is aiming for 2015 infrastructure rollout). It'll be an age before the ~20 Mbps practical BW of Wifi G becomes a serious internet bottleneck, by which point wifi G will be long dead I'm sure.
My Speedtest shows 51 Mbps, so what kind of geek am I? With a Wifi-G connection in the house, I am mostly limited by the wifi network whenever I'm not within 5 meters of the router.
It's not an out-of-the-ordinary internet connection. I just got the fiber-optics "~100 channels HDTV + phone + 50 Mbps internet + 3G pen with 100MB" plan for ~55€/month. Back when we made the deal, there were 4 people living in the house, so I don't think it's that much.
The same plan with more channels and 100 Mbps has been around for 2.5 years.

All my friends have >20 Mbps connections, because you can't really find anything lower than that in the city right now (other than people on 10-year-old contracts who didn't bother to upgrade, for free).

I'm not pulling these numbers out of thin air, you know? I can just send you the links from our ISPs and you can see for yourself the slowest speed you can buy, but it's really not on-topic, so if you want I can send them to you through PM.


The point is, Internet connections are now fast enough (not as in "everyone has it" but as in "you can have it if you want to") for using cloud storage in gaming for non-bandwidth demanding things like saved games, profiles, achievements, game replays and other stuff.
And that could make up for the Stream's rumoured 8GB of storage.
That was my only point.



Why don't you read the full sentence? "matching those consoles' abilities to render and output graphics in high-definition" is what the article says. The sentence is perfectly clear. Your selective quoting doesn't even make sense - if you only read that part, it seems to state that Nintendo consoles so far didn't render and output graphics at all.

English is not my native language, so I may be wrong in the interpretation, but here's how I read it:
The new 2012-scheduled Nintendo system will fall more in line with the 360 and PlayStation 3 by matching those consoles' abilities to:
1 - render -> meaning it has about the same performance as the other two;
and 2 - output graphics in high-definition -> meaning it'll probably have a HDMI output or component cable out-of-the-box (with the first making more sense)
 
How do they make those statistics? Do they count only the houses with internet or do they count the houses without internet as having a 0Mbps connection?
They are a massive data company, running servers which shoulder and serve a huge amount of internet traffic. This report is derived from their monitoring of their own real-world traffic across real-world internet servers, many thousands of times as many samples as something like Speedtest, and for all internet users irrespective of demographic.
Even then, should those statistics matter to this discussion, as they make no filtering whatsoever regarding the gamer population?
Because gamers don't have their own infrastructure. I don't get my own cable connection because I'm a gamer when the rest of this poky little village doesn't! So if the UK's average connection is 4 Mbps, that's what a console designer has to bear in mind when designing a console, unless they choose to serve only a subset of potential buyers who have access to faster BB.

My Speedtest shows 51Mbps...
All my friends have >20Mbps connections, because you can't really find any lower than that in the city right now
Your city is great. My friend 10 miles from me recently got upgraded to 30 Mbps fibre, although it sometimes drops below 20. Another friend, 5 miles away, is looking forward to an upgrade from his 1 Mbps connection (the fastest he can get on an "up to 8 Mbps" line) later this year or whenever. I'm looking at 6 Mbps on my "up to 8 Mbps", and checking the rollout plans for fibre, I see my locality isn't down to get it ever (so it should get it by 2015, but the rollout plans don't list that far ahead).

But if you want to go by your city's amazing BB speed and, ignoring the likes of Speedtest and the Akamai report, want to imagine a Europe where everyone has access to 30 Mbps if they want it, that's your prerogative. ;)

The point is, Internet connections are now fast enough (not as in "everyone has it" but as in "you can have it if you want to") for using cloud storage in gaming for non-bandwidth demanding things like saved games, profiles, achievements, game replays and other stuff.
And that could make up for the Stream's rumoured 8GB of storage.
That was my only point.
That I can agree with, but that's not what you said. You said 'streaming content' without talking about that being small-scale, non-time-critical content, like save games, which suggested something of a server-client model with live net content streaming. That's just not an option. The internet infrastructure isn't there for all but a few lucky cities, unless you are talking very small-scale content that'll fit in an 8 Mbps connection (hoping no one else in the house starts watching YouTube HD content while you're gaming). Hence next-gen console design is still going to have to support a fully offline model of distribution and storage, even if a fully online model is also catered for. This next generation should be the transition one, and after that live net content may be possible.
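The small-scale vs. live-content distinction above is easy to quantify. A quick sketch of transfer times over an 8 Mbps link; the payload sizes are illustrative assumptions, not figures for any real console:

```python
# Back-of-the-envelope: how long do typical payloads take over an 8 Mbps
# home connection? Payload sizes below are illustrative assumptions.
def transfer_seconds(size_mb, link_mbps):
    """Seconds to transfer size_mb megabytes over a link_mbps megabit/s link."""
    return size_mb * 8 / link_mbps

payloads = {
    "save game (5 MB)": 5,
    "game replay (50 MB)": 50,
    "full retail game (8000 MB)": 8000,
}
for name, mb in payloads.items():
    print(f"{name}: {transfer_seconds(mb, 8):.0f} s")
```

A save game moves in seconds, while a full game is over two hours of saturating the whole household's link, which is why cloud saves are plausible on today's averages but live content streaming is not.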

English is not my native language, so I may be wrong in the interpretation, but here's how I read it:
It's an ambiguous statement that can be read different ways, so either you or wsippel could be right. There's certainly no fault in your language skills for reading it how you did. ;)
 
English is not my native language, so I may be wrong in the interpretation, but here's how I read it:
The new 2012-scheduled Nintendo system will fall more in line with the 360 and PlayStation 3 by matching those consoles' abilities to:
1 - render -> meaning it has about the same performance as the other two;
and 2 - output graphics in high-definition -> meaning it'll probably have a HDMI output or component cable out-of-the-box (with the first making more sense)

It could be read that way, but it's a bit of a stretch IMO. The most obvious interpretation of that comment is "like 360 and PS3, Wii2 will have the ability to render/output HD graphics".
 
Once bandwidth is fast enough and it is actually used, you'll see pay for throughput / bandwidth caps come next.
 
Not the strategy adopted by the ISPs over here - well, hopefully.
You know, there are all the costs that come with metering, billing, multiple commercial offers, complicated customer relations; it's a bit like all the healthcare red tape in the US.

Keeping a one-size-fits-all offer and upgrading the networks as usual looks so much simpler ;). Plus we can anticipate huge reductions in the cost of fiber transmitters and switching equipment, and a few more tenfold increases in bandwidth.

The worst case would be pulling a Comcast: consistently sizing your network to keep it in congestion, then preparing to invest huge sums in ultra-expensive gear to perform content-based, bribe-based, censorship-based, etc. throttling.
 
I just got curious about the power consumption of different consoles, so here is the list so far (all of them gameplay numbers). All measured directly by me:
Nintendo:
N64: 8.7 W
GameCube: 22 W
Wii: 16.2 W

Sony:
PS1, first gen: 9 W
PSone: 7 W
PS2 fat: 29 W (34 W with HDD)
PS3 160 GB: 79 W

Sega Dreamcast: 19 W

So there is roughly a tenfold increase in power consumption within ten years in the cutting-edge technology (PlayStation).
I can't see less than 300 W for the PS4 / Xbox 3 - but possibly, for a visible improvement, they will have to go for 500 W.
 
Do you know whether the Kinect is made so that, if you plug it into the next Xbox, it gains an order of magnitude of precision while reducing lag at the same time, or must we expect a new Kinect with better cameras?
 
Your calculations are wrong. The PS2 is more than 10 years old and draws 30 W. The PS3 draws 80 W. That's less than a 3x increase in over 10 years.

I doubt we will see a 300 W console, and I'm pretty sure we will never see a 500 W console. Besides, you don't need a 500 W console to beat PS360 graphics. You could get some mid-range standard PC hardware and beat PS360 graphics, no sweat.
 
We already have >500W consoles.

But we happen to call them "Gaming PCs", just for the fun of it.
 
Your calculations are wrong. The PS2 is more than 10 years old and draws 30 W. The PS3 draws 80 W. That's less than a 3x increase in over 10 years.

I doubt we will see a 300 W console, and I'm pretty sure we will never see a 500 W console. Besides, you don't need a 500 W console to beat PS360 graphics. You could get some mid-range standard PC hardware and beat PS360 graphics, no sweat.

You're comparing the last-gen PS3 with the first-gen PS2.
The last revision of the PS3 was released in 2010, the last rev of the PS1 in 2000.
So the same hardware revision shows a tenfold increase in power consumption.
So if we say that the next-gen console has only a fivefold increase, then we're counting on a big leap in the technology.


My first-gen PS3 got the YLOD; if I fix it, I will check its power consumption too.
But as I remember, it has a 150 W power draw. So if you compare the first-rev PS2 (29 W, released in 2000) with the first-rev PS3 (150 W, released in 2006), the picture gets even more interesting.

A fivefold increase within six years.
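The disagreement here really comes down to which hardware revisions get compared. A quick sketch using only the wattages quoted in this thread (posters' own measurements and recollections, not official specs):

```python
# All wattages below are figures quoted by posters in this thread,
# not official specifications.
ps2_launch_w = 29    # first-rev PS2 fat, released 2000
ps3_launch_w = 150   # first-gen PS3 as recalled above (another poster
                     # says "well over 200")
ps3_slim_w = 79      # PS3 160 GB, 2010 revision

# Launch-to-launch comparison (2000 -> 2006), the basis of the fivefold claim:
print(f"launch PS2 -> launch PS3: {ps3_launch_w / ps2_launch_w:.1f}x")
# Launch PS2 vs. late-revision PS3, the comparison the "less than 3x" reply used:
print(f"launch PS2 -> slim PS3:   {ps3_slim_w / ps2_launch_w:.1f}x")
```

Both posters' numbers are internally consistent; the first ratio comes out around 5x and the second under 3x, so the argument is about revision choice, not arithmetic.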
 
Yeah, but this is more like clock speed: at some point it becomes impractical. In the meantime, look at the iPad 2. How much power does that use? And how many times can you fit that into the just-under-200 W of the launch PS3? EDIT: and now, for the heck of it, take off the amount of power the iPad 2's screen uses.

I am fairly convinced the current console gen overstretched itself in terms of heat generation and we will not see an increase beyond 200w, sooner the reverse.
 
You're comparing the last-gen PS3 with the first-gen PS2.
The last revision of the PS3 was released in 2010, the last rev of the PS1 in 2000.
So the same hardware revision shows a tenfold increase in power consumption.
So if we say that the next-gen console has only a fivefold increase, then we're counting on a big leap in the technology.


My first-gen PS3 got the YLOD; if I fix it, I will check its power consumption too.
But as I remember, it has a 150 W power draw. So if you compare the first-rev PS2 (29 W, released in 2000) with the first-rev PS3 (150 W, released in 2006), the picture gets even more interesting.

A fivefold increase within six years.
The first-revision PS3 was well over 200 W.
 
I just got curious about the power consumption of different consoles, so here is the list so far (all of them gameplay numbers). All measured directly by me:
Nintendo:
N64: 8.7 W
GameCube: 22 W
Wii: 16.2 W

Sony:
PS1, first gen: 9 W
PSone: 7 W
PS2 fat: 29 W (34 W with HDD)
PS3 160 GB: 79 W

Sega Dreamcast: 19 W

So there is roughly a tenfold increase in power consumption within ten years in the cutting-edge technology (PlayStation).
I can't see less than 300 W for the PS4 / Xbox 3 - but possibly, for a visible improvement, they will have to go for 500 W.

:LOL:

I can see you're out of your mind.
 