Predict: The Next Generation Console Tech

Welcome!

If one were to assume a linear progression, one would assume a 250mm²/250mm² split between CPU and GPU.

If one believes as I do that future consoles will rely on the GPGPU for CPU assistance, then part of the CPU die budget would go toward the GPU.

How aggressive this shift is is anyone's guess, but I'd assume a smart route would be to take an existing off-the-shelf GPU and mate it with a custom CPU solution scaled from existing Cell/Xenon tech.

This would enable the relatively large GPU to be binned from a desktop chip that perhaps couldn't cut it at 925MHz, or perhaps doesn't have all 2048 ALUs working.

Thanks for the welcome. Your reasoning from a typical progression makes sense. However, what you're predicting, while sensible, doesn't mean anything...

In fact, I don't think you can go by the past at all anymore. It's a different console market these days, and I firmly believe that Microsoft will go for a small, quiet, less powerful console, and that Sony, whenever it releases its own, will follow suit.

That's the main reason I was asking why that's the ballpark: there's a good chance the end result is smaller and much less powerful. I think you, and many others here, are way overshooting next-gen specs, almost in the same way that the Wii U thread is severely underestimating them.
 
GPUs in consoles will pretty much always be running at 99% load, as they're pushed harder due to being closed-box systems.
I don't know how tools measure GPU load, but by my definition of load this statement is false. Rendering a game environment is a series of bursts. Sometimes you'll be shader-limited like FurMark; other times the limit is fill rate, texturing, memory bandwidth, primitive rate, etc.

It's rare to have the ALUs pegged while simultaneously achieving peak memory bandwidth, fill rate, etc.
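The burstiness argument can be illustrated with a toy model (my own sketch; the stage names and numbers are invented, not from any real profiler): whichever stage is slowest sets the duration of a rendering pass, so the other stages sit partly idle even while the GPU reports itself "busy".

```python
def pass_utilization(alu, fill, bandwidth):
    """Toy model of one rendering pass: each argument is that stage's
    workload as a fraction of its per-pass capacity. The slowest stage
    sets the pass time, so the other stages are proportionally underused."""
    limit = max(alu, fill, bandwidth)
    return {
        "ALU": alu / limit,
        "fill": fill / limit,
        "bandwidth": bandwidth / limit,
    }

# Invented numbers: a shadow pass is fill-limited, a lighting pass ALU-limited.
shadow = pass_utilization(alu=0.2, fill=0.9, bandwidth=0.5)
lighting = pass_utilization(alu=0.95, fill=0.3, bandwidth=0.7)
# In the shadow pass the ALUs run at only ~22% utilization; in the lighting
# pass the fill units run at only ~32% -- no single stage is pegged all frame.
```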
 
It seems Tim Sweeney gave a talk at DICE recently, and it's a shame it's not online, as it seemed to have a lot for tech wonks.

There are only a couple of writeups: http://venturebeat.com/2012/02/09/epics-tim-sweeney-predicts-the-next-20-years-in-gaming-technology/


Epic sure is singing a different tune than the power doom-and-gloom on message boards.

Sweeney, a shy but brilliant programmer who helped create the backbone graphics engine for Epic’s blockbusters, predicted in his talk that games will look better and better, and upcoming advances will be so good, that console makers will be able to introduce next-generation machines that will blow us away with their visual quality.

“How good is good enough, and how close are we to that now?” Sweeney asked.

Our eyes are equivalent to the quality of a 30-megapixel camera. You don’t perceive improvements in frame rate beyond 72 frames per second. Many games already run at a rate of 60 frames per second. The best resolution for humans, then, is 8000 x 4000 pixels, or several times better than today’s best displays. That is about 20 billion to 40 billion triangles per second in terms of graphics rendering.

“The limit really is within sight,” Sweeney said.

Doom’s rendering required 10 megaflops; Unreal in 1998 required 1 gigaflop; and now the Samaritan demo requires 2.5 teraflops, an order of magnitude higher.

Doing a human face with movie-level accuracy is still very hard to do, Sweeney said. He believes that chip makers could double computing power every two years (as predicted with Moore’s Law), and that could continue for quite some time (perhaps even a couple of centuries) with the vertical stacking of chip circuitry and other advances in leading-edge physics.

That’s a bold claim, but predictions about how fast technology will advance in the long term always tend to fall short of reality.

“Within our lifetimes, we will be able to push out enough computational power to simulate reality,” Sweeney said.
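As a back-of-the-envelope check (my own arithmetic, not from the talk): at the one-doubling-every-two-years cadence Sweeney cites, a 2000x increase in compute takes roughly 22 years, which lines up with the article's "next 20 years" framing.

```python
import math

# "2000x more power" is the multiple quoted in this thread; the two-year
# doubling cadence is the Moore's-law assumption from Sweeney's talk.
target_multiple = 2000
years_per_doubling = 2

doublings = math.log2(target_multiple)   # ~10.97 doublings needed
years = doublings * years_per_doubling   # ~21.9 years at that cadence
```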

Also a couple of tweets, maybe from the IAA Awards tonight? I'm not sure.

epicactual Mike Capps
Got to intro Tim Sweeney of @EpicGames for his speech and I got all choked up. His talk is amazing! We need 2000x more power!

Also, the GAF IAA thread is atwitter: apparently Mark Rein said something pretty explicit about next gen. I'm not watching, so I'm not sure what he said, but it seemed UE4-related.

http://www.neogaf.com/forum/showpost.php?p=35004258&postcount=101

I'm with the GAF poster who said Epic's not going to let MS/Sony release something that won't run UE4/Samaritan.
 
It seems Tim Sweeney had a talk at DICE recently and it's a shame it's not online as it seemed to have a lot for tech wonks.

...

I'm with the GAF poster who said Epic's not going to let MS/Sony release something that won't run UE4/Samaritan.

Most of that isn't realistic...
 
That next gen will run UE4/Samaritan (or at least a decent approximation)? Yes, I'm pretty sure that's going to happen. A virtual certainty.

If you pay close attention to what comes out of Epic, a pretty clear picture emerges: powerful, traditional next-gen consoles that will run UE4. They've dropped plenty of hints. Heck, I posted a link a few weeks ago where Mark Rein was part of a CES roundtable; when the question of "what's next gen?" came up and another panelist started trying to cloud it with Facebook and cloud and whatever, Mark flatly said next gen will be a box with "more compute power." And consider that Epic probably has at least the next Xbox in a closet somewhere, so he may have couched it as speculation, but in reality it's grounded in fact.

The other stuff seemed more like "where is this all going eventually" than directly next gen.

Also, it probably deserves its own thread, but Epic announced UE4 will be revealed in 2012: http://www.g4tv.com/thefeed/blog/po...evealed-in-2012-according-to-epics-mark-rein/

"People are going to be shocked later this year when they see Unreal Engine 4 and how much more profound an effect it will have," Rein said.

I wish it were GDC, but I really don't get the vibe from "later this year" that it would be.
 
Why would Epic reveal anything at TGS?

I have no idea where they WOULD unveil it, of course. GDC seems like the best bet to me; E3 and Gamescom are also out there. I suppose they could just do their own event at any time, or just trickle it out onto the internet.

Edit: I see the partial joke went right over my head...
 
It's a different console market these days...

How so?

Did the HD consoles recently start to slump in sales vs Wii?

Did the low priced Wii bundles dominate sales this past Christmas?

Are people unwilling to spend $399 on a console bundle? (see: Kinect + HDD bundle price)

A look at recent data suggests that not only is the high-end console market viable, it's thriving.

The only thing that's different is that it isn't dominated by Sony anymore. There are no seismic shifts in sales away from consoles or console games, and within the console space there isn't a dramatic shift toward cheaper hardware either. So with that said, I'm not sure where all these people are getting the impression that next gen will be a gimp-fest of Wii clones with gimmicks.

I think it's safe to say that next gen will be a lot more xb360/ps3-type than Wii-type.

And the reason for that is simple: both MS and Sony already have their "Wii consoles" on the shelf right now. They did the mid-cycle refresh, and as prices come down, both the ps3 and xb360 can comfortably fill the casual/mom-console role while staying profitable and shrinking in size.

This leaves room in the pricing bracket for high-end consoles (which are necessary to differentiate against the growing onslaught of tablets and smartphones).

A change from this would have to be accompanied by an aggressive Apple-style two-year refresh model with forward-compatible games.

Otherwise, these fresh new consoles will have iPads breathing down their necks before their replacements are ready.
 
More UE4 next gen hints (basically some blog wrote up Rein's statement):

http://loudmouthedgamers.com/blog/2...running-on-systems-i-cant-talk-about-by-name/

The DICE summit is currently taking place as awards are being handed out and moments ago Mark Rein took the stage. While talking about Unreal Engine 4 he said that the graphics engine was running on “systems I can’t talk about by name.” This would obviously point to the next generation of consoles being pretty far into development. Let the rumors continue to come out of the woodworks!

UPDATE: The quote was technically “Including systems we can’t name yet.”

I think that should seriously crimp some of the uber-low-spec talk. In fact, given what Samaritan ran on, it points toward a pretty darn high spec.
 
Well if you want to defend Wii U, you could put stock in the rumors that Nintendo was considering changing the name, so he couldn't mention the new name :p

Not that I think that's the case, just pointing out the wiggle room.
 
Well if you want to defend Wii U, you could put stock in the rumors that Nintendo was considering changing the name, so he couldn't mention the new name :p

Not that I think that's the case, just pointing out the wiggle room.

Interesting thing is it seems he has working silicon now.

Implications of that are both positive and negative I'd say...

Positive: it seems likely the console will ship soon (~12 months).
Negative: the likelihood that this working silicon is the SoC rumored recently just went up...


I really hope they don't try to Wii this thing ... I hate gaming on the PC, but I'll switch if there's no reason to buy a 720.
 
Interesting thing is it seems he has working silicon now.

Implications of that are both positive and negative I'd say...

Positive: it seems likely the console will ship soon (~12 months).
Negative: the likelihood that this working silicon is the SoC rumored recently just went up...


I really hope they don't try to Wii this thing ... I hate gaming on the PC, but I'll switch if there's no reason to buy a 720.

UE4 likely means pretty decent specs at the least. Samaritan was running on three GTX 580s (of course, I'm not implying exactly that level for next gen, but yeah).

And I'm not sure why you rushed to conclude the next Xbox would be Wii-like via your opposite-world logic (hey, it presumably runs the most graphically demanding demo ever, therefore it must be extremely low-powered), when he didn't name anything and in fact said "consoles" plural; by that logic the PS4 is also launching within 12 months (hint: neither is).

Personally, I'm pretty skeptical of the SoC rumors; they only have one source. I'm pretty skeptical of all the Xbox rumors to date (they all contradict each other to some degree anyway, so obviously most are false). We don't know what it is, but I'm pretty sure it's a strong traditional console that isn't launching for a while. And the longer we go without any real info, with Epic hinting at UE4, the more confident I feel about that.

Epic is the key, and they've been very clear; some just aren't listening.
 
Yes, but it's not linearly dependent on voltage. Otherwise, if 100MHz needed 0.1V, then 1000MHz would need 1V and 3000MHz would need 3V. Ever seen that?

Each node has a minimum operating voltage.

However, fundamentally you're charging and discharging parasitic gate capacitances every time you switch a CMOS gate.

That means moving a set amount of charge within a limited time interval. Current (amps) is charge (coulombs) per unit time (seconds). If you want to switch twice as fast, you need twice the current, and to first order you get twice the current by applying twice the voltage.

To sum up: if you want to double the frequency, you double the drive voltage (and hence the current). Doubling the voltage quadruples the energy spent on each transistor switch (E ∝ C·V²). And since we're switching twice as often, we get eight times the power consumption.
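This is the standard CMOS dynamic-power model, P = α·C·V²·f. A minimal sketch (the capacitance and frequency values are placeholders, chosen only to make the ratio visible):

```python
def dynamic_power(cap_farads, volts, hertz, activity=1.0):
    # Classic CMOS dynamic-power model: P = alpha * C * V^2 * f.
    return activity * cap_farads * volts ** 2 * hertz

# Placeholder baseline: 1 nF effective switched capacitance at 1 V, 1 GHz.
base = dynamic_power(1e-9, volts=1.0, hertz=1e9)
# Double the frequency, which (per the argument above) needs double the voltage:
fast = dynamic_power(1e-9, volts=2.0, hertz=2e9)
ratio = fast / base  # 2^2 (voltage) * 2 (frequency) = 8x the power
```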

Cheers
 
Well if you want to defend Wii U, you could put stock in the rumors that Nintendo was considering changing the name, so he couldn't mention the new name :p

Not that I think that's the case, just pointing out the wiggle room.

Or you could point out that devs are under NDA. Also:

OK, I just watched that video, and he clearly does NOT say that UE4 is running on "systems we can't mention yet."

He's talking about the Unreal Engine as a whole and doesn't mention UE4 until AFTER he says that line.

For all current systems, the Wii U, and systems "he can't mention," he's talking about the Unreal Engine in general terms. He gives absolutely no specifics about consoles running UE4.


Guys, he specifically said Wii U right before he said "AND systems we can't name yet." So yeah, 720 and PS4.

;)
 
You mean the UE3 games don't look the same on the Wii (or iOS) as they do on the PS3 or Xbox 360? /gasp

Of course future consoles will run UE4 (assuming Epic gets it out the door); that doesn't mean it will run on every (or any) platform with all of the assets of the Samaritan demo.
 
Our eyes are equivalent to the quality of a 30-megapixel camera. You don’t perceive improvements in frame rate beyond 72 frames per second. Many games already run at a rate of 60 frames per second. The best resolution for humans, then, is 8000 x 4000 pixels, or several times better than today’s best displays. That is about 20 billion to 40 billion triangles per second in terms of graphics rendering.

I wonder how he got that figure. My understanding is that there are between 2.4 million and 3 million ganglion cells providing input to the brain, i.e. at most around 3 million nerve-fiber inputs. If I'm not mistaken, that input takes the form of action potentials, virtually all-or-nothing at any one moment, or ~3 million bits.


I would say that 4K×2K should basically get us covered; that would be 8 megapixels. At six feet of distance, I don't think one can discern much more than that on a 50-70 inch TV.

With regard to the 20 billion to 40 billion triangles: I've heard V1 (the biggest area of the visual cortex, where the closest-to-raw image is kept) estimated at 0.5-1B cells, so it seems possible that a number an order of magnitude smaller might also suffice (2-4B triangles, though many scenes could easily do with a fraction of even this lower figure without perceptible change).
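For what it's worth, the pixel arithmetic in this exchange checks out (my own quick check; the figures themselves come from the Sweeney quote and the post above):

```python
sweeney_px = 8000 * 4000   # Sweeney's "best resolution for humans": 32 MP
poster_px = 4000 * 2000    # the 4Kx2K figure above: exactly 8 MP
uhd_px = 3840 * 2160       # a consumer 4K panel, for comparison: ~8.3 MP

# 4Kx2K is one quarter of Sweeney's stated limit -- consistent with his
# "several times better than today's best displays" remark.
shortfall = sweeney_px // poster_px  # 4
```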

Most of that isn't realistic...

It depends on the complexity of the scene. Fix the image quality of that, and it gets pretty close to real.

We've all seen comparison pics between reality and realtime that are uncannily close. A simple wall or floor can be made photorealistic; slightly more complex objects can too. It's moving, complex objects with complex surface properties that present the most difficulty.
 
An engine is more than graphics. I don't see why they wouldn't have it on WiiU.

True.

And the likelihood of the WiiU not having a DX11-class GPU is pretty much slim to none, so the feature set of graphics effects they're looking to introduce in UE4 will be compatible with the WiiU, even if that means scaling them back to fit the box's 50W budget :p.
 