Predict: The Next Generation Console Tech

Status
Not open for further replies.
Not all problems/code/algorithms can be programmed to be solved in a parallel fashion. Some are still serial. In the latter case, IPC and megahertz win.

Graphics and 3D rendering of course benefit enormously from a parallel approach. Even then, if there is any serial code within a program, you are only as fast as you can execute those instructions.
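That serial-code limit is just Amdahl's law. A quick sketch (with a made-up workload split, purely for illustration) of how even a small serial fraction caps the speedup you can get from piling on cores:

```python
def amdahl_speedup(serial_fraction, n_cores):
    """Amdahl's law: overall speedup on n_cores when a fixed
    fraction of the work is inherently serial."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Even with only 10% serial code, 32 cores give well under 10x,
# and no number of cores gets past 10x:
print(round(amdahl_speedup(0.10, 32), 2))    # ~7.8
print(round(amdahl_speedup(0.10, 1024), 2))  # ~9.91
```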
 
Well, there's certainly a case for AI and physics. But aside from graphics, what were you thinking of having lots of CPU cores do?

AI and game logic, I want my RPGs to actually do some simulation of the NPCs in a city for example.
Embarrassingly parallel, and soaks more power than a time-travelling DeLorean!
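A sketch of why per-NPC simulation parallelises so cleanly (hypothetical NPC and world structures, not from any real engine): each NPC's tick reads only its own state plus a read-only world snapshot, so the updates can be farmed out to a pool with no locking at all.

```python
from concurrent.futures import ThreadPoolExecutor

def tick_npc(npc, world_snapshot):
    """Advance one NPC by one step; touches only its own state
    plus a read-only view of the world."""
    npc = dict(npc)  # work on a copy, never mutate shared state
    npc["hunger"] += 1
    if npc["hunger"] > 5 and "tavern" in world_snapshot["places"]:
        npc["location"] = "tavern"
        npc["hunger"] = 0
    return npc

world = {"places": ["tavern", "market", "theatre"]}
npcs = [{"name": f"npc{i}", "hunger": i, "location": "market"}
        for i in range(8)]

# Embarrassingly parallel: no NPC reads another's in-flight state.
with ThreadPoolExecutor() as pool:
    npcs = list(pool.map(lambda n: tick_npc(n, world), npcs))
```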
 
What would be the purpose of such a CPU? How big would the thing be? ARMv8 is the A15, right?

I am talking about a "Cortex-A15" with a longer pipeline to achieve higher clock speeds. I know that this type of processor is an insane idea, but I'd rather have the best performance possible than settle for a "this is enough" scenario. We know the future is "completely programmable GPUs", at least that is what some developers want for the next generation, but that doesn't mean everything has to be offloaded to the GPU. I'd prefer to dedicate all the GPU power exclusively to graphics algorithms and have a very powerful CPU that can handle things like physics, instead of putting that kind of work on the GPU.

The truth is that I want to see evolutionary algorithms applied to videogames, and I believe you need a very powerful, multithreaded CPU for that.
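For what it's worth, the core of an evolutionary algorithm is tiny; the CPU cost is in evaluating thousands of candidates per generation, which is also where the multithreading would go. A toy sketch (made-up fitness function, nothing game-specific):

```python
import random

def evolve(fitness, genome_len=8, pop_size=30, generations=80):
    """Minimal elitist evolutionary loop: keep the best half,
    refill with Gaussian-mutated copies of the survivors."""
    pop = [[random.random() for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [
            [g + random.gauss(0, 0.1) for g in random.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return max(pop, key=fitness)

# Toy fitness: push every gene toward 1.0.
best = evolve(lambda genome: -sum((g - 1.0) ** 2 for g in genome))
```

In a game you would replace the lambda with something expensive, like running a short simulation per candidate, which is exactly the embarrassingly parallel evaluation step that wants lots of cores.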
 
AI and game logic, I want my RPGs to actually do some simulation of the NPCs in a city for example.
Embarrassingly parallel, and soaks more power than a time-travelling DeLorean!
Yeah, it's gotta be said, so far I've been mostly disappointed with the next-gen-ness of games. There aren't many really using processing power for game-changing experiences. Things like PixelJunk Shooter's fluid physics are something novel, or that upcoming god game with the world-creation engine. Otherwise the overall feel is much like a moderate progression. No doubt in part due to CPU power being given over to generating fancy graphics - you can't get much of a sense of a living world in a PR bullshot... (There's another feature that never manifested: the Sony demo of the world growing and coming alive, all this procedural content that was supposed to make everything fresh and exciting.)

Perhaps it's telling that expectations for the CPU are about how it helps the GPU, as the housekeeping is a minimal task that'll run on a moderate-spec CPU. Who's pushing the boundaries of non-graphics CPU processing to show what could and should be achieved next gen, hence a requirement for a meaty CPU in the next boxes?
 
I am talking about a "Cortex-A15" with a longer pipeline to achieve higher clock speeds. I know that this type of processor is an insane idea, but I'd rather have the best performance possible than settle for a "this is enough" scenario. We know the future is "completely programmable GPUs", at least that is what some developers want for the next generation, but that doesn't mean everything has to be offloaded to the GPU. I'd prefer to dedicate all the GPU power exclusively to graphics algorithms and have a very powerful CPU that can handle things like physics, instead of putting that kind of work on the GPU.

The truth is that I want to see evolutionary algorithms applied to videogames, and I believe you need a very powerful, multithreaded CPU for that.
I'm secretly hoping Sweeney is right and "Larrabee-like" designs are the way to go. Actually my secret wish (a geek wish, not a reasonable wish) is that the chip would even pass on texture sampling, as Sweeney describes in his paper. Texturing latencies put a hell of a lot of pressure on the design. I'm confident (based on thin air... true... lol) that one could fit a bunch of vector units on one chip and deliver crunching power in TFLOPS matching today's processors, while possibly being more power efficient. I actually believe the thing could turn out pretty efficient at crunching numbers; the real question is... graphics performance... :LOL: I state it as a joke, but also because we have no serious reference. Intel gave figures (based on simulation, not the real deal) on how GeoW would perform on Larrabee, but that's one thing; a proper reference would be rendering GeoW using the clever tricks Larrabee allows, not running GeoW on top of a huge software layer on Larrabee. End of the rant - it won't happen.
 
AI and game logic, I want my RPGs to actually do some simulation of the NPCs in a city for example.
Embarrassingly parallel, and soaks more power than a time-travelling DeLorean!

You and Peter Molyneux would get along so well. ;)
 
AlStrong said:
Well, there's certainly a case for AI and physics. But aside from graphics, what were you thinking of having lots of CPU cores do?

Everything that needs a lot of power. I personally would design everything in my engine for streaming data. I'm a boring desktop and services programmer, but studying high-performance computing has changed the way I look at programming quite a bit even there. Now it's all about the data. ;)

In the gaming world I like games such as Super Rub-a-Dub and Super Stardust as inspiration for what streaming can bring to enhance even the most basic of game premises.
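In that data-oriented spirit, here's a sketch of the structure-of-arrays layout streaming designs tend to use (hypothetical particle fields, not from any actual engine): the update walks contiguous buffers instead of chasing per-object pointers, which is what keeps data streaming through the cache.

```python
from array import array

# Structure-of-arrays: each field is its own contiguous,
# cache-friendly buffer rather than a list of objects.
N = 1000
pos = array("f", [0.0] * N)
vel = array("f", [1.0] * N)

def integrate(pos, vel, dt):
    """Stream linearly over the buffers - no per-object indirection."""
    for i in range(len(pos)):
        pos[i] += vel[i] * dt

integrate(pos, vel, 1.0 / 60.0)
```

In Python this is only illustrative; the same layout in C/C++ (or NumPy) is where the cache and SIMD wins actually show up.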
 
So far Apple is closest to my Grand Vision of the Future:
http://www.eurogamer.net/videos/ipad-2s-real-racing-2-hd-trailer

TV out using the iPad as a motion controller.


Cool. Only that the iPad is connected to a Mac via HDMI, which is then connected to an HDTV. Not the best solution IMO.

Apple should allow TV makers to stream application windows over wireless, as in full monitor mirroring over wifi. It would be nice to adapt Wi-Fi Direct so that the TV won't have to be connected to a network. You could go to a hotel room and play your iPad 2 games on the HDTV.

As far as I know, Apple is planning to allow TV makers to stream video over wifi. I hope it will allow other applications too.
 
I think you're confused. ;) The video has picture-in-picture to show the iPad in the user's hand, and what's displayed on the TV. There is no intermediary device. A cable has to be used to connect the iPad to the TV. What's a bit rubbish is Apple's reluctance to use open standards, so instead of using a mini HDMI port, you have to use their overpriced Apple convertor.

What they have got right, once you've spent an extra forty bucks, is full media out onto your TV. Sony's NGP really needs this, or they'll lose considerable value that's easy and cheap to support. I wonder how long it'll be before someone creates an EyeToy-like camera-based game for iPad 2 plugged into a TV, pretty much creating a first draft of my Grand Vision?
 
AI and game logic, I want my RPGs to actually do some simulation of the NPCs in a city for example.
Embarrassingly parallel, and soaks more power than a time-travelling DeLorean!

You already had that in Ultima VII: The Black Gate. You could even go to the theatre and see NPCs performing. And it ran perfectly on my 25 MHz 386...

Something is missing as GPUs evolve. As I've said before, I can't understand how a 2001 game like Blade of Darkness had perfect dynamic shadows cast from every torch, running on a TNT2, while in today's games all you get are blocky, shimmering shadows or missing ones...
 
AI and game logic, I want my RPGs to actually do some simulation of the NPCs in a city for example.
Embarrassingly parallel, and soaks more power than a time-travelling DeLorean!

A lot of this is already possible with current hardware, I think, as long as clever streaming algorithms are used. What is probably holding this type of game design back more is that a) it is harder to direct awesome moments (emergent gameplay can produce those, but how do you make sure a reviewer doesn't miss them?) and, more importantly, b) many devs can't handle the QA requirements for it. You need to take QA into account when designing your game from day one, or you'll run into trouble, as you can't exhaustively test every possible scenario.
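One common answer to "you can't exhaustively test every scenario" is invariant checking over randomised runs instead of scripted test cases (my sketch with a toy trading economy, not anything from the post): fuzz many seeds and assert properties that must hold in every run.

```python
import random

def simulate_step(state, rng):
    """Toy economy step: one NPC hands a coin to another (hypothetical)."""
    a, b = rng.sample(range(len(state)), 2)
    if state[a] > 0:
        state[a] -= 1
        state[b] += 1
    return state

# Can't enumerate every scenario, so fuzz many random runs and
# check invariants that must hold in all of them.
for seed in range(100):
    rng = random.Random(seed)
    coins = [5] * 10
    for _ in range(1000):
        simulate_step(coins, rng)
    assert sum(coins) == 50            # money is conserved
    assert all(c >= 0 for c in coins)  # nobody goes negative
```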
 