Future console discussion and thoughts

Wouldn't that be true for every game? Don't get me wrong, I do think I could very well like episodic content (although I don't really like the way it is being done by Valve). Why not sell each of the Halo levels or Oblivion quests as episodic content?
It's not needed for existing, conventional games and franchises as they already sell. You can invest a large amount into 'Halo 3' or 'Racing Game 16' because they sell and you can be confident of making your money back. It's a huge risk to invest heavily in 'Abstract Weird and Unconventional Game' as you don't know if the public will lap it up or not. In the case of Okami, consider the investment that went into that. If it sells a million copies, great! If it only sells 40,000 copies, that's a waste of money. If instead they had released a download starter title in that style, offering gamers the stylized graphics and gameplay at an entry-level price without such a large investment, they could have judged from the initial 40,000 purchases 'this game won't do very well. Let's leave it at that.' Or if it sells a few hundred thousand, go ahead with part 2 and grow the series.

If I were a developer, that's how I'd do it. I'd have my big main title that's a bread and butter game with bread and butter potential - Sci Fi FPS number 32 - lacking in originality because the mainstream keep buying the same old generic ideas. And on the side I'd create my artistic, visionary, out-there masterpiece on the cheap, testing the water with how much effort to invest in it. When I find everyone's too busy playing 'Sci Fi FPS number 32' to give a damn about 'Artistic Visionary Out-There Masterpiece Part 1' I haven't lost much and know not to waste more developing it. Then I can get to begrudgingly creating Sci Fi FPS number 33, hating the masses for forcing me to develop the same old same old over and over, and create 'Unique, Brain Expanding, Surreal Game Experience part 1' on the side to see if that one has any chance of success or not!
 
Don't get me wrong, I understand the idea and I even think it is the right solution for some/many(?) games, but not for all of them and not for everyone, for several reasons.

For one, from my own experience with many of those 'Abstract Weird and Unconventional Games': if I had only played 1/10 of what I normally would have, it would be much harder to ever pick the game up again (actually this would even have happened with other games, like Halo). And if consumers (or even I, sometimes) had to pay for and spend time trying something they weren't sure they liked enough, and then had to remember what happened and when the next episode is out... it would be much worse for all of those games (at least some of them I wouldn't have picked up again) that I ended up liking and wanting more content for.

Some kinds of games just wouldn't work, for example any game that only evolves after a good amount of time (e.g. Fable); IMO many sports games wouldn't work either. You just need to remember that most (all?) gamers, when they get a new game, spend a lot of hours playing it at once, and games are made to make those hours a good time, which would be much harder with episodic content.

Many games are good because they evolve through levels and quests, i.e. in level 4 you need to use everything you learned in levels 1, 2, and 3.

I do think episodic games will be great, but they will also be quite limiting in some ways, just as I think full games are limiting, so 'Abstract Weird and Unconventional Game' can't be limited to the episodic format.

Personally, I think the difference between movies and TV series shows what I mean. I wouldn't like to see a three-hour House MD or CSI movie if it actually resembled the series, but I also wouldn't like to see LotR or The Matrix as an episodic TV series if it pretended to be anything close to the movies, although sometimes some things can be good both as movies and as series, like Mission: Impossible. If either format were lost we would miss great things: if we only had innovation in movies we would have lost great series, and vice versa.

Although there is one thing that could happen in the future if consoles do have mass storage as a standard (and if costs don't skyrocket again): the creation of something like what is already done with novels/comics, i.e. episodic content that gives further developments to the main game and/or more background info/gameplay (HL and the expansion pack Opposing Force (I have yet to try the other one) already did something like this).

Anyway, to keep the post on topic: things like this make me think that future consoles will be designed with much more than raw specs/features in mind; they will try to adapt the console to be ready for a new way of gaming that is already appearing in things like the Wii Remote, WiiConnect24, Half-Life episodes, EyeToy, Buzz!, and downloadable content. I don't think that in spirit it will be that different from those things, but in practice it will be really different, and there will be a new industry based on them.
 
....If I were a developer, that's how I'd do it. ...

Agreed - Although with the new consoles using online as an integral part of their offering I think any type of revolutionary game will get good exposure through the avenue of free downloadable demos.

A similar model worked pretty well for Doom.

I'd like to see devs using Xbox Live and Sony's online game section to explore these experimental game ideas and apply the more popular ones to full-fledged games with a proper budget. As it is, it seems most are using this component for a quick cash-in instead. :cry:
 
I said this in my first post (but nobody noticed): it will be interesting to see the advantage of NUMA vs UMA for next gen.
Both systems are likely to have the same quantity of RAM, say 2GB of very high speed memory. But interestingly, in a UMA design you care less about bandwidth because you have eDRAM for bandwidth-intensive operations, so a UMA system could come with more, cheaper memory. MS will stick with UMA.
Do you think, given the Cell design, Sony will manage to put together a UMA design, or will they stay with RAM/VRAM without hyper-fast memory for framebuffer operations?

I don't believe in diminishing returns; current games only come close to the cuteness of some old 2D graphics. There's still room for visible improvement.
Especially for textures, where UMA vs NUMA will be an interesting design choice.

I think MS will stay with SMP, maybe with some DSPs. But with regard to heat/power consumption, MS will have a smaller CPU than the PS4, so MS will spend more transistors on the GPU; it will be interesting to see how the GPGPU field improves in terms of performance and tools.

You can call me biased, but I think MS is likely to make better design choices for the 720 than Sony because they are in a more open situation. Sony has to leverage the huge R&D cost of Cell development, so Sony is stuck with its Cell+NUMA design (it could turn out to be the right choice, but we still don't know).
 
Question for you Laa-yosh - What do you think would be more valuable in a ps4/x720: 15x more ram or 15x more processing power? Obviously in either case the ram and processing power will both increase but which do you feel would be more valuable?

I'd say 7.5x more RAM and 7.5x more processing power! ;)
 
It's fun to add numbers to other numbers, but consoles are VERY unpredictable. To all those who are speculating about future console specs:

1. Could anybody have predicted that MS would go from a mobile celeron in the Xbox, to a triple-core IBM PPC that has no OOO execution?

2. Could anybody have predicted a crazy DX9+ unified shader GPU with embedded ram?

3. Could anybody have predicted an nVidia GPU and a single Cell in the PS3?

4. Could anybody have predicted that Wii would be an overclocked Gamecube with more ram and flash memory?

Pretty much EVERYTHING about the current generation of consoles took us completely by surprise. Who knows what the future holds?

There is one awesome new technology I would love to see make its way to future consoles, though. It's this thing called "anisotropic filtering". With AF, not only do the graphics look beautiful directly around your character, but they no longer look like mud 5 feet in front of your character. Amazing!

As far as digital distribution: I can see MS, Sony, or both trying very hard to push all-digital distribution next generation. It would mean no more used game sales, and therefore bigger profits for MS, Sony, and game developers. The only problem I can see is that Sony is trying so hard to push Blu-Ray. If Blu-Ray really takes off, then Sony might stick with BD-ROM after all.
 
I think Sony picked the Cell because it will scale very well making backwards compatibility much easier in the future, so I expect the "PS4" to include a future version of the Cell. I also think it's possible for the GPU to disappear in favor of "ray-tracing" and other techniques.

http://arstechnica.com/news.ars/post/20060805-7430.html

http://graphics.cs.uni-sb.de/~benthin/cellrt06.pdf

http://www.youtube.com/watch?v=lr-R4bUZIQw

Everything I read points to the fact that the PS3 might be able to do this admirably, and a dual Cell system could do this very well. I can only imagine a future revision of the Cell will accelerate this much further.
 
I also think it's possible for the GPU to disappear in favor of "ray-tracing" and other techniques.
Except raytracing has negligible benefits in a gaming situation while requiring far higher resources. RT doesn't currently scale well to shader-rich, excitingly lit graphics. The examples of realtime raytracing look like primitive rasterized graphics from 10 years ago. The terrain demo isn't 'proper' raytracing in that it uses the most simplistic single-iteration model, and gives no idea of how it'll scale.

To avoid derailing the thread further, I suggest you search for raytracing in this forum to see the limits and impracticalities, and where it will have uses.
 
I was thinking about Nintendo's current state the other day, and something occurred to me. I just realized that out of Nintendo's library of IPs, there are only 2 that would really "need" the ultra-high-tech power of the PS3/360 and beyond: Zelda and Metroid. Mario relies on art, and as shown by Galaxy, creative use of it can still give acceptable results despite weak hardware. Star Fox, same thing. Donkey Kong, yep, that too. Fire Emblem is an SRPG, and you don't need to bother with fancy graphics in such a game. And so on and so on.

Now, put yourself in Nintendo's place. Zelda is no longer the 7.6 million seller that it was back in OoT's days. Metroid Prime 1 and 2 sold like 2 million combined. Sure, good numbers, but would significantly higher budgets be worth it if only that many copies are going to be sold? On top of that, you'd be making powerful hardware only for the benefit of 2 games. Why should Nintendo bother at all anymore? MAYBE if Nintendo gets some good marketshare this time around, and lots of SERIOUS third-party support (no last-minute PS2 ports, plz). But until then, I don't see Nintendo giving a damn about the graphics race anymore, and I feel they'll just let the gap widen even further. My predictions for Wii2:

CPU: 1.0-1.2 Ghz
GPU: 333 Mhz
RAM: 128 MB

In other words, pretty much what Wii should have been before Nintendo decided to skimp even further.
 
I was thinking about Nintendo's current state the other day, and something occurred to me. I just realized that out of Nintendo's library of IPs, there are only 2 that would really "need" the ultra-high-tech power of the PS3/360 and beyond: Zelda and Metroid. Mario relies on art, and as shown by Galaxy, creative use of it can still give acceptable results despite weak hardware. Star Fox, same thing. Donkey Kong, yep, that too. Fire Emblem is an SRPG, and you don't need to bother with fancy graphics in such a game. And so on and so on. Now, put yourself in Nintendo's place. Zelda is no longer the 7.6 million seller that it was back in OoT's days. Metroid Prime 1 and 2 sold like 2 million combined. Sure, good numbers, but would significantly higher budgets be worth it if only that many copies are going to be sold? On top of that, you'd be making powerful hardware only for the benefit of 2 games. Why should Nintendo bother at all anymore? MAYBE if Nintendo gets some good marketshare this time around, and lots of SERIOUS third-party support (no last-minute PS2 ports, plz). But until then, I don't see Nintendo giving a damn about the graphics race anymore, and I feel they'll just let the gap widen even further. My predictions for Wii2:

CPU: 1.0-1.2 Ghz
GPU: 333 Mhz
RAM: 128 MB

In other words, pretty much what Wii should have been before Nintendo decided to skimp even further.

They should make hardware that at least outputs graphics within the window of diminishing returns relative to the PS4 and Xbox 720, which would make it something like an "Xbox 540". They also need to support 1080p, likely through scaling. Since they said they want to keep the console small and low power, I'd throw in a low-power/low-heat PPU coprocessor on the same die or package as the CPU core, which itself should stay small and simple (say 1GHz) for BC. That should be cool enough for a small case, and the cost would be low too. The GPU should keep the same eDRAM on package as the Wii, but configured like the XGPU's. If you think about it, the 24MB of eDRAM in Hollywood could be designed to work like the 10MB of eDRAM in the XGPU with a few modifications, and still be used for BC with GC/Wii games. With 24MB of eDRAM you wouldn't need to tile, either.
 
The terrain demo isn't 'proper' raytracing in that it's the most simplistic single-iteration model, and gives no idea of how it'll scale.

The terrain demo is from 2004, I believe; quite a long time to work out the coding. The USB camera demo is running on a PS3, though, and using only 6 SPEs. That's a bit more interesting.

That said, most people on these forums think a Cell/GPU combo will still be in the next generation, using raytracing in the foreground and traditional graphics in the background. I just think that if Sony could drop the cost of a GPU, they would. If their next-generation Cell is what they hope, about 5 times the speed of the current one, then over the life of the console two (or more) Cells would be less expensive than a Cell and a GPU.
 
They should make hardware that at least outputs graphics within the window of diminishing returns relative to the PS4 and Xbox 720, which would make it something like an "Xbox 540". They also need to support 1080p, likely through scaling. Since they said they want to keep the console small and low power, I'd throw in a low-power/low-heat PPU coprocessor on the same die or package as the CPU core, which itself should stay small and simple (say 1GHz) for BC. That should be cool enough for a small case, and the cost would be low too. The GPU should keep the same eDRAM on package as the Wii, but configured like the XGPU's. If you think about it, the 24MB of eDRAM in Hollywood could be designed to work like the 10MB of eDRAM in the XGPU with a few modifications, and still be used for BC with GC games. With 24MB of eDRAM you wouldn't need to tile, either.

Speaking of RAM, that makes me wonder something. Around how much RAM, in the future, would cost the same as what Nintendo is paying for their 88 MB right now?
 
My 2 cents…

All I am certain about is an ample amount of eDRAM or ZRAM for a 1080p frame buffer without tiling. 32MB would be right, no?

Sony's CPU should be Cell based. No question. The roadmap is (un)defined. NVIDIA can't be ditched if they wish to keep backward compatibility, so NVIDIA tech stays. Ditto Blu-Ray.

2GB RAM minimum. Sony will likely still use Rambus memory.

Production will start at 45nm. The more ambitious will push for 32nm, but just like 65nm this generation, I don't see it happening from the onset.

Everyone should have waggle. What I want is Minority Report, so an HD IP camera as standard, or a similar device/apparatus to facilitate the capture of movement.
 
All I am certain about is an ample amount of eDRAM or ZRAM for a 1080p frame buffer without tiling. 32MB would be right, no?

16MB is enough for a 1080p framebuffer (with the front buffer put in normal VRAM, as on the X360).

But what we want is at least 4x AA to go with that framebuffer, and that would mean ~64MB.
 
16MB is enough for a 1080p framebuffer (with the front buffer put in normal VRAM, as on the X360).

But what we want is at least 4x AA to go with that framebuffer, and that would mean ~64MB.

OK, thanks for the correction; that's without tiling, right? In my opinion 64MB of eDRAM seems way too much for any console in ~2011-2012. The ZRAM theory could make it doable, depending on how widespread SOI use is by then, but it still seems like fantasy technology.
 
OK, thanks for the correction; that's without tiling, right? In my opinion 64MB of eDRAM seems way too much for any console in ~2011-2012. The ZRAM theory could make it doable, depending on how widespread SOI use is by then, but it still seems like fantasy technology.

There is MRAM or 1T-SRAM as well :p NEC does seem to be pretty quick with eDRAM development, with volume shipments of 8Mbit and larger units available in H2 2007. 45nm engineering samples are slated for late 2008, with mass production in early 2009. Depending on whether NEC is aiming for 32nm or the 32nm half-node, ~mid-2010 seems reasonable for their next process node.

I mention this because 32nm would be ~8x as dense as 90nm (the current eDRAM process). The 32nm half-node would bring that to 10x-12x as dense, all before the 2011-2012 timeframe you mention above.

Xenos has 10MB of eDRAM, and at 8x the density that is 80MB. While MS and Sony will be mindful of the troubles waiting at the 22nm and 16nm nodes and beyond, I also think we won't see the Xbox 3 until 2010 at the earliest, and most likely 2011, with Sony aiming for 2011/2012.
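The density arithmetic above can be sketched in a couple of lines. This is a back-of-envelope model only: it assumes ideal area scaling (density growing with the square of the feature-size ratio), which real processes don't fully achieve.

```python
def density_scale(old_nm, new_nm):
    # Ideal area scaling: density grows with the square of the feature-size ratio.
    return (old_nm / new_nm) ** 2

scale = density_scale(90, 32)         # roughly the "8x" figure quoted above
xenos_edram_mb = 10                   # Xenos has 10MB of eDRAM on a 90nm process
print(round(scale, 1))                # 7.9
print(round(xenos_edram_mb * scale))  # 79, i.e. the "80MB" estimate
```

The half-node figures (10x-12x) come out higher because a half-node shrinks features a bit further than 32nm.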

And just to speculate: with the next node drop in the late 2011/early 2012 timeframe, a console launching in late 2011 could use a fairly large eDRAM die, knowing that in 6-12 months they would be reducing the die size by ~50%, reducing power usage, reducing heat production, and increasing yields.

Whether eDRAM is used will largely depend on how GDDR4 and beyond, XDR, and other memory technologies develop over the next 5 years, as well as the changing needs of graphics technology. Who knows, maybe we will eventually see TBDR rear its head again, or maybe designs will use eDRAM for more than just the framebuffer (Xenos) and be more like the PS2. One of the problems with raytracing and global illumination is not just the math, but memory access as well. We have a while before the next-gen picture becomes clearer, but anything is possible when we start hearing of 64-core CPUs, or 32-core CPUs with tens of MBs of cache, and the introduction of new technologies like ZRAM that could possibly allow for significant jumps in memory density.
 
OK, thanks for the correction; that's without tiling, right? In my opinion 64MB of eDRAM seems way too much for any console in ~2011-2012.

Yup, that's completely without tiling =)

Here is the way to calculate it (credits to Dave):

Back-Buffer = Pixels * FSAA Depth * (Pixel Colour Depth + Z Buffer Depth)
Front-Buffer = Pixels * (Pixel Colour Depth + Z Buffer Depth)
Total = Back-Buffer + Front-Buffer

Now, in the case of Xenos, the front buffer only exists in UMA memory, so only the back-buffer size is of concern for the eDRAM space.
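Plugging numbers into the back-buffer formula above, a quick sketch (assuming 32-bit colour and a 32-bit Z buffer, i.e. 4 bytes each) reproduces the 16MB and ~64MB figures quoted earlier in the thread:

```python
def back_buffer_bytes(width, height, fsaa=1, color_bytes=4, z_bytes=4):
    # Back-buffer = pixels * FSAA depth * (colour depth + Z depth)
    return width * height * fsaa * (color_bytes + z_bytes)

MB = 1024 * 1024
print(back_buffer_bytes(1920, 1080) / MB)          # ~15.8MB: fits in 16MB of eDRAM
print(back_buffer_bytes(1920, 1080, fsaa=4) / MB)  # ~63.3MB: hence ~64MB for 4x AA
```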
 