Predict: The Next Generation Console Tech

Status
Not open for further replies.
Doesn't make sense, because games need to support the baseline hardware, which limits how engines can be designed. It would perhaps work for occasional load-time reduction and higher-res textures/less pop-up, but it wouldn't allow a radical engine redesign requiring fast streaming. It also adds another path that doesn't just need to be coded but also tested, which can add quite significant cost to QA.

I'm not for the DDR5 add-on, but what if your baseline console is one that counts on the possibility of variable-size memory?
Something the system makes transparent to the programmer, but that can be used for caching recently used files.

Maybe that could be done by differentiating the Pro from the Elite:
one ships with a 250GB HDD, the other with a 400GB hybrid HDD that has 4GB of solid-state memory.
 
Variable-sized memory can't be made transparent. Read game developer presentations to see how far they go in squeezing every bit of performance out of the hardware; the two approaches can't work together.
 
I do tend to agree with this. People point to the Xbox as the example of "see, power means nothing," but I feel it's a misleading example.

Overall I'm with Erick: I believe the most powerful system (if there is a clear advantage, bigger than PS3/360 or PS1/N64, for example) will generally triumph in the hardcore space (and the Wii, though it still lost to the HD twins as a whole, is not in the hardcore space). But that will trend OT as always :p

People argue against that, yet for example every Wii U speculation thread is full of Nintendo fans absolutely pining for one more ounce of power in the system (check NeoGAF), etc. It seems actions speak louder than words. I believe that in the hardcore space graphics are the most important factor, and there isn't a close second. I once read Jaffe lamenting that he wished for one console of baseline, never-increasing power, so that, in his words, all game devs like himself don't have to spend 70% of their time (of course with Jaffe it was "70% of their f*cking! time" :p ) trying to squeeze better graphics into their game and can instead concentrate on gameplay. My take is: well, they spend 70% of their time on graphics because that's what people care about.
I completely agree. Not to mention RAM gives you access to exclusive games: for instance, Doom 3 on the Xbox, which couldn't be released on PS2 or GC.

Also more RAM means your console is more future proof.

Question is: what would you prefer, investing in a console that will last 5 years, or investing some more money in a console that can last 10-11 years? :idea:
 
And yet you people still haven't explained how it's good ROI; the folks buying the console 5-10 years down the line are not the hardcore. With price drops, they also are not the ones helping to mitigate the increased costs that you are all suggesting. It's also nowhere near the same situation as between 64MB vs 512MB or whatever the WiiU is eventually going to have.
 

So how was it good ROI to have 512 MB ram in the PS3 and 360?
Don't answer that. :)

Anyway, with my plan only the hardcore would need to pay for faster loading / less UE3-style texture pop-in. Sounds like a good deal to me.

In the case of GT5 you can in a way already do this, by investing in an SSD, which will cut loading times in half... at minimum :cool:
 
I'm not saying they should ship with 8 GB or whatever number; I just think it's a good idea to have the consoles a bit customizable.
Both Nintendo and Sega did it before (RAM carts). On PC it also works (2 vs 4 GB, for example), and to my knowledge it's not that difficult for developers to support. Even when used just as a texture cache it would greatly benefit loading or streaming, even if it's only DDR3.
See my concept as an optional RAM disk.


I agree fully on your point that it would be expensive over the full lifetime to include the extra (and faster) memory by default.
 
And yet you people still haven't explained how it's good ROI; the folks buying the console 5-10 years down the line are not the hardcore. With price drops, they also are not the ones helping to mitigate the increased costs that you are all suggesting. It's also nowhere near the same situation as between 64MB vs 512MB or whatever the WiiU is eventually going to have.

Ehh, it's a balancing act. I'm willing to bet the ROI on the 360 adding the 256 MB was off the charts. I'd bet that had they added an additional 256 MB on top of that, the ROI would have also been very good, and they'd be in a better position today (it's probably arguable, at least on the second 256, though).

Of course, there comes a point where it doesn't make sense, especially if you already significantly outgun the other guy technically. In my mind you only need to be noticeably better than the competition. Beyond that won't really help you, or more correctly, isn't necessary.

you'd need 32x2Gbit or 16x4Gbit chips

So this isn't doable? Guess I need to study up. I look at high end graphics cards, 2GB GDDR5 cards are out there, anyway.

Couldn't you do 32x2 or 16x4 now and worry about reducing the number of chips later as it becomes possible? Seems like you could, to me.
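The chip-count arithmetic behind the 32x2Gbit / 16x4Gbit figures works out like this (a throwaway sketch of the capacity math only, nothing from a real board layout):

```python
def chips_needed(total_gb, chip_gbit):
    """Number of DRAM chips needed to reach total_gb gigabytes
    using parts of chip_gbit gigabits each."""
    total_gbit = total_gb * 8  # gigabytes -> gigabits
    return total_gbit // chip_gbit

# 8 GB = 64 Gbit, so:
print(chips_needed(8, 2))  # 32 chips of 2 Gbit
print(chips_needed(8, 4))  # 16 chips of 4 Gbit
```

Of course the real constraints (bus width per chip, mainboard traces, cost) are exactly the "number of factors" mentioned later in the thread; this only shows the raw capacity math.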

I just maintain we have to get something that provides a noticeable jump over this gen, and that will possibly last 7-8 years minimum, and I don't see 2 GB fitting that bill. I'm guessing Epic is working closely with MS right now, and I'm sure they're saying "here's what we need to run Samaritan, to run UE4 the way we'd like, and to provide a next-generation leap", and I imagine it's beefy numbers, and I doubt MS is going to just disregard that. The very fact there is a UE4, rather than just "here, throw some more stuff on UE3", sort of proves it.

I dunno, I can't wait to find out just exactly how they're going to tackle all this stuff.
 
http://www.engadget.com/2011/09/07/sony-announces-a-4k-projector-for-the-home-at-cedia-prices-hmz/
less than a dollar per column of pixels, great :D

But about the memory: I know Sony allows users to change the hard drive.
Would it be a real big stretch... to have a DDR5 slot which allows the user to upgrade the cache?
It could be fairly simple for developers to implement, and it would allow users to cut loading times if they want to pay for that. The console only needs an extra connector/lane/memory controller, which is (probably) less expensive than the RAM itself would be.

Wouldn't work. It's not like a PC with a heavy OS that does all kinds of memory management for developers. In the console world, the devs do all of the memory management themselves. If you threw more memory into a console, it wouldn't do anything, because the games are not programmed to use it.

To make this work you'd have to design a console OS that somewhat takes memory management out of the hands of the devs, which causes all sorts of problems. In the console space the devs like to know exactly which memory address a piece of information will be stored at, and they have full control of the memory layout. Coding to the metal, as they say.

One other solution would be to write code for different configurations (512MB, 1GB, 1.5GB), where any variation outside those parameters would be useless. It wouldn't be worth it in the end. The devs would still have to make sure their games work and perform optimally on the lowest memory spec, so is it worth the time and effort to make different code paths for different memory configurations? You wouldn't be able to take advantage of that extra memory the same way you would if it were the default configuration, and it adds a whole new level of testing complexity.

On top of that, you'd get people putting bad DIMMs in their console, who'd end up blaming the console vendor or developer for their "POS buggy game/console!" Not worth it.
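The "full control of the memory layout" point can be made concrete with a toy sketch of a console-style fixed memory map. All the region names and sizes here are invented for illustration; the point is that every base address follows from a build-time layout that silently assumes one exact RAM size, so a surplus stick of memory violates the layout's own assumption rather than being picked up automatically.

```python
MIB = 1024 * 1024
TOTAL_RAM = 512 * MIB  # the one RAM size the whole layout is built against

# Hypothetical build-time region plan; real titles carve RAM up similarly,
# but these specific names and sizes are made up.
REGIONS = [
    ("os_reserved",   32 * MIB),
    ("code_and_data", 96 * MIB),
    ("textures",     256 * MIB),
    ("audio",         64 * MIB),
    ("scratch",       64 * MIB),
]

def build_memory_map(total=TOTAL_RAM):
    """Assign each region a fixed base address by packing them in order.
    The final assert is the 'coding to the metal' assumption: the plan
    only fits one exact amount of RAM."""
    base, layout = 0, {}
    for name, size in REGIONS:
        layout[name] = (base, size)  # (fixed base address, fixed size)
        base += size
    assert base == total, "layout assumes exactly this much RAM"
    return layout
```

Calling `build_memory_map(total=1024 * MIB)` simply fails the assert: nothing in the game knows what to do with the extra 512 MiB, which is the forum point in code form.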
 
Appreciate your patience, Scott, I personally really don't know why we have to repeat the same year-old circles again and again...
 
If Wuu is considered next-gen, then 2012 is pretty much a sure bet. The only question is when Sony and MS will follow.

I personally don't (consider Wii U next-gen).

I think that Epic getting developers excited with their new version doesn't mean what some people may think it means. The first developers will start working on next-gen platforms up to 3 years before they release. Epic is basically previewing their next-gen engine, which is a good sign, as I reckon both Sony and Microsoft will make having a stable Unreal Engine ready to go well before launch a big priority. But it also means the first next-gen projects probably won't start until 2012, which IMHO means/confirms we won't see next-gen before 2014.
 
Ehh, it's a balancing act. I'm willing to bet the ROI on the 360 adding the 256 MB was off the charts. I'd bet that had they added an additional 256 MB on top of that, the ROI would have also been very good, and they'd be in a better position today (it's probably arguable, at least on the second 256, though).

Well, my main beef here is the 8GB vs 4GB question, which for Microsoft would be akin to including 1GB in the 360. That would have made the console quite a bit larger and more complicated (with the tracing on the mainboard) for years to come, not to mention even more supply-constrained, as 700MHz GDDR3 was the cutting edge in 2005 and one of the major reasons the console was supply-limited in the first place.

I do realize that the PS3 represented a 16x increase, but again, there's the physical I/O involved here as well. The PS2 was smaller than the Xbox.



-------

One thing I would like to point out is how the Xbox 360 dev kits still used only 512MB of RAM for years, because they used identical hardware; there was just no room for double the RAM chips. It wasn't until the 1Gbit GDDR3 chips were in mass production that they finally updated them.

Will devs need that doubling for dev kits next gen? Maybe not, but they'll want to make use of the whole amount of memory without the headache of fitting dev tools into memory simultaneously with the game content.

So this isn't doable? Guess I need to study up. I look at high end graphics cards, 2GB GDDR5 cards are out there, anyway.

Couldn't you do 32x2 or 16x4 now and worry about reducing the number of chips later as it becomes possible? Seems like you could, to me.
Doable... it's just that there are a number of factors involved.
 
I suppose it does depend on whether the console is released end of 2012 or goes to end of 2013.

MS may make the decision to go to 8GB and if they do, kudos to them. I am more interested in the CPU and GPU right now :).
 
Also more RAM means your console is more future proof.

Question is..., what would you prefer, investing on a console that will last 5 years or investing some more money on a console that can last 10-11 years? :idea:

The PS1 had 3.5 MB, the Saturn had 4 MB; the Saturn lasted 5 years and the PS1 lasted 10. The PS2 had 40 MB, the Xbox had 64 (+1 scratchpad on the HDD); the Xbox lasted 5 years and the PS2 is 11 and still going!

The relationship between longevity and specification seems... complicated.
 
Wouldn't work. It's not like a PC with a heavy OS that does all kinds of memory management for developers. In the console world, the devs do all of the memory management themselves. If you threw more memory into a console, it wouldn't do anything because they games are not programmed to use it.

Not even in the realm of doing caching and prefetch from the optical disc? If there is a potential surplus of RAM in one model of a console, I'd think the console's libraries could be made to use that RAM pretty transparently when reading from the read-only optical disc or when reading and writing to the hard drive.

Of course, games couldn't be written to depend on that being there, so timings, frame schedules, etc., would have to be 'good enough' without that.

PSP did see some benefit from the extra RAM in certain models, though.
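The "pretty transparently" claim above is plausible at the library level: if every disc read goes through one API, a surplus-RAM model can answer repeat reads from memory while the baseline model behaves identically, and games depend only on the worst-case timings either way. A toy sketch of that idea (class and parameter names invented, not any real console SDK):

```python
class DiscReader:
    """Read-only disc access with an optional RAM cache.

    On the baseline model cache_budget_sectors is 0 and every read hits
    the disc; on a surplus-RAM model repeat reads come back from memory.
    Results are identical either way, so games can't depend on the cache.
    """

    def __init__(self, read_sector, cache_budget_sectors=0):
        self._read_sector = read_sector      # underlying (slow) disc read
        self._budget = cache_budget_sectors  # 0 on the baseline model
        self._cache = {}

    def read(self, sector):
        if sector in self._cache:
            return self._cache[sector]       # served from surplus RAM
        data = self._read_sector(sector)     # fall back to the disc
        if len(self._cache) < self._budget:
            self._cache[sector] = data       # only cache within budget
        return data
```

The design point is the one made in the thread: the cache can only make things faster, never different, so titles still have to be tuned as if it were absent.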
 
I consider an expandable RAM drive certainly a possibility, but pretty much irrelevant to the topic of the next-gen consoles for the reasons described. It won't be usable as RAM in the same way it is in a PC, so the games would still be designed around a core RAM configuration, and it's that configuration, and the processors running on it, that this thread is about. There are other threads about upgradeable consoles.
 
So we're talking about having an upgrade slot for people to install additional RAM that would be used as a cache space instead of caching to a hard drive, or as an additional layer of caching? Optical media -> HD -> cache RAM -> RAM. I'd say that's very unlikely to be a design consideration.

Edit: What Shifty said.
 
Prefetching from an optical drive above and beyond the few MB of buffer they usually have doesn't seem like a good idea to me.

Any cache is going to have to avoid moving the read head, for fear of making access times totally unpredictable, so all it would be doing is reading ahead when the game wasn't reading. I'm not sure you get very much additional value.

I guess the streaming layer in the OS could just cache every read sector and eject on some LRU basis, but you risk introducing seeks into reads that otherwise wouldn't have them.

The streaming system has to accommodate the worst case (nothing cached) anyway, so I'm not sure what it really buys you.
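The "cache every read sector, eject on some LRU basis" scheme is the standard least-recently-used policy; a minimal sketch using Python's OrderedDict (sector addressing by LBA is assumed, and the class name is invented):

```python
from collections import OrderedDict

class LRUSectorCache:
    """LRU cache of disc sectors, as the OS streaming layer might keep one.
    A miss returns None: the caller must hit the disc anyway, which is why
    the streaming system still has to handle the worst (all-miss) case."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._sectors = OrderedDict()  # insertion order tracks recency

    def get(self, lba):
        if lba not in self._sectors:
            return None                     # miss: go to the optical drive
        self._sectors.move_to_end(lba)      # mark as most recently used
        return self._sectors[lba]

    def put(self, lba, data):
        self._sectors[lba] = data
        self._sectors.move_to_end(lba)
        if len(self._sectors) > self.capacity:
            self._sectors.popitem(last=False)  # evict least recently used
```

The policy itself is trivial; the objection in the thread is about everything around it, since eviction and refill can inject head seeks into reads that would otherwise have been sequential.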

For linear levels or those with limited branching, streaming is relatively easy to implement.
For open world games it's "hard" and comes with a lot of tradeoffs.
 