Predict: The Next Generation Console Tech

Status
Not open for further replies.
Marketing. UE4 will end up on everything from PCs to mobile phones. He's giving platform holders the bait. Again. No different than what Rein has been doing. They have to sell an engine, after all.
 

Sure, UE3 is on iOS, but that doesn't mean the platform is capable of running Gears of War at anywhere near acceptable quality at this time
 
I stopped reading after this

Well... sure PC games, even console ports, do tend to look better... but the margin has become quite slim.

And don't forget, it's an American (probably) talking to the American press... PC gaming in the US is virtually extinct. Not that there are NO PC gamers, but compared to Europe it's something very different. In Germany, the PC section in a lot of stores is often bigger than all three consoles combined (my local Saturn, for example).

But PC gaming has turned around quite a bit in the recent past, too. With direct downloads making massive strides, it's no wonder that retail isn't willing to carry those games anymore, when it's more convenient to buy your games online AND download them directly. And there's no real difference either (well, except the packaging), because with PC games you already have to register your game and whatnot anyway, rendering second-hand sales non-existent, too.

Still... it's funny seeing him say that after having released the Samaritan demo and some games running their engine that look miles beyond anything the consoles could ever hope to display.
 
But is he talking about console players? In which case I can agree that they'll see graphics that they've never seen before.

PC gamers like myself will be harder to please: although there aren't really any 'next gen' games out on PC, given the right mods a game can look next generation quite easily.

Skyrim for example below.

http://static.skyrim.nexusmods.com/downloads/images/16200-1-1335822018.jpg

That screenshot is not impressive at all, really. Next gen has to look WAY better :cry:
 
Marketing. UE4 will end up on everything from PCs to mobile phones. He's giving platform holders the bait. Again. No different than what Rein has been doing. They have to sell an engine, after all.
It will be available on phones, but I'm quite sure that it will take a while.

If they use something like compute for a micropolygon renderer with stochastic sampling, they will need a nice phone to be able to do that. ;)
 
In my opinion an SSD as a cache drive for streaming is highly unlikely for next generation, for two reasons:
- poor reliability under intensive read/write operations;
- price: it would be cheaper, faster and more reliable to include a 2.5-inch 120 GB HDD ($40) plus an 8 GB DDR3 RAM disk ($40) just for cache.

I think that data streaming is the most important challenge for next gen. Other than that, Sony/MS just have to choose the parts.
 
Well... sure PC games, even console ports, do tend to look better... but the margin has become quite slim.

And don't forget, it's an American (probably) talking to the American press... PC gaming in the US is virtually extinct. Not that there are NO PC gamers, but compared to Europe it's something very different. In Germany, the PC section in a lot of stores is often bigger than all three consoles combined (my local Saturn, for example).

Still... it's funny seeing him say that after having released the Samaritan demo and some games running their engine that look miles beyond anything the consoles could ever hope to display.

Head over to overclock.net and tell them that... you'll be shot :LOL:
 
price: it would be cheaper, faster and more reliable to include a 2.5-inch 120 GB HDD ($40) plus an 8 GB DDR3 RAM disk ($40) just for cache.
You'd then have to load 8 GB of data each time you want to use 8 GB of cache. Flash (as a chip on board rather than an SSD) would keep all that data between plays, so it could be much faster at startup. It'll also be a lot cheaper, but of course nothing like as fast as DDR3. Still, if the purpose is just to provide the fastest data IO, and not to provide a glob of system RAM, flash gets my vote (although I'm not sure how it'll be affected by wear).
 
And here comes the reliability of NAND flash. Memory cells tend to wear with each write cycle, which is not a problem in normal day-to-day usage, but may be when an SSD is used as a cache drive.

You only get a write to the cache when an item is inserted into it. An item is only inserted into it when you miss cache. The larger the cache, the fewer misses, as well as larger capacity for redundancy.

Imagine a chunk of flash large enough to cache an entire game. Once cached, you won't see another write until you play something else.
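The demand-loaded behaviour described above can be sketched roughly like this (a hypothetical model, not any console's actual cache logic): a flash write happens only on a miss, so once a game's working set is resident, replays cost no further wear.

```python
# Hypothetical sketch of a demand-loaded flash cache: a write (and hence
# wear) occurs only on a miss; hits are served from flash with no writes.

class FlashCache:
    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = {}          # block_id -> cached data
        self.writes = 0           # count of flash writes (wear proxy)

    def read(self, block_id, optical_read):
        if block_id in self.blocks:            # hit: no write, no wear
            return self.blocks[block_id]
        data = optical_read(block_id)          # miss: slow optical read...
        if len(self.blocks) >= self.capacity:  # ...evict oldest entry if full...
            self.blocks.pop(next(iter(self.blocks)))
        self.blocks[block_id] = data           # ...then exactly one flash write
        self.writes += 1
        return data
```

The first playthrough populates the cache from the disc; subsequent reads of the same blocks never touch the optical drive or add wear.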

IMO, they should stick 96GB flash in the base model, use half for cache, the rest for system data and DLC. The extended model should have 96GB flash and a large (1TB) HDD.

Cheers
 
You only get a write to the cache when an item is inserted into it. An item is only inserted into it when you miss cache. The larger the cache, the fewer misses, as well as larger capacity for redundancy.
Streaming assets, especially with megatexturing and megameshing, could see quite a lot of rewriting of data, I'd have thought. That said, you have a very good point. A large enough cache would reduce wear. Then again, it'll take an age to populate as well, and people who vary their games will be constantly flushing it to replace Bioshock with Halo, and then with Forza, and then a bit of Halo again. Full 50 GB game copies every time will be pretty boring!

If sebbbi's around, maybe he can throw in some estimates of how many tile copies a megatextured game would write to cache (depending on cache size) in the course of play? If this is dozens of times and not hundreds, a long-life NAND chip should work over a five-year-lifespan box. If the cache will be written to often, perhaps a replaceable cache cart would be the solution?
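Pending real numbers, a back-of-envelope version of that estimate might look like this. Every figure below is a placeholder I've invented, not measured data from any engine:

```python
# Rough parametric estimate: how many full cache rewrites per hour of play
# a megatexture streamer would cause, given a tile size, a streaming rate,
# and a cache partition size. All inputs are made-up placeholders.

def cache_rewrites_per_hour(tile_kb, tiles_per_second, cache_gb):
    bytes_per_hour = tile_kb * 1024 * tiles_per_second * 3600
    cache_bytes = cache_gb * 1024**3
    return bytes_per_hour / cache_bytes

# e.g. 64 KB tiles streamed at 100 tiles/s into a 48 GB cache partition:
# roughly 23.6 GB/hour of tile writes, i.e. under half a full rewrite
# per hour of play.
print(cache_rewrites_per_hour(64, 100, 48))
```

At placeholder rates like these the cache would see only a handful of full rewrites per long play session, which is the "dozens, not hundreds" regime; real streaming rates could easily differ by an order of magnitude.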
 
Streaming assets, especially with megatexturing and megameshing, could see quite a lot of rewriting of data, I'd have thought. That said, you have a very good point. A large enough cache would reduce wear. Then again, it'll take an age to populate as well, and people who vary their games will be constantly flushing it to replace Bioshock with Halo, and then with Forza, and then a bit of Halo again. Full 50 GB game copies every time will be pretty boring!

It's a cache; it's demand-loaded. On a miss, data is read from the BR disc and stored in the cache. You don't wait for an install: the game loads slower initially, then much faster.

Let's assume an average read speed of 25 MB/s (6x BR speed) from the optical drive. You can write 25 MB/s for 450 days nonstop to a 96 GB flash with a 10K write cycle limit.

Another way to look at it: A BR disc holds 50GB, with a 10K write cycle, 365 days in a year and a 5 year expected lifetime, you can completely load 10 discs per day, every day for five years, that's a lot of game changing.
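A quick check of the arithmetic behind those two figures, using the same 96 GB capacity and 10K-cycle assumptions quoted above:

```python
# Total write endurance of a 96 GB flash part rated at 10K program/erase
# cycles, compared against the two workloads described in the post.

capacity_gb = 96
pe_cycles = 10_000
endurance_gb = capacity_gb * pe_cycles            # 960,000 GB of total writes

# Nonstop 25 MB/s (6x BR) writes until the endurance budget is spent:
days_nonstop = endurance_gb * 1000 / 25 / 86_400  # MB / (MB/s) / seconds per day

# Ten full 50 GB disc loads per day, every day, for five years:
five_year_writes_gb = 50 * 10 * 365 * 5           # 912,500 GB

print(round(days_nonstop))                        # ~444 days, i.e. roughly 450
print(five_year_writes_gb < endurance_gb)         # True: inside the budget
```

Both numbers check out under the stated assumptions; the whole argument rests on the 10K-cycle rating, which is questioned below.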

IMHO, it is absolutely not going to be a problem.

Cheers
 
flash with a 10K write cycle limit.

Modern flash does not have 10K write cycles. The most common MLC varieties are somewhere between 1K-3K, and the upcoming TLC is going to have 100-500 cycles. With each node shrink, the write cycles of the flash chips halve, and the size of the erase block (and thus write amplification) doubles. This will keep happening until no one wants to shrink flash anymore.

If you want to put flash with 10k write cycles in there, take the market price and quadruple it.
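To illustrate the compounding trend being claimed (this is purely a sketch of that claim, with arbitrary starting numbers): if raw cycles halve and write amplification doubles per shrink, host-visible endurance drops 4x each generation.

```python
# Sketch of the claimed trend: raw P/E cycles halve per node shrink while
# write amplification (WA) doubles, so host-visible endurance falls 4x per
# shrink. The starting values (3000 cycles, WA of 2) are illustrative only.

def host_visible_cycles(raw_cycles, write_amp, shrinks):
    raw = raw_cycles / 2**shrinks        # rated cycles halve each shrink
    wa = write_amp * 2**shrinks          # write amplification doubles
    return raw / wa                      # full-drive writes visible to the host

print(host_visible_cycles(3000, 2, 0))   # 1500.0 at the starting node
print(host_visible_cycles(3000, 2, 2))   # 93.75 two shrinks later
```

Under this model two shrinks cut usable endurance by 16x, which is why the per-shrink claim, if accurate, matters far more than the headline cycle rating.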
 
Modern flash does not have 10K write cycles. The most common MLC varieties are somewhere between 1K-3K, and the upcoming TLC is going to have 100-500 cycles. With each node shrink, the write cycles of the flash chips halve, and the size of the erase block (and thus write amplification) doubles. This will keep happening until no one wants to shrink flash anymore.

It's not 10K cycles, but it's not as bad as you make out. Intel/Micron's 20nm MLC NAND has a similar number of write cycles to their 25nm products, 3-5K. That's the number of cycles for a block to reach data retention of less than a year. I think lifetime could be extended by actively scrubbing the flash, reprogramming blocks which haven't been touched in a given time period.

And since we're caching data from an optical medium, where data is optimized for long sequential reads, write amplification will be a non-issue, IMO.

Cheers
 
How much does it cost to have memory card readers on a console? It seems that Microsoft and Sony could put out memory cards specialized for the purpose. They could be physically large, like a PS2 memory card rather than an SD/microSD card. Let them be a good deal slower than the average SSD if need be (though faster than SD), give the cards and the console operating systems the ability to monitor and report on the wear level of the card, and treat them as replaceable items.

There were a lot of complaints about Sony producing proprietary cards for Vita, but if these are produced and marketed appropriately, they could possibly avoid being compared unfavorably to SD cards or 2.5" SSDs by occupying a middle space between those products.

It's been nice on the PS3 being able to use any 2.5" HD, but having proprietary hard drives doesn't seem to have hurt the 360 that much, and a modified / shrunk SSD style card could provide performance far better than what a HD could provide.
 
And if you wanted to return to the PS2 use model, you could put two readers on the console. Buy a new card, stick it in the second slot, and copy all your old stuff over.

Right now on the PS3, you have to use an external USB-attached hard drive to back up your internal PS3 drive, then swap the internal drive, then restore.

Microsoft's solution of providing a cable to let you transfer the HDD's contents to a new one when you upgrade is an improvement over the PS3, but I think that having two readers on board would be better than that.

I think one of the things that worked so well on the PS and PS2 was that they both felt like appliances. Having a memory card is a nice simple abstraction for 'digital storage', and is easy for users to reason about.

Tablets and phones with built-in storage are simpler still, but they either require desktop synchronization for backup or the use of a variety of cloud services to handle data synchronization. If we're talking about having large games held in on-board storage, then we'd be talking about possibly hundreds of gigabytes to re-download several games.

Much easier to slide in a second card to make a backup or to upgrade.
 
Stopped reading there, because that's simply not true. PC gaming is going through a bit of a renaissance at the moment, thanks to the stagnation of this generation.

It will be available on phones, but I'm quite sure that it will take a while.

If they use something like compute for a micropolygon renderer with stochastic sampling, they will need a nice phone to be able to do that. ;)

Of course, that much is obvious - the economies of scale come into play (like running a game on low vs. Ultra or whatever). However, it is in the best interests of any respective third-party engine maker for it to be as scalable as possible.
 
But a vsynced 60fps at 1080p with high levels of AF/AA would be a massive deal in the console world, at least with forum goers. The only reason it's downplayed is because it's the PC doing it and people treat it as a given.

Getting additional graphical effects on top of that could simply be considered a bonus, even if they are minor, which to be fair in most games they are. But then you get top-tier games like Crysis 2 and BF3 that come with considerably improved graphics, at least if you have an eye for them.
 