Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

Status
Not open for further replies.
I think backwards compatibility matters even more now than ever with the growth of indies and digital download games.

Hopefully Nintendo will use the iOS model for the Switch, with regular incremental updates that increase CPU and GPU power without breaking software. They used to do that with the old Game Boy line; the Game Boy Color was roughly twice as powerful as the original Game Boy. Although Apple seems to be doubling iPhone GPU power on an annual basis.
I'm not saying the reasons for BC aren't there, and the rise of digital makes those reasons stronger.
I was responding to your statement that Sony promised BC when they moved to x86.
I'm just saying no, they didn't, and it's everybody else who says or implies that.

If anything, they've said they don't think it's important. Although I personally think they will say that until they have it, and then it will be important. So I don't put much stock in those statements.

They could even just say PS Now is their BC service if you want to play old games: "we didn't want to compromise the new machine," or something like that.
And BC not being top priority could give them a lot more options when designing the PS5.

Again, I'm not saying it's my opinion, just that they never said that and it's not a done deal.
 
@Scott_Arm spotted this in PC Gamer
http://www.pcgamer.com/wolfenstein-2-system-requirements-and-pc-specific-features-revealed/

"The New Colossus version of the id Tech engine is no longer mega-texture based, which means we have a lot more control over when and what textures are being streamed in," he explained. "With that said, there will always be texture streaming, but in a way that will cause a lot less risk of noticeable issues."


I thought this was interesting, because a move away from megatexturing could imply a bunch of things, some of which I believe could result in increased VRAM usage.

When I looked at the minimum required specs for Wolf 2:
The "Can I Play, Daddy?" (720p and 60 fps at low settings)

[LIST]
[*]CPU: Intel Core i7-3770/AMD FX-8350 or better
[*]GPU: Nvidia GTX 770 4GB/AMD Radeon R9 290 4GB or better
[*]RAM: 8GB
[*]OS: Win7, 8.1, or 10 (64-Bit versions)
[*]Storage: 55GB
[/LIST]
The "I Am Death Incarnate" (1080p and 60 fps at high settings)

[LIST]
[*]CPU: Intel Core i7-4770/AMD FX-9370 or better
[*]GPU: Nvidia GTX 1060 6GB/AMD Radeon RX 470 4GB or better
[*]RAM: 16GB
[*]OS: Win7, 8.1, or 10 64-Bit
[*]Storage: 55GB
[/LIST]
The suspect here is the 16 GB of memory. I actually had an interesting issue with Forza 7 constantly stuttering on my 1070 setup until I moved from 8GB of DDR4 to 24GB; then all the issues went away.

Also to note here from the article
"We’re also taking full advantage of Vulkan, allowing us to push improved performance across the board in ways that simply weren’t possible before," Gustafsson said. "The minimum hardware requirements are set to ensure a high-quality experience, and with Vulkan players should see solid performance on a variety of system configurations. However, if you’re aiming for 1080p with high framerate and settings, you would need to bump towards our recommended specs."
If we look at the recommended requirements, we're not seeing a super high-end CPU combined with a high-end GPU. In fact, both come across as pretty mid-level.

It may be that we're finally entering the era of next-gen API engines.

So I'm definitely expecting games to require more memory as we move forward. I'd have to ask how large an impact memory will have, and whether it will be more important than raw compute going into next gen. If so, we could see next gen coming sooner, as 8-12TF would suffice as long as you had a lot of memory.

If not, the wait would have to be longer for the cost per TF per watt to come down.
 
That's interesting because I thought all of the "it's gonna have 32GB of memory" predictions for the next gen were outrageous. But maybe 32GB wouldn't be that far fetched.
 
32GB of memory would be nice. I think we'll get as much as is economically and physically feasible.

Whatever the ram technology, I think 16 chips is the maximum count that will be on the motherboard so then it's just a matter of what densities can be hit at reasonable prices. I think 16Gb chips are very likely in 2020, 32Gb I'm skeptical.
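The arithmetic behind that: with a fixed 16-chip board, total capacity is just per-chip density times count. A quick sketch (the density options are speculative, per the discussion above):

```python
# Totals for a hypothetical 16-chip board at candidate per-chip densities.
CHIPS = 16

# Density in Gbit per chip -> total capacity in GB (8 Gbit per GB).
totals_gb = {d: CHIPS * d / 8 for d in (8, 16, 32)}

for d, gb in totals_gb.items():
    print(f"{d} Gbit chips -> {gb} GB")  # 16 Gbit chips give 32.0 GB
```

So 32GB in 2020 would require the 16Gbit density to be mainstream by then; 32Gbit chips would double that again, hence the skepticism.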
 
Memory prices are horrible; I don't know how GDDR5/6 will be affected.

8Gbit GDDR5 chips have been available for quite a while already. I wouldn't be surprised to see 12Gbit and 16Gbit GDDR5X/6 parts widely available for a 2019 launch. The fact that GDDR6 added 12 and 24Gbit capacities has to be because a major member of JEDEC asked for it.

Maybe the money would be better spent on GPU and bandwidth instead of trying to reach 32GB of RAM, in order to hit a similar price of $399...

12x 12Gbit chips @ 14Gbps (lowest bin in 2019?)
So 18GB total GDDR6
672GB/s feeding a 12-15TF GPU
2GB permanent OS reserve
4GB DDR3-1866 on a 32-bit bus on the south bridge
App swapping of 4GB in about one second, if PCIe 4.0 x4 to the south bridge
16GB of real game memory, so 3x the PS4 Pro.
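The capacity and bandwidth figures in the hypothetical configuration above can be sanity-checked in a couple of lines; the chip count, density, bus width, and pin speed are the assumptions from the list, not confirmed parts:

```python
# Sanity check of the hypothetical GDDR6 setup:
# 12 chips x 12 Gbit, each on a 32-bit bus at 14 Gbps per pin.
CHIPS = 12
DENSITY_GBIT = 12   # per-chip density
BUS_BITS = 32       # per-chip bus width
PIN_GBPS = 14       # per-pin data rate

capacity_gb = CHIPS * DENSITY_GBIT / 8            # Gbit -> GB
bandwidth_gb_s = CHIPS * BUS_BITS * PIN_GBPS / 8  # Gbit/s -> GB/s

print(f"{capacity_gb} GB total, {bandwidth_gb_s} GB/s")  # 18.0 GB total, 672.0 GB/s
```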
 
Unless storage technology advances rapidly in the next few years, I shudder to think of how long it'll take to load levels in games using greater than 16 GB of memory.

Just started playing Destiny 2 on PC and the level load times on a SSD are shockingly long (albeit much shorter than on console).
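To put that worry in rough numbers, here's a minimal sketch of how long streaming a full RAM's worth of data would take; the sustained throughput figures (~100 MB/s for a 2.5" console HDD, ~500 MB/s for a SATA SSD) are illustrative assumptions, not measurements:

```python
# Time to stream a full RAM's worth of assets at a sustained read rate.
# Throughput numbers are rough, illustrative assumptions.
def fill_time_s(ram_gb: float, throughput_mb_s: float) -> float:
    return ram_gb * 1024 / throughput_mb_s

print(round(fill_time_s(16, 100)))  # ~164 s from a ~100 MB/s HDD
print(round(fill_time_s(16, 500)))  # ~33 s from a ~500 MB/s SATA SSD
```

Even ignoring decompression and seek overhead, filling 16GB from a laptop-class HDD takes minutes, which is why the caching ideas below keep coming up.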

Maybe just use the extra memory to cache game data so it doesn't have to go to disk to load data as often (like what XBO-X will do with some BC games).

Although that seems a bit of a monetary waste for the console maker.

Regards,
SB
 
Is there a proprietary tech that could be developed (or is being developed) to fast-load from HDD to RAM? Would that solve a lot of issues? It would be a 'special case', but consoles are pretty specialized... so perhaps such a solution is a little more viable for a console than it would be for a PC.
 
A technology faster than moving compressed bytes over the fastest hardware bus available? I'd like to see this technology. :yep2:
 
Once upon a time in my senior year, my professor had me help develop optical (edit: photonic) memory (he owns the patent on it). I had no idea where to begin, and I wasn't sure what he ultimately wanted.

Looking back, optical (photonic) does seem like a really fast method of data transfer; I'm not sure about the storage part of it.
 
Holodisc. And no. :no:

The best solution we could collectively think of so far has been an SSD caching area the size of one or two games. Or a user supplied small SSD.

It's not easy to find anything intermediate between the southbridge ram and the hdd without breaking the bank...
 
Once upon a time in my senior year, my professor had me help develop optical memory (he owns the patent on it). I had no idea where to begin, and I wasn't sure what he ultimately wanted.

Looking back, optical does seem like a really fast method of data transfer; I'm not sure about the storage part of it.

Prior to engineering, my background was physics (nuclear) and yes, there are very few provable things faster than the speed of light. :yes: However, unless some research has passed me by, optical storage generally refers to discs, which are slow because they are physical mechanical systems. :yes:

edit: and by physics I mean chemistry. Most nuclear engineering is chemistry, not physics.
 
Wrong choice of words ;) I wrote optical but actually meant photonic. Congrats on nuclear physics though, I bow down. After 2nd-year math I couldn't take much more physics; 4th-year physics courses were burning me out, so it's hard to imagine doing nuclear.
 
Holodisc. And no. :no:

The best solution we could collectively think of so far has been an SSD caching area the size of one or two games. Or a user supplied small SSD.

It's not easy to find anything intermediate between the southbridge ram and the hdd without breaking the bank...
Then we're stuck with either longer loading times, or AMD's technology (where you can operate with less memory) becomes more desirable?
 
Wrong choice of words ;) I wrote optical but actually meant photonic. Congrats on nuclear physics though, I bow down. After 2nd-year math I couldn't take much more physics; 4th-year physics courses were burning me out, so it's hard to imagine doing nuclear.

Light particles are photons. :yes: Physics is a fascinating field, but you still need a foundation in chemistry. If you don't understand a material's chemical properties, you haven't a hope in hell of determining its physical properties.
 
Then we're stuck with either longer loading times, or AMD's technology (where you can operate with less memory) becomes more desirable?
Well, not necessarily. If the higher-capacity HDDs of 2019 are twice as fast as the 500GB drives of 2013, and game installs grow from 50 to 100GB, we get the same load times. A small SSD area to preload/prepare what the current game needs would make for a much better experience.

I think AMD's HBCC is only about accessing the PC's main RAM (or other sources) transparently; it still has the physical bandwidth limitations of the PCIe bus or the main RAM. The data has to be somewhere.
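That scaling argument can be sketched in a few lines: if both install size and sustained drive throughput double, sequential load time stays put. The throughput numbers here are illustrative assumptions:

```python
# If both install size and sustained drive throughput double,
# the sequential load time stays the same. Numbers are illustrative.
def load_time_s(install_gb: float, throughput_mb_s: float) -> float:
    return install_gb * 1024 / throughput_mb_s

t_2013 = load_time_s(50, 100)    # 50 GB install, ~100 MB/s drive
t_2019 = load_time_s(100, 200)   # 100 GB install, ~200 MB/s drive
print(t_2013, t_2019, t_2013 == t_2019)  # 512.0 512.0 True
```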
 
Then we're stuck with either longer loading times, or AMD's technology (where you can operate with less memory) becomes more desirable?
There must be some practical limit to what they need for a given scene.

But I mean, when we compare SSD to HDD on current consoles, the data rate doesn't seem to be the issue for load times when compared to the same game on PC, which typically has a much more powerful CPU for decompression; SSDs help more with the random accesses of open-world titles, or potentially when loading a saved game.
 
I can't fathom where the bottleneck is in the PS4/Pro. GTA V on a Pro from an SSD loads slower than GTA V running on my PC off a HDD, with equivalent textures.

I'm assuming it's the CPU and the way the world is generated. It doesn't appear to be I/O. :???:
 