Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

What's an X390?
And I also can't see much of a problem with playing Bloodborne or COD in VR for hours. There's nothing about VR that precludes using the controller for movement input.

I think X390 refers to the upcoming AMD card (the Fiji 390X).


Well, apart from the headaches and nausea inherent to VR (yes, it will improve with faster refresh rates and better resolution)...

I believe traditional fast-paced FPS ("twitch" games, as they're called now?) rely heavily on the right analog stick, which corresponds to head movement in VR, right? I mean, I don't see how you could make those fast movements without breaking your neck. lol

Well, you could move the analog stick and not your head for looking, but I think part of the problems of VR come from the detachment between what you see and what you feel.

I could be way off base here, I know.

(I haven't tried VR, and with my prescription glasses I'm afraid I never will :( )
 
I'm not sure what we technically need, but I'm going to aim conservative. I think a reasonable milestone for next-generation graphics (2019) is approximately 1080p @ 60 fps with 3-bounce GI, more shadow-casting lights with highly detailed shadows, and a healthy dose of high-quality AA. And some more geometry-pushing power? I think asset building is getting more and more costly, so whatever can be offloaded to the GPU to generate things like leaves, grass and hair might be ideal for reducing the budget of AAA titles while simultaneously providing a higher-fidelity world.
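To make that concrete, here's a minimal sketch of the idea in Python (standing in for what would really be a compute or vertex shader): hash-seeded procedural scattering, where each grass blade's placement is derived from its instance index rather than authored and stored as an asset. All names and constants here are invented for illustration.

```python
# Toy sketch: deterministic, hash-seeded grass placement for one terrain tile.
# On a GPU this would run per instance in a compute/vertex shader; positions
# come from the instance index alone, so nothing is authored or stored on disk.
import math

def hash01(n: int) -> float:
    """Cheap integer hash mapped to [0, 1) -- stand-in for a shader hash."""
    n = (n ^ 61) ^ (n >> 16)
    n = (n * 9) & 0xFFFFFFFF
    n = ((n ^ (n >> 4)) * 0x27D4EB2D) & 0xFFFFFFFF
    return ((n ^ (n >> 15)) & 0xFFFFFF) / float(1 << 24)

def grass_instance(tile_x: int, tile_y: int, index: int, tile_size: float = 8.0):
    """Position/rotation/scale for one blade, derived purely from indices."""
    seed = (tile_x * 73856093) ^ (tile_y * 19349663) ^ index
    x = tile_x * tile_size + hash01(seed) * tile_size
    y = tile_y * tile_size + hash01(seed + 1) * tile_size
    rotation = hash01(seed + 2) * 2.0 * math.pi
    scale = 0.6 + hash01(seed + 3) * 0.8   # 0.6x .. 1.4x height variation
    return (x, y, rotation, scale)

# 10,000 blades for tile (4, 7), generated on demand: zero bytes of stored assets.
blades = [grass_instance(4, 7, i) for i in range(10_000)]
print(blades[0])
```

Geometry like this trades artist time and disk space for GPU cycles, which is exactly the direction I'd hope next gen leans.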

I'm slightly skeptical of the need for higher resolution unless we expect screen sizes larger than 65", with 65" as the low end. We can debate the increase in DPI etc., but it just seems to me like a whole lot of horsepower wasted on resolution in a market that isn't quite there yet.
 
VR games will have their own new genres, and ports of classic FPS won't work. It will never be a situation where all games are VR. I don't really understand how it would influence the console design; sure, it would need more power for acceptable IQ, but more GPU power is a good thing regardless. If they really focus on VR, would it change the balance between CPU, GPU, memory, etc.?

There are problems with VR, but they aren't "inherent": there are limitations that are being discovered and documented, there are specific things that cause headaches, and the devs have to learn not to do those things. There's a target latency that needs to be respected. It's really a learning process right now.
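For a sense of scale on that target latency: the figure commonly quoted for comfortable VR is around 20 ms motion-to-photon. A toy budget (assumed numbers, not any vendor's spec) shows why raw frame rate matters so much here:

```python
# Toy motion-to-photon budget. The ~20 ms comfort target is the rule of thumb
# VR engineers commonly cite; the component costs below are assumptions.
TARGET_MS = 20.0

def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 90, 120):
    tracking = 2.0                # assumed sensor read + fusion cost
    render = frame_time_ms(hz)    # one frame of rendering latency
    scanout = frame_time_ms(hz)   # worst case: the pixel lights up a frame later
    total = tracking + render + scanout
    verdict = "OK" if total <= TARGET_MS else "over budget"
    print(f"{hz:>3} Hz: ~{total:.1f} ms motion-to-photon ({verdict})")
```

Which is part of why reprojection tricks (warping the last frame with the newest head pose just before scanout) exist: they cut effective latency without needing more raw frame rate.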

Morpheus was designed to be used with prescription glasses, I don't think next gen would backtrack on this.
 
I'm going to go super conservative here because all the 4K/VR configs make no sense to me when it comes to A) cost and B) cooling.

2018 specs
CPU: 12 x AMD low power x86 cores
- Intel don't do custom CPUs and are too damn pricey when they do
- I'm ruling out ARM as those cores are still god awful from an IPC point of view esp. on FPU code, also x86 is far easier to find coders for
GPU: ca. 2016 mid-range GPU
- so whatever the fastest mid range AMD card is (prob 2/3 of whatever the R9 490 turns out to be)
- I'm ruling out NV as they pissed MS off something fierce w/OG Xbox, their CPU plans are a shambles and who wants to hire two design firms for a single ASIC (esp one with the poor rep on heat NV has)?
Process: 12/14 nm
- Not Intel? Then god help you outbidding phone companies for the best process nodes
Optical Drive: 100 GB Blu-ray (BDXL) drive
- There's your 4K support right there and it helps to alleviate b/w costs in developing markets (aka BRICs)
HDD: Hybrid 4TB w/64 or 128GB flash
- I don't see flash being cheap enough to go fully flash-based w/o making the drive too small to store more than 2-3 games at once (rough numbers below)
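The back-of-envelope behind that 2-3 games figure, assuming ~45 GB AAA installs (my assumption, not a quoted spec):

```python
# Rough numbers for a hypothetical flash-only console drive,
# assuming ~45 GB per AAA install.
install_gb = 45
for flash_gb in (128, 256, 512):
    print(f"{flash_gb} GB flash -> ~{flash_gb // install_gb} full installs")
# 128 GB -> 2 installs, 256 GB -> 5, 512 GB -> 11; only the pricey larger
# sizes escape the 2-3 game problem, hence the hybrid drive above.
```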
 
@Mr Fox I fully agree, games will be tailored to the VR experience.
Well, the problem isn't the glasses per se, it's the prescription lol, 3D is wasted on me...
 
Echoing Mr. Fox, you wouldn't want to play those games in VR. VR will be like real life, and you'll want an immersive game that approximates real life, with human-like encounters. The FPSes in VR will probably be a lot more realistic, with less gung-ho and a lot more sneaking about, taking pot shots, etc. Over time, boundaries may get pushed if people can adapt to them, creating faster-than-life pacing. But certainly VR games will be tried and tested on the VR systems rather than straight ports. The devs themselves will be play-testing and will experience broken gameplay when it happens.
 
Yes, I know.
I was just answering prophecy: in its current incarnation VR is not suited to COD and the likes. The future? Who knows, neural implants anyone? :)
 
Good hardware support for foveated rendering would be important if VR becomes mainstream. Is this something that can be done already, or does it need new stuff(tm) in the GPU?
 
I dunno, lots of folks that have played the VR version of Half-Life 2 on the Oculus Rift have said that it's the best VR experience they've had. And I'm sure that's with the controller... or maybe it was with that Razer motion controller...?

I forget...
 
Wouldn't it simply need eye tracking in the headset, and the rest be done in software?
 
It was with the Razer Hydra and a user mod that puts the HUD directly into the VR world.
youtu.be/-RehCTRrWM0?t=329

HL2 is a good example of a VR game, because the world and player movement speed are very natural. One of my goals for the early days of VR is to play through Black Mesa and HL2 in VR. :D It's gonna be awesome [except in those frenetic and bumpy watercraft/car sections].
 
In theory. You'd just need two or three render views: one full-screen at low res, and one where you're looking at high res, with a feathered-edge blend or something, I expect.
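A rough pixel-count sketch of that two-view idea, with invented panel and FOV numbers, just to show the order of savings on offer:

```python
# Two-view foveated budget: a low-res full-field view plus a small high-res
# inset where the eye is looking. All numbers below are invented.
full_w, full_h = 2160, 1200                 # e.g. a pair of 1080x1200 panels

naive = full_w * full_h                     # shade every pixel at native res

periphery = (full_w // 2) * (full_h // 2)   # full field at half res per axis
fovea_deg, fov_deg = 20, 100                # high-res inset: ~20 of ~100 degrees
inset_w = int(full_w * fovea_deg / fov_deg)
inset_h = int(full_h * fovea_deg / fov_deg)
fovea = inset_w * inset_h                   # small region at native resolution

foveated = periphery + fovea
print(f"naive: {naive:,} px, foveated: {foveated:,} px "
      f"({100 * foveated / naive:.0f}% of full cost, before blend overhead)")
```

Around a third of the shading cost in this toy case, which is why the eye-tracking latency question below matters so much.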

My concern with foveated rendering is whether the eye-tracking latency can be low enough. If the delay between saccades and rendering updates is too great, there'll be an awkward transition where everything jumps into focus. But there's a refocussing period after a saccade so there should be time. Fove is out there now, so I guess it's workable.

http://arstechnica.com/gaming/2015/05/vr-headset-company-fove-is-betting-on-eye-tracking-to-compete/

This from 4 days ago. Eye tracking is in Fove but not foveated rendering yet.
 
If VR is such an unknown and they are designing now, might we see a return of the good old expansion port, for flexibility and late design changes with regard to VR? Or is that just too risky for security nowadays?
 
Would that require many times the geometry throughput?

No idea if this is possible or not: suppose the GPU architecture could render directly to a non-rectilinear projection (for VR: fisheye, spherical, etc.) or at progressive resolution (foveated) in a single frame buffer; it could be much more efficient than rendering multiple times and blending.
 
You'd want some LOD filtering, but yes, it does add some overhead. There's a quote in my Ars link above from a researcher mentioning "the significant overhead of foveated rendering."

I was wondering about non-rectilinear rendering too, but I don't think that's supported on the GPU, though I'm far from knowledgeable there! I believe everything is transformed based on a standard flat camera projection; this is demonstrated by super-wide game FOVs lacking barrel distortion and being unable to render fish-eye. Being able to render to the lens's distortion would be the ideal. That's two areas for future VR development: efficient LOD rendering for foveated rendering, so the one scene is processed at the required detail for each pixel based on view (possibly doable now in software in the shaders, using existing LOD techniques), and spherical optics for rasterisation. All of which is easily solved by transitioning to realtime raytracing. :runaway:
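The flat-projection limitation shows up in one line of math: a rectilinear camera places a point at angle theta off-axis at image position f*tan(theta), which diverges as theta approaches 90 degrees, while an equidistant fisheye mapping f*theta stays finite. A quick check:

```python
# Rectilinear projection (f * tan(theta)) blows up at wide angles, which is
# why flat-camera renderers can't natively produce >=180 degree or fisheye views.
import math

f = 1.0
for theta_deg in (30, 60, 80, 89):
    t = math.radians(theta_deg)
    print(f"{theta_deg:>2} deg off-axis: rectilinear x = {f * math.tan(t):8.2f}, "
          f"fisheye x = {f * t:5.2f}")
```

Hence today's VR pipeline renders a normal rectilinear view and warps it to the lens in a post-pass, over-rendering the periphery so there are enough pixels left after the warp.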
 
I am a console gamer and I don't follow GPU progress outside of console R&D. I'd forgotten that the first 28nm GPU, the AMD 7970, was released in December 2011, and we will probably not change process node before next year, maybe 4.5 to 5 years after.

The next process node will probably be there for 4/5 years. Next year's mid-range GPU is maybe the PS5 GPU...
 
Question is, was the 7970 high end in 2011? If so, and I have no idea 'cause ain't nobody got time to check, then we could very grossly predict that consoles will get whatever the highest-end GPU on the market was two years before release. Or something.
 
Yes, it was the highest-end single GPU on the market. Well, it was paper-launched, so not available until early January maybe? Nvidia had the 580 at the time; the 680 came about 3 months after the 7970 (late March/early April 2012, I believe).
 
Sounds about right. These days, getting a GPU that was the highest end 1-1.5 years before sounds like quite a good deal for a £299 little box.
 