Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

Is there reason to believe GDDR6 will be any worse for latency than GDDR5 was this gen? IIRC, the latency only looks worse than DDR3/4 if you are talking about cycles. Since GDDRx memory runs at a much higher frequency, the actual latency in terms of nanoseconds isn't that different.
My understanding is that the latency of every DRAM standard (DDR, GDDR, and even HBM) has been pretty much the same forever. Latency cycles have grown in proportion to the clock increases, so the actual latency in nanoseconds remains similar.
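To make that concrete, here is a minimal sketch of the "similar nanoseconds, more cycles" point. The CL values and clocks below are assumptions for illustration, not vendor-published timings:

```python
# Absolute CAS latency is cycles divided by the command clock, so higher-clocked
# parts can carry higher cycle counts and still land at similar wall-clock latency.
# All figures are assumed for illustration, not vendor-published timings.

def cas_latency_ns(cl_cycles: int, command_clock_mhz: float) -> float:
    """CAS latency in nanoseconds from cycle count and command clock (MHz)."""
    return cl_cycles / command_clock_mhz * 1000.0

examples = {
    "DDR3-1600, CL11":    (11, 800),    # command clock is half the data rate
    "DDR4-3200, CL16":    (16, 1600),
    "GDDR5-class, CL~15": (15, 1750),   # CL assumed for illustration
}

for name, (cl, clk_mhz) in examples.items():
    print(f"{name}: ~{cas_latency_ns(cl, clk_mhz):.1f} ns")
```

All three land in roughly the same 8-14 ns ballpark despite very different cycle counts.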
 
I really hope next gen fully embraces variable refresh rate. Not being bound by/designed for a locked 30/60/120Hz would be great. The latest high-end TVs already have variable refresh rate. Sony at least should have a big incentive here, as I believe anyone who has tried VRR/FreeSync/G-Sync would want such a display. Might as well sell TV + console. By the time next gen is mainstream, so is VRR in new(ish) TVs.

Using VRR might allow for better visuals than forcing a locked framerate. And the user experience is definitely smoother and tear-free.
 
As discussed before, there's a fair-sounding compromise in 48fps. However, if you can't be sure all your users can use it, would you want it as a target? Put it in as a performance option? We have that already in mid-gen games. Is it time to offer Quality/Performance modes in every game?
 

Optimize enough for locked 30Hz so that it's OK but not necessarily great (depending on how much money and time the dev has). Make the VRR-enabled experience great and use the superior VRR footage as a marketing weapon.

Anyway, these new consoles are likely $500 boxes that will only sell to high-end users. I wouldn't be surprised if the audience for $500 consoles already has such displays, or is able and willing to upgrade. The old gen will keep selling to price-sensitive people for years to come.

Launch for either console is likely 1+ year away. The install base being significant compared to the current gen is likely 2.5-3.5 years away. It would be a shame to be shortsighted here.

Maybe embracing VRR is one way to make supporting both old and next gen less expensive.
 
Variable resolution with a locked frame rate should hopefully become the norm. Maybe it remains difficult to achieve, so some games still need VRR as a crutch to lean on.
 
I do not appreciate dynamic input lag being sold as a feature.

If this is in the context of VRR, shouldn't it reduce lag? The frame is displayed as soon as it's ready instead of waiting for a 15/30/60Hz boundary before updating... or, worse, doing something like triple buffering and displaying really old frames, or not buffering and suffering tearing. A higher framerate should also mean less lag. An example would be the same game either locked to 30Hz or left to run with VRR, where the framerate might hover between 30 and 120 depending on what the user is doing.

I doubt the input lag would vary. It would be sensible to decouple physics and input from display updates and run physics and input at a higher rate.
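To illustrate that decoupling, here is a minimal fixed-timestep sketch (not any engine's actual loop; the 120Hz tick rate and helper names are just assumptions): input and physics advance at a constant rate, while frames are presented whenever the display, VRR or fixed, takes them.

```python
# Fixed-timestep sketch: simulation/input tick at a constant rate, rendering
# runs as often as the display allows, so input latency doesn't swing with
# the presented framerate.
import time

SIM_DT = 1.0 / 120.0  # fixed simulation + input step (assumed 120Hz)

def poll_input():       # placeholder: sample controller state
    pass

def simulate(dt):       # placeholder: advance game state by dt seconds
    pass

def render(alpha):      # placeholder: draw, interpolating by alpha in [0, 1)
    pass

def game_loop():
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Consume elapsed wall time in fixed-size simulation steps.
        while accumulator >= SIM_DT:
            poll_input()
            simulate(SIM_DT)
            accumulator -= SIM_DT

        # Present a frame whenever the display is ready for one.
        render(accumulator / SIM_DT)
```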

VRR is one of those things you have to experience before you appreciate it. I have a 1080 Ti, and even with that card the PC gaming experience became so much better with VRR and a monitor capable of 155Hz.
 
One example from the PC side for VRR is GTA V. On my rig the game had random stutters when locked at 60Hz, and I didn't want to reduce detail, run at 30Hz, or lower the resolution. Now with VRR on, the experience is butter smooth despite the framerate wandering between 40 and 120Hz. So much better with VRR on.
 
I'm thinking about how Sony could return to the mobile segment.
I think it would very well be possible, with all these different and compatible machines and AMD's good technical momentum, for Sony to build a PlayStation "Mini" with Zen 2 and Navi fabbed on 7nm+ that runs the exact same code made for the PS5 at 720p/30fps.

What do you think?
Can Sony manage to survive abandoning the mobile segment AND ignoring the Switch?
 
I like the idea, but I don't think it would be the best choice to bind the PS5 to the lowest common denominator of a portable.

I think the way Sony stays competitive with the Switch is by buffering a portable with the PS4 and PS5 on either side of it in terms of software compatibility.

Somewhat feasible tablet specs, if fairly chubby:
  • CPU - 8-core Zen 2, without SMT, clocked at 1.8GHz.
  • GPU - 20CU Navi GPU clocked at 950MHz.
  • Memory - a single, low-clocked 8GB stack of HBM2, plus 4GB of LPDDR5.
  • Storage - 128GB NAND. SD card and game cartridge slots.
Code for the tablet could just run natively on the PS5, and both the tablet and the PS5 would be BC with the PS4. Then let developers decide which platform they wish to develop for:
  • Develop for the PS4 and capture the biggest audience of PS4+PS5+PS tablet.
  • Develop for the PS tablet and face a smaller audience of PS5+PS tablet, but have access to the newer architecture.
  • Develop just for the PS5 if going for a 30fps, 4K CBR, RTRT technical marvel.
 
For multiple SKUs with different levels of power, can't Sony and MS do the following?

Low-end SKU $399:
8-core Zen at 3.2GHz
16GB+ GDDR, downclocked to <14Gbps
48+ CUs, up to 12 disabled, clocked at <1600MHz
1TB SSD
APU cooled with a simple heatsink + fan a la Xbox One S

High-end SKU $549:
8-core Zen at 3.2GHz
16GB+ GDDR, 16Gbps
60 CUs clocked at 1800+MHz (silicon lottery winners)
1/2TB SSD
APU cooled with a vapor chamber cooler

Developers target the base system. Games will be encouraged to have dynamic resolution / variable frame rate / other dynamic effects. Games auto-scale on the high-end system without any work from the developer.
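A minimal sketch of the kind of auto-scaling heuristic that would make this work (assumed budgets, step sizes, and function name, not either vendor's actual scaler): nudge the render resolution each frame so GPU frame time stays near the budget, whatever SKU the game lands on.

```python
# Assumed dynamic-resolution heuristic: adjust render scale toward a frame
# time budget. On a stronger SKU, GPU times come in lower, so the same code
# settles near full resolution with no per-SKU work from the developer.
TARGET_MS = 16.6                 # 60fps budget (assumed target)
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # allowed resolution scale range (assumed)
STEP = 0.02                      # adjustment per frame (assumed)

def update_render_scale(scale: float, last_gpu_ms: float) -> float:
    """Nudge the resolution scale based on the previous frame's GPU time."""
    if last_gpu_ms > TARGET_MS * 1.05:    # over budget: drop resolution
        scale -= STEP
    elif last_gpu_ms < TARGET_MS * 0.85:  # plenty of headroom: raise it
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```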

Does anyone have binning stats for large GPUs? I.e., what % of GPUs have all cores functional?
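No real binning stats here, but a back-of-the-envelope defect model hints at why spare CUs get disabled; every number below is an assumption for illustration, not foundry data.

```python
# Poisson defect model sketch: probability of a defect-free die is
# exp(-D0 * A). All parameters are assumed, not actual yield figures.
import math

DEFECT_DENSITY = 0.1    # defects per cm^2 (assumed)
DIE_AREA_CM2 = 3.0      # ~300 mm^2 die (assumed)
CU_AREA_FRACTION = 0.5  # share of die area taken by the CU array (assumed)

# Chance a die is completely defect-free (all CUs usable).
p_all_good = math.exp(-DEFECT_DENSITY * DIE_AREA_CM2)

# Chance the CU array specifically is clean (other blocks may still fail).
p_cu_clean = math.exp(-DEFECT_DENSITY * DIE_AREA_CM2 * CU_AREA_FRACTION)

print(f"fully functional dies:      ~{p_all_good:.0%}")   # ~74%
print(f"dies with a clean CU array: ~{p_cu_clean:.0%}")   # ~86%
```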
 
Again, these lower-power machines make little sense when streaming options with arguably better margins are available to both Sony and Microsoft.

They may make sense if really cheap... That is, just 8GB of RAM, no SSD but just an HDD, no Blu-ray, clocked low so power dissipation stays low... Just a replacement for the PS4 Pro mostly (or the One X)...
 
They can do a very profitable streaming box, an Amazon Fire-style device. That sort of tech is probably good enough for people targeting 1080p or trying to spend less.

Sony and Microsoft want someone to buy a PS5 or Nextbox and a couple of titles at launch; remember, with BC we could see a relatively slow ramp-up of new software sales. The timing would make both platforms eager to generate profits off subscriptions and SaaS to make the whole thing worth the trouble.
 
Well, not everyone has a connection fast enough for game streaming, but maybe one good enough to download 50GB in a couple of hours... So the old-fashioned console with local storage still has a future...
 
As discussed before, there's a fair-sounding compromise in 48fps. However, if you can't be sure all your users can use it, would you want it as a target? Put it in as a performance option? We have that already in mid-gen games. Is it time to offer Quality/Performance modes in every game?

What about 50fps? All HD TVs should support it via HDMI, I would think, as it ties to PAL footage.
 
No compromise. 50Hz sucks compared to 60Hz. I can easily see the difference in all kinds of content.
I really don't think so. In the past you could see it because games were made for 60Hz and then reduced to just 50Hz (well, most of the time it was 30fps vs 25fps). But I don't think you would really see/feel a big difference between 50fps and 60fps. 50fps would be far better than 30fps and give you a bit more render time for each frame.
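The render-time arithmetic behind that, for the framerates discussed here:

```python
# Per-frame budget at the framerates under discussion.
for fps in (30, 48, 50, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 -> 33.3 ms, 48 -> 20.8 ms, 50 -> 20.0 ms, 60 -> 16.7 ms
```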
 

To add, I am not sure what content is 50fps and locked. Also remember that the playback device, if over HDMI, needs to renegotiate with the TV, or it will likely send that 50fps over a 60Hz signal, inducing judder.
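As a rough illustration of that judder (idealized vsync and no dropped frames assumed): every fifth source frame ends up held for two refreshes.

```python
# Map 50fps frames onto a 60Hz refresh grid and count how many refreshes
# each frame occupies; the uneven 2,1,1,1,1 cadence is the judder.
REFRESH_HZ = 60
SOURCE_FPS = 50

def display_counts(num_frames: int):
    counts = []
    next_refresh = 0
    for i in range(num_frames):
        frame_due_until = (i + 1) / SOURCE_FPS   # when the next frame takes over
        shown = 0
        while next_refresh / REFRESH_HZ < frame_due_until:
            shown += 1
            next_refresh += 1
        counts.append(shown)
    return counts

print(display_counts(10))   # -> [2, 1, 1, 1, 1, 2, 1, 1, 1, 1]
```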

For PAL TVs, the Xbox can drop to a 50Hz link to the TV.

It won't be as good as 60, but locked and correctly frame-paced it will be smooth and a big jump from 30.

US TV-wise, I would be surprised if they did not accept it. It's part of the Blu-ray spec, I think. It's the old interlaced content that they probably struggle with; 50Hz progressive should be simple.
 