Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

The reason why I'm waiting until next year for the 3080 series of cards
I am not sure things will change much by then: next-gen games with more demanding graphics will be released, PC versions of those games will get even more demanding Ultra settings, and RT will be used more often. So the status quo will remain.

What will change is your ability to play demanding current-gen titles comfortably. I saw you mention The Witcher 3 and GTA V as examples of 30fps on PC; in my experience The Witcher 3 NEVER drops to 30fps even at 4K with HairWorks, it's hard locked to 60fps. GTA V can run sub-50fps on a 2080 Ti @4K if you use TXAA + 4x MSAA, and can drop below 40fps if you push 8x MSAA, but otherwise it's locked to 60fps as well. These are lightweight games.

What I am talking about are the heavyweight champions: games like Quantum Break, Control, Metro Exodus, Final Fantasy XV, Ark: Survival Evolved, Watch_Dogs 2, Kingdom Come: Deliverance, Agents of Mayhem, Ghost Recon Breakpoint, Rise of the Tomb Raider, Shadow of the Tomb Raider, Red Dead Redemption 2, etc. These games barely sustain above 30fps @4K with max settings, which usually involve RTX, insane draw distances, crazy shadow settings like HFTS and VXAO, or a combination of them.

We also have several CPU-limited games, where you can never hope to increase fps because our single-threaded CPU performance is not fast enough: games like ARMA 3, Flight Simulator X, Total War: Warhammer 1 and 2 (and pretty much most Total War games), plus fancy things like Ultimate Epic Battle Simulator. All of them drop to 30fps or even under that when the screen is filled with many objects and/or massive draw distances.
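
A quick back-of-envelope sketch of why a faster GPU doesn't help in those cases; the frame times below are hypothetical, not profiled from any of these games:

```python
# Hypothetical frame-time model for a CPU-limited game.
# All numbers are made up for illustration.

def max_fps(cpu_ms_single_thread: float, gpu_ms: float) -> float:
    # A frame can't finish before its longest serial stage does,
    # so frame time is bounded by the slower of the two.
    frame_ms = max(cpu_ms_single_thread, gpu_ms)
    return 1000.0 / frame_ms

# Big battle scene: simulation + draw-call submission takes 30 ms
# on one thread, while the GPU only needs 10 ms per frame.
print(max_fps(cpu_ms_single_thread=30.0, gpu_ms=10.0))  # ~33 fps

# A GPU twice as fast changes nothing; the single thread still gates it.
print(max_fps(cpu_ms_single_thread=30.0, gpu_ms=5.0))   # still ~33 fps
```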

This forum is one of the only places where I actually see a PC gamer that opts to go 30 FPS.
Before my 2080 Ti, I had a 1070, so I pretty much locked myself to 30fps @1440p in many of the above-mentioned titles, because I only play at max visuals.
 
Yeah, looking at his resume, he was a driver/API writer during his time at Caustic.
I'm not sure that means they are running PowerVR for PS5 (I don't put much stock in that thinking). They could need him for a variety of tasks, but he appears to be on the VR team here.

If they were running PowerVR, I would have expected some sort of public announcement about it by now. Imagination would love to have their stock go up.

Plus it is messy. AMD, as the manufacturer, would be forced to license the technology and pay royalty fees. On top of that, if AMD has its own homegrown solution, having access to ImgTec technology could open it up to future litigation, because there would always be an ongoing question of whether AMD's own solutions incorporate ImgTec's IP. Also, BC could force royalty payments on both AMD and Sony for an indefinite period of time.

Hiring someone from ImgTec doesn't mean licensing their tech. Apple has gutted ImgTec of its engineers over the last few years with the intention of producing its own tech and weaning itself off of ImgTec licensing fees and royalty payments.
 
On the topic; how did AMD and IBM handle it for 360?
 
Hardware is only one piece of a console platform. If Sony hired the software and tools guy from Caustic/ImgTec, the most obvious assumption would be that it was for him to build Sony's new software and tools.
 
Next gen won't be 30 FPS or 60. It'll be VRR.

Stop trying to gatekeep "good videogames" through the one feature you like, people. We should have grown beyond that by now.


Lol shit. Okay.
What about the Intel+AMD chip? Is this similar in nature? Or is the PowerVR RT block too specialized, such that the level of integration needed is much more complex?

https://siliconangle.com/2017/11/06/rival-chip-makers-intel-amd-join-forces-take-nvidia/

That's just a discrete GPU chip AMD sold to Intel, which is then connected to an actually very boring 4-core Kaby Lake through a PCIe 3.0 x8 bus.

The only true technology sharing I see in there is AMD adopting Intel's EMIB to connect to the HBM2 stack, instead of an interposer.
 
On the topic; how did AMD and IBM handle it for 360?
I remember hearing that with the original Xbox, Microsoft had signed deals with Intel and nVidia that mandated a certain price per chip, and it never decreased with age. It was an oversight on MS's part, because by the end of the generation the core components of their console cost exactly the same as they did at the start, while the console had price cuts to remain competitive. That was part of the reason MS was so eager to move on to the 360. Later, when backwards compatibility (Xbox to 360) was announced, there was an interview where someone stated that one of the reasons BC was a late addition was that certain things (texture formats or framebuffer formats, something like that) were patented by nVidia. I think the biggest thing MS learned from their first console was about licensing and preparing for the future, and I think that's part of the reason why 360 BC is so good on Xbone.

With that history, I think however it was handled on 360, Microsoft was much more aware of technology licensing, and they were careful not to mess up that part of it. It might also be possible that MS owns Xenos' design outright; there has never been a PC part quite like it.
 
I wish devs would put a 30fps better-graphics mode in their 60fps games. I love dem gfx and don't mind 30fps locked with good motion blur :p
 
... except for @Silent Buddha, who likes 120.

Well, not yet, but I'm definitely planning for the future. :)

With temporal reconstruction likely becoming a large focus for next gen, 60 FPS would be like this generation's 30 FPS, and 120 FPS would be like this generation's 60 FPS.

I can already see this in titles like Control, where the temporal artifacts at 60 FPS make it look like a 30 FPS game (i.e. horrible, IMO). It's one of the reasons I'm glad it's an Epic store exclusive (so I'm not tempted to buy it), as I'd be extremely unhappy playing it at 60 FPS due to the temporal artifacts. 120 FPS is what I'm shooting for next gen.

So I haven't gone 120 FPS yet, but I plan to be at 120 FPS before or soon after the next-gen consoles hit, in preparation for how games will likely be rendered.

Temporal reconstruction next gen is going to look absolutely horrendous at 30 FPS, IMO.
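
To put rough numbers on that, here's a hypothetical sketch; the 8-frame history length is an assumption for illustration, not taken from Control or any specific TAA implementation:

```python
# How far back in time a temporal-reconstruction history buffer reaches
# at different frame rates, assuming a fixed number of accumulated frames.
HISTORY_FRAMES = 8  # assumed history length; varies per implementation

for fps in (30, 60, 120):
    frame_ms = 1000.0 / fps
    window_ms = HISTORY_FRAMES * frame_ms
    print(f"{fps:3d} fps: {frame_ms:4.1f} ms/frame, "
          f"history spans {window_ms:5.1f} ms of motion")
# 30 fps: 33.3 ms/frame, history spans 266.7 ms of motion
# 60 fps: 16.7 ms/frame, history spans 133.3 ms of motion
# 120 fps:  8.3 ms/frame, history spans  66.7 ms of motion
```

The shorter the window, the less stale the accumulated samples are, which is why the same reconstruction technique ghosts less at 120 FPS than at 60.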

Regards,
SB
 
It crops up every so often and the only conclusion we reach is that some of us like 60fps, whereas some of us are fine with 30... except for @Silent Buddha, who likes 120.

Well, it's also true that in motion a 60fps title does look better than a 30fps one. DF has even commented on that a few times; their Halo: Reach PC analysis says animations etc. look better at 60fps. Frame rate is a part of graphics. I mean, devs could do a SotC and prefer graphics at 15 to 20fps, or even lower.

60 FPS would be like this generation's 30 FPS

Hope so; 30fps with the amount of motion blurring we got doesn't look all that nice to me. Some games even dip (way) below that target, especially on the base platforms. After the PS2, which was mostly about 60fps in AAA games, we somehow went to 30. Maybe with the PS5 we will see an upswing again.
 
If you watch the Project Scarlett reveal trailer you will notice that:

1. When the NVMe module is shown, the module is standing on its end, as though it were being supported by something. If you look at the top of the module, you can see what look like two screw heads. Eventually the video reveals that there is indeed something behind the NVMe module, which I suspect is a hard disk drive. So an SSD + HDD combo.

2. Looking at the part of the video showing the GDDR6 chips, one can infer the number of memory chips. Counting counter-clockwise, starting from the top-left corner: 3, 3, 3, 1, 4, for a total of 14 chips. If 1GB chips are used, that would mean 14GB of RAM; if 2GB chips are used, 28GB, like for a devkit.
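
A quick sketch of that arithmetic, assuming standard GDDR6 parts with a 32-bit interface per chip (the chip count is from the footage; the densities are guesses, not confirmed specs):

```python
# Infer memory capacity and bus width from the chip count,
# assuming uniform density and a 32-bit GDDR6 interface per chip.
CHIP_COUNT = 3 + 3 + 3 + 1 + 4   # counted from the trailer: 14 chips
BITS_PER_CHIP = 32               # standard GDDR6 interface width

for density_gb in (1, 2):
    capacity = CHIP_COUNT * density_gb
    bus_width = CHIP_COUNT * BITS_PER_CHIP
    print(f"{density_gb} GB chips -> {capacity} GB total, {bus_width}-bit bus")
# 1 GB chips -> 14 GB total, 448-bit bus
# 2 GB chips -> 28 GB total, 448-bit bus
```

Mixing densities would of course land the total somewhere in between.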
 
You are late to the party. Some have already spotted the chip models; both 1GB and 2GB are used.
 