Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

Status: Not open for further replies.
Use the entire 1TB SSD as a cache for games from either external or optical storage - least recently used is the first to be ejected - with an option to 'lock' a game in. DD (digitally distributed) games are locked by default and need to be removed manually, with OS prompts for the user to OK cleanup when appropriate.

Covers all use cases and preferences, requires minimal user input, and minimises unnecessary writes!

And while you're at it, distribute the games with an emulator/hiberfile-type "save state" at the start screen so the game is ready to play after about three seconds. You'll need them anyway for your cloud streaming instances!
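A policy like this is essentially an LRU cache with pinning. A minimal sketch in Python, assuming a made-up `GameCache` API and illustrative sizes (nothing here is a real console interface):

```python
from collections import OrderedDict

class GameCache:
    """Sketch of the SSD-as-cache policy: least recently played games are
    evicted first, and 'locked' (pinned) games are never evicted."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.used_gb = 0.0
        # title -> [size_gb, locked]; dict order tracks recency,
        # with the least recently played title at the front.
        self.games = OrderedDict()

    def play(self, title, size_gb, locked=False):
        if title in self.games:
            self.games.move_to_end(title)      # refresh recency on launch
            return
        self._evict_until_fits(size_gb)
        self.games[title] = [size_gb, locked]
        self.used_gb += size_gb

    def lock(self, title, locked=True):
        self.games[title][1] = locked          # e.g. DD titles locked by default

    def _evict_until_fits(self, size_gb):
        for title in list(self.games):         # front = least recently played
            if self.used_gb + size_gb <= self.capacity_gb:
                break
            sz, locked = self.games[title]
            if not locked:                     # never evict pinned games
                del self.games[title]
                self.used_gb -= sz
        if self.used_gb + size_gb > self.capacity_gb:
            raise RuntimeError("not enough unlocked space; prompt user to clean up")
```

With a 100GB cache, playing a 60GB game then two 30GB games silently evicts the first one, unless it was locked, in which case a different title goes instead.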
 
I don't expect a 60 CU GPU; I expect anything from 42-48 CUs. I just don't see next gen having the same CU count as the Xbox One X.

I realise this is only a rumour, but according to an AdoredTV source, a Navi 20 with 64 CUs will have a TDP of 225W and a 60 CU Navi 20 will have a TDP of 200W.
 
I don't know if this is really true. Personally, I think the only console hardware that launched already outclassed by PC hardware has been the Xbox One and PS4. Sure, PC hardware has outpaced consoles, but if you look back at the PC hardware available in, say, 1994, the PS1 would have been considered high end. 3dfx Voodoo was at least a year away, PC hardware lacked the advanced video decompression of the PS1, and its lighting and geometry engine was very impressive for its time. The PS2 was mighty impressive too: its pixel fillrate was insane, as was its geometry engine. Xbox launched with what amounts to a GeForce 3, alongside the launch of the GeForce 3 itself; even then, Xbox had twice the vertex shaders. The 360 had a tessellator, unified shaders, 4-sample-per-pixel-per-cycle MSAA, and competitive fillrate. And the PS3... well, Cell was impressive on paper. If you go back further, consoles' ability to scroll backgrounds and draw sprites destroyed what was available on PCs when they launched.

Anyway, consoles used to launch with high-end hardware. They just don't anymore. I think it's partly because the PS3/360 generation was so long that even midrange parts were a huge upgrade, plus the pace of PC hardware, especially at the ultra high end, has really gotten out of hand. Traditionally (since the launch of 3D hardware) the highest-end PC graphics cards were close to the price range of a console at launch. An X800 XT was roughly $450 when the 360 launched, IIRC, and a 20GB 360 was $399.99. Now we've got ultra-high-end PC graphics cards at $1300+. There's no way a console can launch at that price unless they want to get 3DO'd. And the 3DO also launched with impressive hardware for its time.

Nah, consoles mostly didn't match high-end PC hardware around the time of release. The PSX was towards the high end, yes, but I dunno really; how much could a fast 166MHz Pentium do on its CPU alone...
The PS2 was impressive, yes, but high-end PC matching? I dunno either. A high-end PC between March and autumn 2000 would be a P3 1GHz / Athlon 1.4GHz, 256MB or more main RAM, and a GeForce 2 GTS to Ultra 64MB. The PS2 kinda needed that fillrate, though, to really get things done.
The GeForce 3 launched in spring 2001, the Xbox in November 2001. NV2A was high-end for the time but constrained in areas where the GF3 wasn't; NV2A had twin vertex shaders. The Radeon 8500 also had twin vertex shaders and was probably more advanced than NV2A. For CPU, the 733MHz P3 was outdated by then, as was 64MB total RAM.
I wouldn't say directly that the OG Xbox matched a high-end PC, perhaps in its GPU department, but I wonder how an R8500 fared.

The X360 launched close to the X1900, which I think had more raw power to play with, along with much more RAM. The PS3's hardware came a year too late to be matching high-end parts: Intel Core 2 Quads were around, and the 8800 GTX. The 360 seemed more capable overall.

For the '80s and early '90s, it depends on whether you consider the Amiga etc. to be PCs; those outclassed consoles.
All in all, I think the highest-end PC parts outclassed consoles by a large margin for the most part, but at a much higher price; sometimes consoles were close or matching, like the PS1. I agree that consoles seemed to be more 'high-end' back then compared to now. They used more exotic hardware as well. Whatever can be said about the PS4/Xbone, they had a large amount of RAM, whilst older gens had tiny amounts.

Anyway, I think this is getting off-topic ^^
 

You've got this wrong. Consoles typically focused their hardware on specific gaming-related functions as opposed to general-purpose processing. That's why a Mega Drive (1988) could nail a PC in terms of locked-60fps parallax-scrolling joy well into the '90s.

And the Dreamcast from 1998 absolutely took a flying dump on anything from PC land that year.

It's the ever-increasing dollar and power budget of the high-end PC market since then that has allowed it to support silicon beyond the wildest dreams of a console vendor. But until the late 2000s, that comprehensive high-end ownership sure as shit wasn't a thing the PC had.
 
The problem is that the oversupply is going to be temporary.
Oversupply is definitely temporary, but NAND makers won't let their offerings stagnate either. Adoption of smaller nodes will get them higher density per layer, and all makers are planning increasingly taller stacks of layers.
So production price per GB will also go down.
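The density argument is simple arithmetic: more layers means more GB per wafer for roughly the same processing cost. A back-of-envelope sketch with made-up numbers (the wafer costs and densities below are illustrative, not industry figures):

```python
# Back-of-envelope: more layers and denser cells -> more GB per wafer,
# so (roughly) lower cost per GB. All numbers are illustrative, not real.
def cost_per_gb(wafer_cost_usd, gb_per_wafer):
    return wafer_cost_usd / gb_per_wafer

today = cost_per_gb(wafer_cost_usd=5000, gb_per_wafer=50_000)    # $0.10/GB
taller = cost_per_gb(wafer_cost_usd=5500, gb_per_wafer=100_000)  # $0.055/GB
# Even if processing the taller stack costs ~10% more per wafer,
# doubling GB per wafer nearly halves the cost per GB.
```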
 
Xbox Scarlett: XB1X GPU with a higher clock rate, 7nm, 7 TFLOPS, Zen 2 CPU, 12 GB GDDR5 memory, $400 price.

Xbox Anaconda: 2x the XB1X GPU, 14 TFLOPS, Zen 2 with more cores, maybe Zen 3, 16-24 GB fast RAM, $500-600 price.

The most cost-saving and power-efficient solution. Technically possible.
 
Use the entire 1TB SSD as a cache for games from either external or optical storage - least recently used is the first to be ejected - with an option to 'lock' a game in. DD (digitally distributed) games are locked by default and need to be removed manually, with OS prompts for the user to OK cleanup when appropriate.

Covers all use cases and preferences, requires minimal user input, and minimises unnecessary writes!

And while you're at it, distribute the games with an emulator/hiberfile-type "save state" at the start screen so the game is ready to play after about three seconds. You'll need them anyway for your cloud streaming instances!

Forget what I said. I think you have the better solution. I am also wary of constant writing wearing down the SSD. With 1 TB you can have 8-10 games with instant loading, and the OS can prompt for swapping of games if needed.

I only hope Sony will not go for SSD + HDD as some of the "leaks" are saying. Just give us a 1TB ultra-fast SSD and leave an empty slot for a SATA 4.0 (or NVMe) drive for future expansion.
 
I see several problems with that:

- Price: a custom SSD + HDD might not be more expensive than a regular off-the-shelf 1TB SSD.
- Speed: according to Cerny, their custom solution is faster than any available SSD. As it's custom, they are not dependent on the decade-old tech mandatory in off-the-shelf HDDs, or even SSDs. And to achieve the speeds displayed in the Spider-Man demo, you'll need smart caching strategies that wouldn't be as efficient on a slower, regular off-the-shelf SSD.
- Total storage: 2TB > 1TB.
- Cheap storage expansion: if there is only a 1TB SSD (as rumoured for XB2, BTW), it won't be possible to use a cheap 4TB HDD. You'll need a bigger SSD.

And they won't make the caching strategies optional, as I read in some places, if you add an external HDD to the 1TB SSD. That's not realistic for several reasons. It's either smart caching with a custom SSD + slower HDD, or the usual loading strategies using a 1TB SSD.
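The kind of SSD-in-front-of-HDD tiering described here can be sketched as a small promotion cache: all data lives on the big slow HDD, and recently read blocks are kept on the small fast SSD so repeat reads hit the fast tier. A rough illustration (the class and its API are invented for this sketch, not any console's actual design):

```python
from collections import OrderedDict

class TieredStore:
    """SSD-as-cache in front of an HDD: reads are served from the SSD
    when possible; misses go to the HDD and are promoted into the SSD."""

    def __init__(self, ssd_blocks):
        self.ssd = OrderedDict()      # block_id -> data, in LRU order
        self.ssd_blocks = ssd_blocks  # SSD capacity, in blocks

    def read(self, block_id, hdd_read):
        if block_id in self.ssd:               # fast path: SSD hit
            self.ssd.move_to_end(block_id)
            return self.ssd[block_id]
        data = hdd_read(block_id)              # slow path: fetch from HDD
        self.ssd[block_id] = data              # promote into the SSD cache
        if len(self.ssd) > self.ssd_blocks:
            self.ssd.popitem(last=False)       # evict least recently read
        return data
```

The trade-off the post describes falls straight out of this: the SSD only needs to be big enough to hold the working set, while total capacity comes cheaply from the HDD behind it.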
 
Nah, consoles mostly didn't match high-end pc hardware around the time of release.
Even if high-end PCs had more raw power than the console, what they delivered in games was inferior. Games targeting the console instead of a weaker base PC spec (there was far less scaling up of assets and resources back then; higher resolution with tearing or judder, more shadow-casting lights, and AA were about it) could do way, way more thanks to a large install base with a solid 'lowest spec' and low rendering overheads. When you couple that with what the console hardware was actually capable of, comparing end-of-lifecycle games against what the best PC at launch had to offer, it's obvious why consoles competed so strongly against PCs in gamers' minds: they produced better games on cheaper (if less flexible, and you couldn't do your homework on them either) machines.

They were two different machines on two different evolutionary paths, and the specialisation of games consoles made them better for the task.
 
I say (fantasy)... an HDD (large, maybe 3 TB), plus a persistent RAM buffer of 32 or even 64 GB of cheap DDR RAM, and 16 GB of HBM2 (or HBM3) as main memory. The console is designed to keep the DDR powered while idle. While idle it may be crunching numbers, low-clocked, to earn bitcoins and such... They'd say it's a revolutionary machine.
 
According to Cerny, their custom solution is faster than any available SSD. As it's custom, they are not dependent on the decade-old tech mandatory in off-the-shelf HDDs, or even SSDs. And to achieve the speeds displayed in the Spider-Man demo, you'll need smart caching strategies that wouldn't be as efficient on a slower, regular off-the-shelf SSD.

It's either smart caching with a custom SSD + slower HDD, or the usual loading strategies using a 1TB SSD.

OK, so it's a custom SSD. Is it possible for that custom SSD to be 1 TB? That's what I was going for. Or is it cost prohibitive?
An additional HDD or SSD can be added later.
 
Which defeats your request: if you can add a 'slow' consumer-grade device, the system must still deal with caching and swapping in and out. If it has that baggage, what is the gain of a 1TB SSD versus a smaller one plus a 1TB HDD out of the gate?

Same caching employed, still 1TB of storage?
 
- Speed: according to Cerny, their custom solution is faster than any available SSD. As it's custom, they are not dependent on the decade-old tech mandatory in off-the-shelf HDDs, or even SSDs. And to achieve the speeds displayed in the Spider-Man demo, you'll need smart caching strategies that wouldn't be as efficient on a slower, regular off-the-shelf SSD.

What are you talking about? I've shown in this thread that the speeds required for the Spider-Man fast-travel demo are nothing special. They only loaded anywhere from 600 MB to 2.25 GB, figures easily hit by the SSDs of today. What's really needed is faster decompression and CPU support.
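The raw I/O math backs this up. With ballpark 2019 drive throughputs (assumed figures, not measurements), even the largest payload mentioned is only a few seconds of reading:

```python
# Rough load-time math. Drive throughputs are ballpark figures for
# 2019-era hardware, not measurements.
def load_seconds(payload_gb, throughput_gb_s):
    return payload_gb / throughput_gb_s

SATA_SSD = 0.55    # ~550 MB/s SATA SSD
NVME_SSD = 3.0     # ~3 GB/s NVMe SSD

for payload in (0.6, 2.25):    # the 600 MB - 2.25 GB range from the post
    print(f"{payload} GB: SATA {load_seconds(payload, SATA_SSD):.1f}s, "
          f"NVMe {load_seconds(payload, NVME_SSD):.2f}s")
# Even the worst case (2.25 GB over SATA) is ~4s of raw I/O, so the rest
# of the wall-clock load time is decompression and CPU-side setup.
```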
 

Maybe just an adapted CFexpress memory card?
https://www.notebookcheck.net/Sony-...s-for-highest-read-write-speeds.411009.0.html
Gen 3 can go up to 4 GB/s.

Sony is already producing Gen 2 cards with 1700 MB/s read and 1480 MB/s write.
 
Xbox Scarlett: XB1X GPU with a higher clock rate, 7nm, 7 TFLOPS, Zen 2 CPU, 12 GB GDDR5 memory, $400 price.

Xbox Anaconda: 2x the XB1X GPU, 14 TFLOPS, Zen 2 with more cores, maybe Zen 3, 16-24 GB fast RAM, $500-600 price.

The most cost-saving and power-efficient solution. Technically possible.

I don't think that's all that cost effective.

I'm a believer that for it to be cost effective, Lockhart and Anaconda need to be the same SoC, with Lockhart being a salvaged chip. So maybe the GPU specs would be in line with this, but the CPU architecture will be the same and the RAM type will be the same as well.

Developing two SoCs means paying for the development of two, no matter how trivial the differences between them. It's easier to cut down the same chip post-silicon.
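The salvage argument can be illustrated with the standard Poisson die-yield model: some fraction of dies always comes out with defects, and selling those with a few CUs fused off turns scrap into a second SKU. A sketch with illustrative numbers (the defect density and die area below are made up, not real process figures):

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Illustrative numbers: a ~300mm^2 SoC on a process with 0.2 defects/cm^2.
full = poisson_yield(die_area_cm2=3.0, defects_per_cm2=0.2)
print(f"perfect dies: {full:.0%}, defective: {1 - full:.0%}")
# Roughly half the dies come out perfect (the "Anaconda" bin); much of
# the defective remainder can still ship as a cut-down part ("Lockhart")
# instead of being thrown away.
```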
 
I'm a believer that for it to be cost effective, Lockhart and Anaconda need to be the same SoC, with Lockhart being a salvaged chip.

A salvaged chip is not cost effective. That would mean a cost reduction, but not something cost effective.
Lockhart could have lower, the same, or even higher demand than Anaconda. If the chips were the same, and Lockhart was dependent on defective Anaconda chips, how would you respond to demand?
 