Next Generation Hardware Speculation with a Technical Spin [2018]

Status
Not open for further replies.
liolio said:
actually the 8GB came pretty late in the PS4 design

Did it really? What would constitute 'late'? Because I highly doubt it was just before the PS4 reveal in February 2013, as it is often made out to be.

Question for devs: how early in a game's development does the RAM budget need to be determined?
 
What would constitute 'late'?
As I've seen it described (I have no actual insider knowledge), the devkit was always meant to have 8GB RAM, and the retail model 4GB. Then this changed at some late-ish date, exactly when I've never seen anyone pin down precisely, but then again it's not a subject that I've spent any time at all researching either, so... :p
 
Yep, so the 8GB model was already designed for the dev kits, so it was kind of easy to make it the retail model.

Devkits usually do have more memory; Sony has 16GB devkits available now...
 
I'm sure everyone would be happy with 24-32 GB RAM for a less than $500 small form factor machine coming out in about 2 years or so but that seems less likely with the stagnation in memory. The Titan cards have been sitting at 12 GB for a long while now.

With so many industries needing DRAM, China trade war uncertainties, and the miner craze, it's better to keep expectations in check and maybe be pleasantly surprised.
 
Nov 2012 would be my guess for when they decided to go with 8GB. It was surely always a possibility left open, and there had to be a set milestone by which it had to be final. Adam Boyes talked about the meeting where they decided it was necessary, but refused to give any hint about when it happened.

The moment they decided had to be at least a couple of months before they dropped the speed to 176GB/s, which leaked in January. We only learned later why that speed drop happened: the same parts rated at 192GB/s have to run at 176GB/s in clamshell mode.

How much time before January is unknown, but it would have to be enough for the procurement contract to be fully negotiated and signed, and the new contract would be doubled in volume, which is over 200 million chips per year to secure from Samsung. Then enough time for writing and distributing new specs once it was sure to happen, then the leak from the "swiss cheeze" studio giving it to VGLeaks, then VGLeaks hyping the leak until the January release.

A full year before launch is a comfortable lead time for procurement. It's three months before the unveiling, which isn't enough time for a final enclosure design (different because of the need to cool both PCB sides, more power, changed EMI validation, etc.). Which would explain why the enclosure wasn't ready in February.
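The 176 vs. 192GB/s figures and the chip-count estimate above are simple arithmetic; a quick sanity check (the 15M consoles/year figure is an assumption for illustration, not a quoted number):

```python
def gddr5_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak bandwidth in GB/s: per-pin rate times bus width, divided by 8 bits/byte."""
    return data_rate_gbps * bus_width_bits / 8

# 4Gbit GDDR5 rated at 6.0 Gbps derates to 5.5 Gbps in clamshell (x16) mode
print(gddr5_bandwidth_gbs(6.0, 256))  # 192.0 GB/s, the originally leaked spec
print(gddr5_bandwidth_gbs(5.5, 256))  # 176.0 GB/s, the final PS4 figure

# 8GB built from 4Gbit (512MB) chips needs 16 chips per console;
# at an assumed ~15M consoles/year that lands in the 200M+ chips/year range
chips_per_console = (8 * 1024) // 512
print(chips_per_console * 15_000_000)  # 240000000
```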
 
Did it really? What would constitute 'late'? Because I highly doubt it was just before the PS4 reveal in February 2013, as it is often made out to be.

Question for devs: how early in a game's development does the RAM budget need to be determined?
They never said exactly when, just that it was something they were evaluating. It came late enough anyway that I remember a presentation about AAA games (Killzone, IIRC?) that were designed with 4GB in mind.
As for making things clear: your reaction, without much of a conversation and with an emotional tone, does not reflect well on your motives. It ain't a fan war.

EDIT
The point was more about how much we really need. 8GB came out great: comfortable, with room for the OS and services, though I suspect devs got a little "loose" (saving man-hours is always welcome, not a harsh comment; it is business, and these are pretty large projects to manage), which you lose in loading times here and there. Killzone could have run on 4GB fine. What I'm saying is that it's not like silicon has become cheap and console manufacturers are operating outside of economic rules, even with some level of subsidizing. I don't know how much memory price fluctuations are affecting Sony and MSFT at the moment. What I see is that Sony did not think it worth it to increase the RAM amount in the Pro; I guess there is price/cost but also the ROI for such a move.
 
PSVR 2 could be using inexpensive ToF. Cost has always been a problem because of the size of the sensor, meaning expensive silicon and lenses. Sony is launching a ridiculously small VGA ToF sensor for smartphones. I guess that means probably below $20 on the BOM. VGA would be enough spatial resolution; sensitivity will be a bigger concern.

https://www.google.ca/amp/s/www.bar...wave-of-3-d-sensing-says-bernstein-1521247768
On smartphones, the highest resolution we have seen for a small module was only 320 x 180. But here, there was a major breakthrough by Sony. Sony managed to reduce the pixel size of the time-of-flight image sensor significantly, and they are now able to produce a VGA resolution, the small sensor that is capable of being used on a smartphone. And they are ready to ship the sensor by the end of this year.
 
I honestly wonder if there's actually any need for more than 16-24GB. Sure, the 7th gen was starved for RAM, but those consoles were stuck at 512MB, which was as much as a single high-end graphics card had at the time.
Is there any PC game demanding more than 16GB of RAM nowadays?

Right now, console makers will have to make do with the ongoing market conditions, and that means saving up on RAM costs.
For example, 8GB HBM + 16GB DDR4 at 128bit. Or even 8GB HBM + 12GB LPDDR4X at 96bit.


Did it really? What would constitute 'late'?
To the point of launch-window games only using 3GB (1GB OS + 3GB game, IIRC).

Killzone could have run on 4GB fine
I think Killzone Shadow Fall specifically ran on 4GB, among others, for the reasons mentioned above. I probably won't be able to find the presentation where that was mentioned, though.
 

To the point of launch-window games only using 3GB (1GB OS + 3GB game, IIRC).


I think Killzone Shadow Fall specifically ran on 4GB, among others, for the reasons mentioned above. I probably won't be able to find the presentation where that was mentioned, though.
Around 4.6GB
 
VGA res, what does that buy you in terms of distances and accuracy and whatnot, anyone know?
Difficult to figure out, because the output is usually smoothed and averaged. Noise and frame rate are also problems. I doubt it can be used by itself; it should be combined with a higher-resolution RGB camera. VR is missing full-body tracking at a high enough frame rate; the rest is pretty much fine with simple cameras.

If they can make VGA res in a smartphone camera module, maybe they can do higher in a webcam format, or a single-chip RGB-Z sensor. From what I understand, they are shrinking a ToF cell coming from their SoftKinetic acquisition and applying it to their BSI sensor technology, which should make it extremely sensitive and capable of high frame rates.
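As a rough sense of what VGA buys you: with a pinhole-camera approximation, the lateral footprint of one pixel grows linearly with distance. The 70° field of view below is an illustrative assumption, not a published spec for this sensor:

```python
import math

def lateral_resolution_mm(distance_m, hfov_deg=70.0, h_pixels=640):
    """Approximate lateral footprint of one pixel (mm) at a given distance,
    assuming a pinhole model and the stated horizontal field of view."""
    width_m = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    return width_m / h_pixels * 1000  # mm per pixel

# Footprint per pixel at typical room-scale tracking distances
for d in (1.0, 2.0, 3.0):
    print(f"{d} m: ~{lateral_resolution_mm(d):.1f} mm/pixel")
```

At 2-3m that works out to only a few millimetres per pixel laterally, which is why spatial resolution looks sufficient on paper and noise/sensitivity become the limiting factors.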
 
Softbank/ARM/NVIDIA (the Saudi Vision Fund holds $4 billion worth of NVIDIA.)

Masayoshi Son, CEO of Softbank/ARM, and the Saudi Crown Prince were in New York today looking for investments.


ARM and NVIDIA announced an AI partnership today.

Maybe not directly console-business relevant, but these are higher-level technology developments unfolding that will definitely shape the future.

Recall that former Qualcomm CEO Paul Jacobs was spearheading an initiative to have Softbank buy Qualcomm after the Broadcom deal got blocked by Trump. The Softbank Vision Fund has major assets that will be behind major game-changing future tech developments.
 
I think at 7nm it could be possible to just clock the PS4 Pro hardware higher while keeping full compatibility with the PS4 and the 16nm PS4 Pro (just by downclocking), and maybe get 6 TF like the One X. So we would have a PS4 Pro 2, and with little effort software houses could use this extra power to make games look even better, especially on the frame-rate front (and this could be useful for a new, maybe HDR-compatible, VR)...

As far as I know, 7nm yields are a bit uncertain, so it may be really useful to sell the lower-performing chips as the vanilla PS4 Pro and the better ones as an ultra PS4 Pro.
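The 6 TF figure is at least plausible on paper. For a GCN-style GPU, FP32 throughput is CUs x 64 lanes x 2 FLOPs/clock x frequency, so keeping the Pro's 36 CUs would need roughly a 1.3GHz clock (a sketch, using the shipped 911MHz Pro clock):

```python
def tflops(cus, clock_ghz):
    """FP32 throughput for a GCN-style GPU: CUs x 64 lanes x 2 FLOPs/cycle (FMA) x clock."""
    return cus * 64 * 2 * clock_ghz / 1000

# PS4 Pro as shipped: 36 CUs at 911 MHz
print(tflops(36, 0.911))  # ~4.2 TF

# Clock needed to reach 6 TF with the same 36 CUs
target_clock_ghz = 6.0 * 1000 / (36 * 64 * 2)
print(f"{target_clock_ghz:.2f} GHz")  # ~1.30 GHz
```

A ~43% clock bump is aggressive but in the ballpark of what a full node shrink has historically allowed.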
 
Can you please stop posting such ARM suggestions now (and by please, I mean I'll remove future OT posts and possibly issue a reply ban). That is clearly related to nVidia's existing use of ARM in mobile and automotive etc. There's no direct correlation between that and consoles, and the ARM-in-console discussion needs things like price and performance of actual silicon to be worthy of a tech discussion. So presently you're just shilling for ARM and diluting the discussion.
 
I think at 7nm it could be possible to just clock the PS4 Pro hardware higher while keeping full compatibility with the PS4 and the 16nm PS4 Pro (just by downclocking), and maybe get 6 TF like the One X. So we would have a PS4 Pro 2, and with little effort software houses could use this extra power to make games look even better, especially on the frame-rate front (and this could be useful for a new, maybe HDR-compatible, VR)...

As far as I know, 7nm yields are a bit uncertain, so it may be really useful to sell the lower-performing chips as the vanilla PS4 Pro and the better ones as an ultra PS4 Pro.
Sony said there won't be any other PS4 upgrade for the rest of the generation; only the Pro.
 
From GamersNexus: Hynix said GDDR6 is going to be 20% more expensive than GDDR5 at launch, and the cost difference will drop to 10% as production ramps up. I assume that percentage is based on the same node for both. No info on the price drop from new processes for either GDDR5 or GDDR6, but they will probably track together with that premium. It's hard to figure out, since memory manufacturers' technologies exist in a world of their own and price drops are based on supply/demand instead of production costs.

Depending on how long it takes to drop from 20% to 10%, it sounds like 16GB next gen is the most reasonable :runaway:

EDIT: From previous announcements, Hynix is using a 20nm-class process at launch, while Samsung said they would use a 10nm-class process, which they claim provides a 30% productivity gain over their previous node. It certainly explains why Samsung announced higher speeds than Hynix or Micron, and why they announced 16Gbit parts before anyone else.
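To see what that premium means in BOM terms, a toy calculation (the $/GB baseline is a placeholder assumption, not a quoted price):

```python
def pool_cost(gb, gddr5_price_per_gb, premium):
    """BOM cost of a memory pool given a GDDR5 baseline price and a GDDR6 premium."""
    return gb * gddr5_price_per_gb * (1 + premium)

baseline = 7.0  # assumed $/GB for GDDR5; placeholder, not a quoted figure

print(pool_cost(16, baseline, 0.00))  # 16GB GDDR5 baseline for comparison
print(pool_cost(16, baseline, 0.20))  # 16GB GDDR6 at the 20% launch premium
print(pool_cost(16, baseline, 0.10))  # 16GB GDDR6 after the ramp, at 10%
```

With the premium only in the tens-of-dollars range on a 16GB pool, capacity price per GB, not the GDDR6 surcharge, remains the dominant constraint.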
 
Last edited:
Amateur question here, guys: silicon wafers are used to make our processors. Are wafers also needed to make memory?
I recall reading about a 30-minute power outage at Samsung that caused some sort of production backlog.
 