Deleted member 11852
Guest
Haha, I think we're all asking that! Maybe the PlayGo system is part of the reserved RAM on PS4. Sony uses a lightweight BSD system; sometimes I ask myself what they are doing with all the reserved RAM...

Xbox One shows why we may not get huge memory: they opted for 3GB for the OS and apps. As noted, apps tombstone and close, and a bit more memory would have been nice. The main memory is DDR3, so no doubt they could have gone huge if they wanted, but cost gets in the way. They want to make the BOM as low as possible.

But unlike Sony, we know that Microsoft has a plan for some of their OS memory reservation, which is support of universal apps. That's the frustrating thing about Sony's 3.5GB reservation - it serves no clear purpose.
8 gigs of DRAM raises the hackles of OS developers, but breaking double or triple digits in capacity with volatile non-ECC memory makes a lot of system designers paranoid.

I manage a server farm where none of the servers has below 512GB and where we are able to monitor ECC bit-error detection over time, and error numbers are low. But we have calculated that in real terms - that is, the cost of the bits in a server given over to parity error detection rather than actual storage (one bit per 64-bit word), plus re-running a computation if an error is detected - non-parity RAM would serve us better, combined with software error detection, which we're mandated to have as well.
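To make that concrete, here is a minimal sketch of checksum-and-retry software error detection (the function names and the cheap checksum are purely illustrative, not the actual mandated scheme):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch: checksum the inputs, run the computation, then
 * re-verify the inputs. If the checksum no longer matches, a bit may have
 * flipped mid-run, so the result is discarded and the work is re-run. */
static uint64_t checksum64(const uint64_t *buf, size_t words)
{
    uint64_t sum = 0;
    for (size_t i = 0; i < words; i++)
        sum ^= buf[i] + (uint64_t)i;  /* cheap mix; a real scheme would use a CRC */
    return sum;
}

uint64_t compute_with_retry(uint64_t (*compute)(const uint64_t *, size_t),
                            const uint64_t *input, size_t words)
{
    for (;;) {
        uint64_t before = checksum64(input, words);
        uint64_t result = compute(input, words);
        if (checksum64(input, words) == before)
            return result;  /* inputs stayed intact: accept the result */
        /* mismatch: likely memory corruption, so re-run the computation */
    }
}
```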
I am not sure consumer devices can count on the power quality of service for a server farm, or what level of software error detection we're expecting from game consoles.
I'm curious who is getting paranoid about this. Personally, the only machine with 8GB in my house is an 11" MacBook Air! Everything else has 16GB or 32GB. Obviously the RAM supplier, your electromagnetic environment and power supply are factors for big data environments.

Linus Torvalds is one. He used the recent Rowhammer exploit disclosures to hammer on that point again, although he's been a major proponent for more pervasive ECC for far longer.
There is no commercially available equipment in the server farm, and at the memory capacities that we buy, non-ECC RAM is impossible to source.
Linus Torvalds is one. He used the recent Rowhammer exploit disclosures to hammer on that point again, although he's been a major proponent for more pervasive ECC for far longer.
Can you link? My, admittedly dated, reading of the row hammer exploit was that it was limited to very specific hardware and could be fixed with better hardware rather than paranoia about large RAM configurations.
Indeed it could work on low-RAM configurations, and the density of the RAM seemed irrelevant.

The time frame for the most heavily affected manufacturers is rather late in the game relative to when Linus Torvalds started campaigning for ECC as standard. The emphasis on density, which means lower voltages, tightly packed rows, and stretched refresh periods, has made it easier to physically compromise the devices, particularly given the cost pressure on commodity products. Stacked DRAM worsens a number of elements, such as how it uses a single array to service a whole burst - which means you can't tack on extra chips for ECC bits. Its density and thermal environment are also worse than they would be for DIMMs.
Torvalds' point was that row hammer would have had serious problems getting off the ground if the integrity of DRAM data had been taken seriously as capacities became so large.
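For reference, the core access pattern from the published row hammer work is tiny - a sketch along these lines (picking aggressor addresses that actually map to different rows of the same DRAM bank is the hard, hardware-specific part, and is omitted here):

```c
#include <emmintrin.h>  /* _mm_clflush (SSE2) */
#include <stdint.h>

/* Roughly the access pattern described in Google's Project Zero write-up:
 * two addresses in different rows of the same bank are read repeatedly,
 * with clflush forcing each read to go to DRAM instead of cache. On
 * vulnerable parts, activating the aggressor rows fast enough can flip
 * bits in a victim row between refreshes. */
static void hammer(volatile uint64_t *addr_a, volatile uint64_t *addr_b,
                   long iterations)
{
    for (long i = 0; i < iterations; i++) {
        (void)*addr_a;                     /* DRAM read, activates row A */
        (void)*addr_b;                     /* DRAM read, activates row B */
        _mm_clflush((const void *)addr_a); /* evict so the next read misses */
        _mm_clflush((const void *)addr_b);
    }
}
```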
While he has a point, I don't know that these types of exploits are high on Microsoft's and Sony's issue lists.

The DRM-happy platform holders have a vested interest in mitigating exploits that could lead to access to the privileged stores or system reserve that might be holding their validation signatures and encryption keys. The monetary incentive for getting access is significant for well-funded criminal interests. OtherOS was enough motivation to compromise the PS3, and this was helped in part by successful bus glitching - something not related to row hammer, but another case where assuming all is physically well compromises assumptions made by system software.
The vulnerability itself is limited to specific hardware configurations, and being able to purposefully exploit this requires specific knowledge of the system in question.

I suppose it depends on what is meant by specific, when one component of this specificity is every DIMM manufactured for years on end. The consoles are very specific hardware configurations, and there are those that will learn them very well.
I also understand that rowhammer qualification testing is now becoming more common among OEMs, and there's no reason console manufacturers couldn't specify this qualification to their DRAM supplier.

Until there's another exploit they didn't think to test for. ECC and system monitoring would have provided more defense in depth, because the system would have been able to correct, or at least detect, the mass of errors related to physical manipulation of the system, which would allow it to clamp down on the iteration rate.
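As a sketch of what the monitoring half could look like on commodity hardware - assuming a Linux machine with EDAC, which exposes per-controller corrected-error counts in sysfs; the burst threshold below is arbitrary:

```c
#include <stdio.h>
#include <unistd.h>

/* Poll the EDAC corrected-error counter and treat a sudden burst of
 * corrected errors as a sign of physical tampering or failing hardware.
 * A real system reserve could respond by throttling or locking down. */
int main(void)
{
    long prev = 0;
    for (;;) {
        long ce = prev;
        FILE *f = fopen("/sys/devices/system/edac/mc/mc0/ce_count", "r");
        if (f) {
            if (fscanf(f, "%ld", &ce) != 1)
                ce = prev;
            fclose(f);
        }
        if (ce - prev > 100)  /* arbitrary burst threshold */
            fprintf(stderr, "corrected-error burst: %ld new errors\n", ce - prev);
        prev = ce;
        sleep(10);            /* poll every 10 seconds */
    }
}
```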
Torvalds' second post is definitely interesting; he suggests that data corruption is a big problem, although there's no data to support the claim, nor is there any context. Is he talking about an 8GB machine working for a day or running for a year? Our data suggests differently, although it's quite possible that ECC DRAM is less error-prone than non-ECC DRAM - excepting the parity detection built in.

It's possible it's some indeterminate amount of time across the many devices that make up the full range of the platform his organization winds up needing to bug-fix or research.
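The day-versus-year distinction is the crux, and a back-of-envelope makes it visible. Note the FIT rate below is an assumed, purely illustrative number; published DRAM soft-error rates vary by orders of magnitude with process, altitude, and methodology:

```c
#include <stdio.h>

/* FIT = failures per billion device-hours. The 50 FIT/Mbit figure is an
 * assumption for illustration only, not a measured or published value. */
int main(void)
{
    const double fit_per_mbit = 50.0;        /* assumed, not measured */
    const double mbits = 8.0 * 1024 * 8;     /* 8 GB = 65536 Mbit */
    double errs_per_hour = fit_per_mbit * mbits / 1e9;
    printf("per day : %.3f expected errors\n", errs_per_hour * 24);
    printf("per year: %.1f expected errors\n", errs_per_hour * 24 * 365);
    return 0;
}
```

Under that assumed rate the same machine sees a small fraction of an error per day but dozens per year, which is why the missing timeframe makes the claim hard to evaluate.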
Google did comprehensive testing and found that not all machines were susceptible, although they anonymized the results. Like I said, specific configurations.

They declared their sample size was not representative, and they could not determine the year of the DRAM's manufacture. There are laptops from the same vendor, same CPU, and same year with different results. From those numbers, 5 of 8 models are compromised, and it is possible for there to be multiple false negatives for the same configuration.
Last post, because this is an egregious thread derail. There's no commonality which pinpoints where the issue lies, although I wouldn't like to own Model #5 with DRAM vendor B.
Do consoles support virtual memory?
PS4 has (or had at one point) 512MB of "paged" RAM, so that may be a yes, but it was all a bit vague.
Well yes, but that's what I'm asking: do games at the moment have an absolute memory limit that they must never exceed, or are they free to use a swap file?
Could a dev, if they so wished, release a game that needed 10GB of memory to run (for example)?
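On a desktop OS the answer would clearly be yes. A POSIX sketch of "more address space than physical RAM" looks like this (ordinary mmap, not console SDK code; it assumes a 64-bit build):

```c
#include <stdio.h>
#include <sys/mman.h>

/* Reserve 10 GB of anonymous virtual memory. Pages are only backed by
 * physical RAM when touched; once RAM runs out the OS pages to disk. */
int main(void)
{
    size_t size = 10ULL * 1024 * 1024 * 1024;  /* 10 GB of address space */
    char *p = mmap(NULL, size, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (p == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    /* Touching every page would force them all to be backed, swapping to
     * disk once RAM is exhausted - workable on a PC, but disastrous for
     * frame times, which is one reason console games usually treat RAM
     * as a fixed budget. */
    p[0] = 1;
    munmap(p, size);
    return 0;
}
```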