It's not hard, everyone 'gets that'. It's the amount that matters, and the justification. These consoles are an engineering challenge, developed by rational people employing logical thinking to set their targets and numerous compromises (we hope). At every step along the way, you can look at the choices and compromises and nod and say to yourself, "okay, I see why they did that. Maybe not what I'd have done, or maybe so, but it makes sense." Low-powered consoles, BC, monster consoles, little RAM, eDRAM, etc., all make sense within the boxes they fit. Reserving 1 GB when your OS uses 500 MB makes sense, for legroom.

But reserving 1.5 GB for an OS that isn't a full-fledged PC, adding 1 GB for 'apps' when there's nothing apparent that'd ever need that much, and then having another 1 GB loose that no one's sure what to do with, so they let the devs access it but may pull it away (8 GB of unified RAM to make it easier for the devs, remember, and not 4.5 GB of unified RAM plus 1 GB partitioned off where you have to write your code with attention to whether that RAM gets pulled out from under you) doesn't make sense (with the information currently available).
For me, the debate isn't whether games would use 7 GB or not, or whether there should be room to grow or not. It's about why 8 GB of GDDR5 was chosen if only half of it is available, and whether the non-gaming content is being efficiently developed or is wasteful software-engineering bloat. Why not a physical RAM split between game and OS (GDDR5 and DDR3), giving devs a unified memory pool because they only operate in the game space? That'd have been a much cheaper solution. Why 1.5 GB for the OS, especially when 512 MB was enough when the system only had 4 GB total? Why is 1.5 GB not enough to do everything you could want to do? Why is 1 GB additionally reserved?
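To lay the arithmetic out plainly (taking the figures being discussed at face value, and they may yet change: 1.5 GB OS reservation, 1 GB for 'apps', 1 GB flexible, out of 8 GB total):

8 GB total − 1.5 GB (OS) − 1 GB (apps) − 1 GB (flexible) = 4.5 GB guaranteed to the game, or 5.5 GB if that flexible gigabyte is never pulled back.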
Taking PS3 as an example, at launch we all said, "why on earth do they need so much RAM?" And they didn't. They didn't need 120 MB, only 50. Now you can say that it was a mistake to drop the RAM and miss out on cross-game chat, but that shows a lack of foresight, and CGC wouldn't have needed another 70 MB to implement either. Sony should have had CGC in mind from the beginning (and probably did, because we had it hinted at numerous times, and there's a possibility the lack of CGC had nothing to do with RAM and everything to do with policy makers).
What I don't get is how portions of the B3D populace can see a design choice and accept it without question, curiosity, or solid, logical justification. "Just in case" could be applied to Sony putting in 16 GB and only allowing 4 GB to devs. There is a line to be drawn where 'enough is enough'. I want to know the justification for 2.5+ GB for the system.