I think we could manage with just 2 GB in the next generation if virtual texturing becomes the norm. Without virtual texturing, something like 8 GB would be pretty good, but then again I personally don't want to see increased level loading times. Most current games have far too long loading screens already. More memory = more data that needs to be loaded from the HDD to fill it up.
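To put a rough number on that loading-time point, here is a quick back-of-envelope sketch. The throughput figures (a consumer HDD around 50 MB/s sustained, an SSD several times that) are illustrative assumptions on my part, not measurements from any particular drive:

```cpp
// Back-of-envelope: time to fill main memory from storage at a level load.
// Throughput numbers are illustrative assumptions, not measured figures.
#include <cstdio>

int main() {
    const double hddMBps = 50.0;   // assumed sustained HDD read speed
    const double ssdMBps = 250.0;  // assumed sustained SSD read speed
    const double ramSizesGB[] = {2.0, 4.0, 8.0};

    for (double gb : ramSizesGB) {
        double mb = gb * 1024.0;
        printf("%4.0f GB: ~%5.0f s from HDD, ~%4.0f s from SSD\n",
               gb, mb / hddMBps, mb / ssdMBps);
    }
    // 8 GB at ~50 MB/s is nearly three minutes if you insist on filling
    // all of it up front; that is the load-time concern above.
    return 0;
}
```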
Maybe this should be another thread (Competing Next-Gen Rendering Approaches and Resource (RAM) Projections), but to throw this out: this generation we saw a swing from forward rendering to various deferred rendering approaches. I'm not sure how widely that was anticipated, and, looking back, how that knowledge would have changed the 2005/2006 consoles had it been better understood at the time.

I mention this because while virtual texturing may address some issues now, and may indicate a lower need for memory, I would express some skepticism. Are developers really ready to anchor themselves completely to that approach and stick with 2 GB of memory until 2020? Especially when RAM is cheap, there are so many obvious benefits beyond just storing textures (and nobody says having a lot of memory forces long load times; that's a design issue, since virtual texturing is still an option), and memory footprint limits have historically been a point of tension. Don't get me wrong, I love what devs have done with virtual texturing, but it does appear to impose some technological limits (just look at RAGE).

Just throwing that out there... now, if it were a toss-up between 2 GB of RAM plus a small fast SSD versus 4 GB of RAM with only an optical drive (and my impression is we'd be lucky to get even that scenario), the choice becomes easier. Anyhow, just thought I'd toss that out and see what you think. If you disagree we'll just have to be men and settle it in some online Trials in the coming weeks.
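For what it's worth, here's a minimal sketch of why the resident texture footprint under virtual texturing can be so small, which is what makes the "2 GB plus fast SSD" option interesting. The page size, physical cache size, layer count, and compression ratio are illustrative assumptions on my part, not figures from RAGE or any other shipped engine:

```cpp
// Rough sizing of a virtual texturing physical page cache.
// All constants are illustrative assumptions, not figures from a shipped engine.
#include <cstdio>

int main() {
    const int    physTexSide   = 4096;  // assumed physical cache texture, 4096x4096 texels
    const int    pageSide      = 128;   // assumed page size, 128x128 texels
    const int    numLayers     = 3;     // e.g. diffuse + normal + specular (assumed)
    const double bytesPerTexel = 1.0;   // roughly DXT-class compression averaged over layers (assumed)

    int    residentPages = (physTexSide / pageSide) * (physTexSide / pageSide);
    double cacheBytes    = double(physTexSide) * physTexSide * numLayers * bytesPerTexel;

    printf("Resident pages : %d\n", residentPages);
    printf("Physical cache : %.0f MB\n", cacheBytes / (1024.0 * 1024.0));
    // With numbers in this ballpark the resident texture working set is a few
    // tens of MB; fast streaming (SSD) matters more than raw RAM capacity for
    // texture detail, and the rest of the 2 GB goes to geometry, audio, game state, etc.
    return 0;
}
```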