NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Don't kid yourself. Every tech company is loaded with those people. It's just a matter of degrees (bad pun intended). bkillian didn't seem to be happy they'd gone with more of a bottom-line mentality, but it isn't as if you have a bunch of business guys engineering the software and hardware.

Yes, I was referring to the people leading the overall direction of the current team rather than suggesting the actual personnel are business guys engineering the software and hardware.

Of course you wouldn't have the business guys actually engineering the stuff, but they do have a huge influence on the design choices.
 
I have been the most stoic resister of the notion of massive OS reservations. However, if all the rumours and noises are pointing that way, it's time to reconsider rather than indulge in denial. What exactly is a console OS? I mentioned Orbis's significant 512 MB OS footprint, and if that includes all the features of the PS3's optional functions that consumed significant MBs (things like the Friend List taking 20 MB, Voice Chat taking 20 MB, all on top of the OS reservation), it's not such a huge chunk relative to PS3.

So what could 3 GB be for? Maybe there's a 1 GB HDD cache, handled by the OS? Maybe enough for a web page for in-game guides? Something will be taken up by Kinect and voice recognition; IIRC those are more data driven than processing driven.

In short though, I think it's time to remind ourselves that these are consoles and the OS will be console-centric, serving needs that we don't associate with PCs and smart devices.

You don't think a Windows 8 variant (already similar in look and feel to the Xbox Dashboard, built to be navigated with Kinect and voice commands) could be a possibility? 3 GB is a lot of memory. There'd have to be something pretty significant running to make use of it.

A 1 GB HDD cache would be interesting. It could be a boost to games that require efficient streaming from disc and could mitigate the need for installs. A web browser service that could be integrated into the game is also a neat idea: enter a menu option that serves web content, like videos etc. How much memory did Kinect eat up on the 360? I imagine those numbers will go up quite a bit with higher resolutions, higher camera framerates, and maybe a stereo camera. I thought Kinect used more processing resources than it did memory?
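For what it's worth, here's a minimal sketch of what an OS-managed read cache for disc streaming could look like. Nothing here is based on actual Durango software; the block size, the 1 GB budget and the read_from_disc callback are all made-up placeholders, just to show the shape of the idea.

from collections import OrderedDict

BLOCK = 1 << 20          # hypothetical 1 MiB cache granularity
CACHE_BUDGET = 1 << 30   # hypothetical 1 GiB OS-reserved cache

class DiscCache:
    """Least-recently-used cache of disc blocks kept in the OS reservation."""
    def __init__(self, budget=CACHE_BUDGET):
        self.budget = budget
        self.blocks = OrderedDict()  # (path, block index) -> cached bytes

    def read(self, path, offset, read_from_disc):
        key = (path, offset // BLOCK)
        if key in self.blocks:                  # cache hit: no optical-drive seek
            self.blocks.move_to_end(key)
            return self.blocks[key]
        data = read_from_disc(path, key[1] * BLOCK, BLOCK)  # miss: slow disc read
        self.blocks[key] = data
        while len(self.blocks) * BLOCK > self.budget:       # evict oldest blocks
            self.blocks.popitem(last=False)
        return data

The point being that the OS, not the game, would own the eviction policy, which is why that budget would have to come out of the OS reservation rather than the game's pool.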
 
Yes, I was referring to the people leading the overall direction of the current team rather than suggesting the actual personnel are business guys engineering the software and hardware.

Of course you wouldn't have the business guys actually engineering the stuff, but they do have a huge influence on the design choices.

Sometimes the worst thing you can do is let engineers come up with the overall concept/strategy for a product.
 
With regards to a rumored 3 GB OS reservation...

Think about it another way. If this is meant to be the center of your media room experience then there will be many things that will need large persistent memory pools.

- Fast task switching and/or multitasking of background applications: media streaming, media recording, social feed updates, e-mail, etc. This is something the 32 MB OS on the X360 is unable to do. The only thing that remains in memory while in a game is a very light hypervisor; you cannot switch to the main UI from within a game without unloading the game, because there isn't enough memory for it. Hence you have a relatively long pause as the game is unloaded and the main UI is loaded. That won't exist on a system with 3 GB of memory reserved for the OS. You will potentially be able to switch from your game (without unloading it) to your main UI, browser, email, social feed, whatever, at any time, nearly instantly. Something you currently cannot do while playing a game.

- Speaking of internet browsing. This can easily eat up far more than 512 megs of memory assuming they include a properly tabbed browser able to open multiple web pages. It'd be silly not to. And if they want to be the center of the living room, this is going to be something they will HAVE to have.

- Speaking of all the above: it may seem superfluous to some. But again, as cheap as DDR3 is, why not have enough memory to let the machine offer features found in competing devices that aren't gaming consoles, SmartTVs and GoogleTV for example? Why not offer that functionality, since it doesn't impact gameplay in the slightest? Any hardware you have available for gaming will be more than enough to handle those tasks, so the only increased cost is potentially on the software development side, and that is Microsoft's biggest strength. Having 4 GB versus 8 GB of DDR3 at the bulk purchasing prices MS is able to negotiate is probably a difference on the order of 10 dollars or so.

Basically it comes down to, "Why not?" The benefits are potentially huge. The drawback is basically nothing. ~10 dollars or so per console? Which probably still ends up being far cheaper than GDDR5.

But then why not GDDR5? Well, cost is pretty obvious. Less obvious is that while it has higher data throughput, it also has significantly higher latency. Said latency is easy to hide for graphics rendering, which is why it is used for high-end GPUs. But that latency is very bad for CPU tasks, as CPUs are not good at hiding latency; low latency is best. It is the main reason that DDR was generally faster in computing tasks than Rambus' RDRAM during the Pentium 4 era.
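A rough way to picture the hide-versus-reduce distinction is Little's law: the data that must be in flight to sustain a given bandwidth equals bandwidth times latency. The numbers below are purely illustrative assumptions on my part, not any console's real specs.

# Outstanding 64-byte memory requests needed to keep a bus busy.
# (GB/s times ns conveniently works out to plain bytes.)
def outstanding_lines(bandwidth_gbs, latency_ns, line_bytes=64):
    return bandwidth_gbs * latency_ns / line_bytes

# A GPU spreads this across thousands of threads, so hundreds of misses
# in flight are business as usual...
print(outstanding_lines(176.0, 300))   # ~825 lines
# ...while a CPU core can only track a few tens of misses before it stalls,
# so extra latency translates almost directly into lost performance.
print(outstanding_lines(68.0, 100))    # ~106 lines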

So, go all GDDR5 and you end up with high data throughput combined with high latency. Or potentially faster GPU performance for potentially slower CPU performance. While you have the opposite situation with DDR3.

The best would be a combination of DDR3 for the CPU and GDDR5 for the GPU, but that means a split memory pool and higher costs. Would the benefit be enough?

Let's look at that another way. The GPU appears to be somewhere between a 7770 and a 7850. A 7770 has a bandwidth of 72 GB/s while the 7850 has 153.6 GB/s.

Going by main memory alone, it would appear that it might not be enough. The question then becomes: does the DME combined with the 32 MB of ESRAM ameliorate the potential bandwidth discrepancy? If it does, then they have potentially maintained graphical performance while not sacrificing CPU performance, and while also saving a fair bit of cash.

Of course, with regards to cost, is that then negated by the DME and ESRAM? Possibly. But not if those also bring additional benefits. And not if the combined cost of those two elements is less than the cost premium of GDDR5.

Regards,
SB
 
So, go all GDDR5 and you end up with high data throughput combined with high latency. Or potentially faster GPU performance for potentially slower CPU performance. While you have the opposite situation with DDR3.


Not quite right; in fact most video cards of 2011 and 2012 used GDDR5, and if the memory had such high latency they would not use it. XDR and GDDR5 are different designs.

Cell had XDR and it worked well; the bottleneck was having a split pool. Using 512 MB of XDR wasn't good business for Sony, it was too expensive, and using DDR3 for the whole memory would have crippled Cell performance.

XDR actually helped the PS3; DDR3 would not have delivered. Hell, the 360 relies on eDRAM to fix its bandwidth problems.
 
You don't think a Windows 8 variant (already similar in look and feel to the Xbox Dashboard, built to be navigated with Kinect and voice commands) could be a possibility? 3 GB is a lot of memory. There'd have to be something pretty significant running to make use of it.
Win8 doesn't need 3 GB. 3 GB in a Win8 PC means multiple gigs of applications and data, like photo editing, and I can't see those being uses on Durango. I can see a large amount for a web browser, and as Silent_Buddha says, there's something to "just put more in; it's cheap enough!" (we see the same in the Wii U). On that bus, games can't access all of it every frame, so having more available to the devs won't yield any gains.

That's why I think we should look for a few big-ticket items that can easily gobble up lots of RAM. A persistent browser is one. We may also have some system-level functions like display: the front buffer and system UI could live in this reserved OS RAM (same for Orbis).
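Just to put the display surfaces in perspective, a quick back-of-the-envelope (the pixel format and buffer counts are guesses on my part, not known Durango details):

# Bytes for OS-owned display surfaces: width * height * bytes per pixel * buffers.
def surface_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(surface_mb(1920, 1080))             # ~15.8 MB, double-buffered 1080p output
print(surface_mb(1920, 1080, buffers=3))  # ~23.7 MB, triple-buffered

So display surfaces alone are a rounding error next to 3 GB, which is exactly why the big-ticket items like a persistent browser matter more.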
 
Looking at it from another perspective.

Microsoft is probably expecting the PS4 to use a maximum of 4 GB of RAM.
They know they're using DDR3, and that does mean they can't use THAT much bandwidth per frame, but because DDR3 is cheap they went ahead and got 8 GB of it anyway.

Both of these basically tell Microsoft that there is probably no f***ing way that devs will fill up anything past the 5 GB point with anything really meaningful other than cache.

We're looking at 68 GB/s here. Let's target 30 fps, which gives us the largest bandwidth per frame at a reasonable framerate.
That gives us a mere 2.28 GB/frame, assuming perfect efficiency.
5 GB is more than TWICE that.
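Spelling that arithmetic out (using the rumoured 68 GB/s figure, and remembering that real games never get anywhere near perfect efficiency):

# Upper bound on how much memory a game can even touch in one frame.
def gb_per_frame(bandwidth_gbs, fps):
    return bandwidth_gbs / fps

print(gb_per_frame(68.26, 30))   # ~2.28 GB per frame at 30 fps
print(gb_per_frame(68.26, 60))   # ~1.14 GB per frame at 60 fps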

If I were a Microsoft engineer faced with such a situation, I would surely say: yeah, go ahead and reserve 3 GB for the OS and all the other stuff we might cram in later.

What's the difference for the game department if they reserve 3 GB or 1 GB?
You'd still be looking at a bandwidth of 2.28 GB/frame, with total memory of 5 GB versus 7 GB.
Large difference? Hmm.
Reserve 3 GB? Go for it.

Jumping to conclusions at its finest...

First, there is no way an OS will allocate 3 GB on a console WHILE gaming.
Second, the real world:

Durango:
To render the current frame, access the first 2.3 GB in memory... done.
[meanwhile, loading assets into the rest of RAM]
.
.
until a scene change
.
To render a completely different frame, access the SECOND 2.3 GB in memory... done.
[meanwhile, reloading assets into the first 2.3 GB of RAM]

Go to start.

Orbis:
To render the current frame, access the first 3.5 GB in memory... done.
[RAM busy, no more space to load additional assets]
.
.
until a scene change
.
To render a completely different frame, access 3.5 GB in memory... NOT done.
[PAIN: load assets FROM BLU-RAY, slow... slow... slow]

And this happens every time the frame needs fresh assets and data.



Conclusion: yes, 8 GB is "useless" because you can only access 2.3 GB per frame.
And we are talking about loading from Blu-ray; you can also use the extra memory for procedural texture generation, for data that isn't bound to the graphics frame, etc.
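A minimal sketch of the ping-pong streaming the walkthrough above is getting at: render out of one resident pool while the other is filled in the background, then flip at a scene change. Every name here is hypothetical; this is just the shape of the idea, not anyone's actual engine code.

class PingPongAssets:
    def __init__(self):
        self.pools = [dict(), dict()]  # two ~2.3 GB asset pools
        self.active = 0                # index of the pool the renderer reads from

    def render_frame(self, draw):
        draw(self.pools[self.active])  # a frame only ever touches the active pool

    def stream_next_scene(self, load_asset, manifest):
        back = self.pools[1 - self.active]       # fill the idle pool off the critical path
        for name in manifest:
            if name not in back:
                back[name] = load_asset(name)    # slow Blu-ray/HDD read, hidden behind gameplay

    def scene_change(self):
        self.active = 1 - self.active            # flip: the next scene is already resident
        self.pools[1 - self.active].clear()      # old pool becomes the new streaming target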

How could you think that the engineers at Microsoft are so stupid as to put in 8 GB, then realize it's useless, and so waste 3 GB on the OS? (Maybe Windows Vista would allocate 3 GB, not a lean kernel for a console.)
 
Don't kid yourself. Every tech company is loaded with those people. It's just a matter of degrees (bad pun intended). bkillian didn't seem to be happy they'd gone with more of a bottom-line mentality, but it isn't as if you have a bunch of business guys engineering the software and hardware.
You are correct. They just set the direction, required features, and budget. Then the folks who want to make it awesome have to fit their awesomeness into those limitations. This happens in every company; the PS3, for instance, does not have dual-HDMI out and a 2-port gigabit ethernet hub, despite those features actually being announced. Why not? Someone realized their awesomeness would not fit in the budget.

At the start of the XBox and XBox 360 projects, you had J Allard, Peter Moore, and Robbie Bach setting those features and budgets. Budgets that were considered "strategic" (Meaning losses were irrelevant as long as marketshare was gained). Now you have Mark Whitten, Ben Kilgore and David Treadwell. All good people, but none of them visionaries. Working with a company directive that IEB must now be, and remain, profitable. And _grow_ profit from year to year.

In meetings, I got a similar feel from them as I got from the Kin leadership at times. I'd ask a question and get a handwavey response. "What about the growing trend of data caps?" I'd ask. (Answer: "These are not the droids you're looking for, move along" - paraphrased by me). "What about the large percentage of xbox users who have never connected their XBox to a network?", I asked. Hand wave... (I'm changing the questions from what I actually asked so as not to give info I shouldn't, and the questions were not necessarily related to future platforms - some of them have launched.)

So sure, the engineering guys are going to do their damndest to give you the absolute best they can come up with given their budget and priorities, but those budgets and priorities are not the same as they used to be in the XBox division.
 
We're looking at 68 GB/s here. Let's target 30 fps, which gives us the largest bandwidth per frame at a reasonable framerate.
That gives us a mere 2.28 GB/frame, assuming perfect efficiency.
5 GB is more than TWICE that.

You are assuming that a program refers to the same chunk of memory most of the time... it doesn't.

It'd be better for Microsoft to put in 16 GB :p
 
You are correct. They just set the direction, required features, and budget. Then the folks who want to make it awesome have to fit their awesomeness into those limitations. This happens in every company; the PS3, for instance, does not have dual-HDMI out and a 2-port gigabit ethernet hub, despite those features actually being announced. Why not? Someone realized their awesomeness would not fit in the budget.

At the start of the XBox and XBox 360 projects, you had J Allard, Peter Moore, and Robbie Bach setting those features and budgets. Budgets that were considered "strategic" (Meaning losses were irrelevant as long as marketshare was gained). Now you have Mark Whitten, Ben Kilgore and David Treadwell. All good people, but none of them visionaries. Working with a company directive that IEB must now be, and remain, profitable. And _grow_ profit from year to year.

In meetings, I got a similar feel from them as I got from the Kin leadership at times. I'd ask a question and get a handwavey response. "What about the growing trend of data caps?" I'd ask. (Answer: "These are not the droids you're looking for, move along" - paraphrased by me). "What about the large percentage of xbox users who have never connected their XBox to a network?", I asked. Hand wave... (I'm changing the questions from what I actually asked so as not to give info I shouldn't, and the questions were not necessarily related to future platforms - some of them have launched.)

So sure, the engineering guys are going to do their damndest to give you the absolute best they can come up with given their budget and priorities, but those budgets and priorities are not the same as they used to be in the XBox division.

Are those budgets rigid? Rigid in the sense of: what happens when the engineering guys come up with a significant strategic improvement which would slightly exceed the budget... is there any chance for them to argue and discuss? Maybe this happened with the Wuu...
 
Not quite right; in fact most video cards of 2011 and 2012 used GDDR5, and if the memory had such high latency they would not use it. XDR and GDDR5 are different designs.

GPUs are built around being able to hide latency in the hundreds of cycles. CPUs are built to deal with latency in the tens of cycles, and single digits where possible, hence the emphasis on low-latency L1/L2 caches.

GPUs require bandwidth and their highly parallel nature allows them to hide the latency that comes with it. They are designed with an eye towards hiding latency rather than lowering it.

CPUs, on the other hand, are much more reliant on low latency. Much of a CPU is designed around reducing the latency of memory access.

In other words, you'll never see a computer use GDDR of any sort for main memory because the latency is far too high.

So, coming back around to Durango. 3 GB reserved for the OS can really only mean one thing. There is an expectation that the console will run applications. Perhaps many of them. Perhaps even applications normally only found on a desktop computer. Hence CPU performance is going to be more critical than it was for X360 and PS3. DDR3 makes sense in that case where a large memory pool combined with low latency is beneficial.

The drawback, of course, is that DDR3 alone is unlikely to have enough bandwidth to drive high-resolution 3D gaming, hence the specialized hardware to assist (ESRAM and DME), and the rumored combined memory bandwidth of 170 GB/s, which in theory is more than enough for the GPU included (the 7870 has only 153.6 GB/s). Of course that assumes things work out the way the AMD and/or MS engineers have predicted, and we'll only really see that when games finally come out.
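One caveat worth making explicit: a "combined" figure like 170 GB/s is only reachable if the traffic actually splits so both pools stay busy. A toy model of that, using the rumoured 68.3 GB/s and 102.4 GB/s numbers (neither confirmed):

DDR3_GBS = 68.3     # rumoured main memory bandwidth
ESRAM_GBS = 102.4   # rumoured ESRAM bandwidth

def effective_bandwidth(esram_fraction):
    """Peak throughput if a given fraction of all memory traffic hits ESRAM."""
    if esram_fraction <= 0:
        return DDR3_GBS
    if esram_fraction >= 1:
        return ESRAM_GBS
    return min(ESRAM_GBS / esram_fraction, DDR3_GBS / (1 - esram_fraction))

print(effective_bandwidth(0.6))  # ~170 GB/s when render targets soak up most of the traffic
print(effective_bandwidth(0.2))  # ~85 GB/s if most traffic still lands in DDR3

So hitting the headline number depends on the hottest data fitting in those 32 MB, which is presumably part of what the DMEs are there to help with.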

Now, will this all play out well? As a game console it just has to be close enough that there aren't huge discrepancies in game IQ and performance. As the PS3/X360 era has shown us, people aren't terribly moved to switch from one to the other even when the graphics quality is highly divergent (Bayonetta, early COD, etc.). It just has to be close enough. And close enough is farther apart than what forum warriors think it is.

When it comes to the living room, however, as a media portal it'll potentially be difficult for the PS4 to compete. So let's assume it wins slightly in overall gaming IQ. Compare the IQ of some of the better AAA PC games at medium, high, and ultra to see how small the difference can be to your average consumer. Between high and ultra the difference is hard to see. Between medium and high it can be difficult for some to see, depending on the game, unless they know what to look for and/or see them side by side. Now let's say it loses in media portal robustness and functionality. It's impossible to say which will win out.

Now compare it to those other devices that this will potentially compete with: GoogleTV, SmartTVs (indirectly), and the whole host of set-top boxes designed around Android, Linux, and Windows.

And if we go by the few developer comments that have leaked out. With regards to gaming performance, they are indeed quite close. But ultimately we'll have to wait and see what the games are like. Not just within the first year, but 5 years down the road. If anything it's at that 5 year mark when PS4 may or may not be able to show up the next Xbox in a noticeable way.

Regards,
SB
 
Then again, back when the budget did not matter, it was about a fight for gaining market share. Now that goal has been reached, a lot of things aren't as important as they were 7 years ago. It does make a kind of sense...
 
Jumping to conclusions at its finest...

First, there is no way an OS will allocate 3 GB on a console WHILE gaming.
Second, the real world:

Durango:
To render the current frame, access the first 2.3 GB in memory... done.
[meanwhile, loading assets into the rest of RAM]
.
.
until a scene change
.
To render a completely different frame, access the SECOND 2.3 GB in memory... done.
[meanwhile, reloading assets into the first 2.3 GB of RAM]

Go to start.

Orbis:
To render the current frame, access the first 3.5 GB in memory... done.
[RAM busy, no more space to load additional assets]
.
.
until a scene change
.
To render a completely different frame, access 3.5 GB in memory... NOT done.
[PAIN: load assets FROM BLU-RAY, slow... slow... slow]

And this happens every time the frame needs fresh assets and data.



Conclusion: yes, 8 GB is "useless" because you can only access 2.3 GB per frame.
And we are talking about loading from Blu-ray; you can also use the extra memory for procedural texture generation, for data that isn't bound to the graphics frame, etc.

How could you think that the engineers at Microsoft are so stupid as to put in 8 GB, then realize it's useless, and so waste 3 GB on the OS? (Maybe Windows Vista would allocate 3 GB, not a lean kernel for a console.)

I was comparing reserving 5 GB versus, say, 7 GB for games on Durango, i.e. 3 GB versus 1 GB reserved for OS+alpha, to suggest that reserving 3 GB might not be as idiotic and grossly inflated as some may think.

Even with a complete scene change, 5 GB of memory could easily fit two entire 2.3 GB sets of data. And 2.3 GB is the perfect-efficiency scenario; in reality it would be quite a bit smaller than that, even more so if the game isn't capped at 30 fps.

Why put in 8GB then reserve 3GB?

I'd suggest it's because it's cheap to go to 8 GB, and they think they can reserve a shitload of RAM without causing substantial problems on the gaming side. 5 GB is already more than any competitor's offering.



Putting it another way.
Why NOT reserve 3GB?
Does anybody here think that 7GB instead of 5GB left for games on the Durango will make a noticeable difference?

It may be apples to oranges, but the basic idea shouldn't be too far off.
The 360 had 22.4 GB/s, so max memory bandwidth per frame @ 30 fps = 746 MB/frame, and the 360 had only around 480 MB.
If we look at the ratios of memory available to max memory bandwidth per frame, we see this:

360: 0.643
Durango: 2.19 @ 5 GB, 3.07 @ 7 GB

From a bandwidth per frame standpoint Durango has more than 3 times the memory pool of the 360.

Assuming that the unified RAM is in fact 8GB, memory size is probably the least of Microsoft's worries.

Not that this is unprecedented from a ratio perspective: the Wii U reserved a whopping 50%, whereas 3 GB is only 37.5%.
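For anyone who wants to check the ratios above, the arithmetic is just memory divided by per-frame bandwidth (same rumoured Durango bandwidth figure as before):

def memory_to_frame_bandwidth_ratio(memory_gb, bandwidth_gbs, fps=30):
    return memory_gb / (bandwidth_gbs / fps)

print(memory_to_frame_bandwidth_ratio(0.48, 22.4))   # X360: ~0.64
print(memory_to_frame_bandwidth_ratio(5.0, 68.26))   # Durango with 5 GB for games: ~2.2
print(memory_to_frame_bandwidth_ratio(7.0, 68.26))   # Durango with 7 GB for games: ~3.1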
 
GPUs are built around being able to hide latency in the hundreds of cycles. CPUs are built to deal with latency in the tens of cycles, and single digits where possible, hence the emphasis on low-latency L1/L2 caches.

GPUs require bandwidth and their highly parallel nature allows them to hide the latency that comes with it. They are designed with an eye towards hiding latency rather than lowering it.

CPUs, on the other hand, are much more reliant on low latency. Much of a CPU is designed around reducing the latency of memory access.

In other words, you'll never see a computer use GDDR of any sort for main memory because the latency is far too high.

So, coming back around to Durango. 3 GB reserved for the OS can really only mean one thing. There is an expectation that the console will run applications. Perhaps many of them. Perhaps even applications normally only found on a desktop computer. Hence CPU performance is going to be more critical than it was for X360 and PS3. DDR3 makes sense in that case where a large memory pool combined with low latency is beneficial.


I know that, but you are comparing two very different designs and expecting that latency will be a drawback for GDDR5.

These systems are not your average PC; they don't communicate over a PCIe port. In fact, the GPU and CPU are on the same chip, which will reduce latency and allow faster communication between the CPU and GPU.

It's like the theory of the Xbox 720 being 100% efficient, so it will stay close to the PS4 in performance even against a 50% power advantage. The theory sounds great, but it's based on the PS4 not being very efficient, which no one knows.
 
It's like the theory of the Xbox 720 being 100% efficient, so it will stay close to the PS4 in performance even against a 50% power advantage. The theory sounds great, but it's based on the PS4 not being very efficient, which no one knows.

So it all started with the Orbis GPU having 50% more flops (which does not mean the GPU is 50% more powerful), and now we have Orbis being 50% more powerful?

Source?
 
I know that, but you are comparing two very different designs and expecting that latency will be a drawback for GDDR5.

These systems are not your average PC; they don't communicate over a PCIe port. In fact, the GPU and CPU are on the same chip, which will reduce latency and allow faster communication between the CPU and GPU.

It's like the theory of the Xbox 720 being 100% efficient, so it will stay close to the PS4 in performance even against a 50% power advantage. The theory sounds great, but it's based on the PS4 not being very efficient, which no one knows.

What? PCIe has nothing to do with how quickly the GPU can access data in its memory pool. We're not talking about access to main system memory here, we're talking GPU <-> GDDR5 or CPU <-> DDR3. In the case of consoles that will be GPU/CPU <-> memory pool. At no point are we dealing with the PCIe bus.

Regards,
SB
 
You are correct. They just set the direction, required features, and budget. Then the folks who want to make it awesome have to fit their awesomeness into those limitations. This happens in every company; the PS3, for instance, does not have dual-HDMI out and a 2-port gigabit ethernet hub, despite those features actually being announced. Why not? Someone realized their awesomeness would not fit in the budget.

At the start of the XBox and XBox 360 projects, you had J Allard, Peter Moore, and Robbie Bach setting those features and budgets. Budgets that were considered "strategic" (Meaning losses were irrelevant as long as marketshare was gained). Now you have Mark Whitten, Ben Kilgore and David Treadwell. All good people, but none of them visionaries. Working with a company directive that IEB must now be, and remain, profitable. And _grow_ profit from year to year.

In meetings, I got a similar feel from them as I got from the Kin leadership at times. I'd ask a question and get a handwavey response. "What about the growing trend of data caps?" I'd ask. (Answer: "These are not the droids you're looking for, move along" - paraphrased by me). "What about the large percentage of xbox users who have never connected their XBox to a network?", I asked. Hand wave... (I'm changing the questions from what I actually asked so as not to give info I shouldn't, and the questions were not necessarily related to future platforms - some of them have launched.)

So sure, the engineering guys are going to do their damndest to give you the absolute best they can come up with given their budget and priorities, but those budgets and priorities are not the same as they used to be in the XBox division.
It's become a rare thing to read such an insightful post on B3D (or any forum, for that matter). I guess you can't have it all: power, price, and versatility. But I'm not sure they understood how much hardcore gaming actually meant to the 360, and that the only reason they are still in the gaming business is a great engineering job; if they did, they would maybe stretch their budget a bit.

Still, for the tech freaks and "hardcore" gamers here, this is kind of disappointing to read. Apple/Google obviously got too close to MS's comfort zone, and they feel the need to really take over your living room this time. It's going to be an interesting matchup, but I feel they won't be able to overcome the hardware deficit.
 
Honestly, the console business is pretty stupid. You sell at a huge loss for years. Whether we like it or not, these companies have a responsibility to their shareholders. You take a big, big gamble when you launch one of these things, and you expect to lose money on them for the first couple of years. It doesn't make sense to continue doing that; staying cutting edge is just far too expensive. Even with all the additional revenue streams they've added in terms of online content, advertising and online service fees (Xbox Live), they still only get a couple of years of big profits. This change in thinking was inevitable. As a gamer it's disappointing, but in this business you can go from first to "out of business" pretty quickly. Look at the huge swing in market share from the Xbox/PS2 generation to the 360/PS3 generation. The 360 gained a lot of share, but could just as easily lose it. If you lose that market share while you're selling the box at a loss, you are royally f'ed.

Whether MS has chosen the right path remains to be seen, but I imagine the thinking was along the lines of making the project profitable in a way that didn't rely on them being neck and neck with Sony or outselling them. I'm sure they hope to maintain or increase their market share, and probably believe they can do that, but believe they can remain profitable if they don't.
 
It's become a rare thing to read such an insightful post on B3D (or any forum, for that matter). I guess you can't have it all: power, price, and versatility. But I'm not sure they understood how much hardcore gaming actually meant to the 360, and that the only reason they are still in the gaming business is a great engineering job; if they did, they would maybe stretch their budget a bit.

Define "a bit".
These are devices that sell tens of millions of units, so a tiny change in one part adds up.
How many gamers here want to be on the hook for a decision that could amount to tens of millions of dollars?
You can try to stretch, if you think it somehow will yield additional tens of millions of dollars in return, but this is in the face of competition, shifting economics, a top-heavy software sales model, and diminishing returns.
 