NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

http://www.edge-online.com/features/risky-business-the-next-xbox-versus-playstation-4/

If our sources are correct (and we’re confident they are), Microsoft has made the move that publishers and developers have been asking for. Microsoft’s next Xbox will do what Steam and the App Store have been doing for years, and very successfully, too – a download-first, one profile, one purchase, one storefront system. Overnight, it’ll stop GameStop and GAME from selling on games without a penny heading back to its publisher, let alone its creator.

What’s trickier for Microsoft is in explaining its decision when faced with Sony’s plans for the PlayStation 4. Walk into a game retailer (should you be able to find one by the time these consoles arrive) and the choice could be simple: PlayStation 4 is more powerful, and plays second-hand games. One can imagine how fruitful a call between Kaz Hirai and Don Mattrick might have been had they both agreed to take the same measures against second-hand sales.

Edge seems pretty confident about the different directions Sony and MS are taking. Not only is Durango (according to them) weaker, it also won't allow second-hand games to be played on it.
I'm truly baffled by the direction MS is going, and that's not just with the Xbox. It seems like they've taken cues from Apple, except the people driving this unifying Metro idea and the social integration of their products at MS headquarters don't have what it takes to hit a home run.

After the lukewarm product that was Windows RT, which was supposed to be their showcase of how to do a Windows tablet, and after the disappointing product and launch that was Windows 8, they could go and complete a "hat trick" with the next Xbox. The only problem is that when things get tough in the console space, there is no second year to release a new and improved product; you are in for a bumpy ride of at least six years.

If that's what happens, if they push Kinect and social integration at the expense of a better gaming platform, I really hope they fail and learn a hard lesson, just like Sony did this generation.

My post sounds like a YouTube comment, but since this is basically a garbage troll thread I reckon there is no need to delete it :p
 
At first, I was somewhat disappointed, as I expected something along the lines of on-the-fly, full-speed transcoding of JPEG XR and DXTC, which would effectively increase bandwidth, but now everything falls into place.

From the tiling abilities of the DMEs, it seems MS firmly believes in tiled renderers, which is supported by the relatively small size of the ESRAM (the Wii had 24MB IIRC, so something like 128MB would be more understandable). Together with multipass rendering, where the GPU iterates over the same set of data several times, it makes perfect sense.

With tiled multipass, the GPU would process one small chunk of data (supposedly fitting into ESRAM) several times over, every time using the full 102GB/s of bandwidth and probably also benefiting from low latency, though that really shouldn't matter much. Meanwhile, the DMEs would move the next chunk of data into ESRAM without disturbing the GPU. Then the DMEs would move the result out to be sent to the screen, while the GPU switched to processing the next, already loaded, chunk. Their ability to convert between tiled and linear memory layouts suggests that such behaviour would be easy to implement.

For example, let's say we have 4-pass rendering. Then processing a chunk of data from ESRAM would take about the same time it would take the DMEs to load that chunk. Actually, 2-pass would probably suffice, as the GPU also has to write its changes back, while the DMEs would mostly work in one direction. This is where the reduced latency comes into play.
The relatively enlarged L2 cache is probably a measure to further reduce latency a bit and to give the DMEs an opportunity to actually access the ESRAM - as the GPU is probably fast enough to use all of the memory bandwidth by itself unless the current working set is cached.
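
To make the ping-pong idea above concrete, here's a rough C sketch of what I have in mind. Everything in it is made up by me - dme_copy_async, gpu_run_passes, the 2MB tile size and the stub bodies are placeholders for the concept, not anything from an actual Durango SDK:

```c
/* Rough sketch of the double-buffered tile pipeline described above.
 * All names and sizes are hypothetical placeholders, not a real API:
 * a DME streams the next tile into ESRAM (and the previous result out)
 * while the GPU runs its passes over the tile already there. */
#include <stddef.h>
#include <string.h>

enum { TILE_BYTES = 2 * 1024 * 1024,  /* assumed tile size that fits in ESRAM     */
       NUM_PASSES = 4 };              /* multipass: GPU touches each tile 4 times */

/* placeholder stubs standing in for the DME and GPU */
static void dme_copy_async(void *dst, const void *src, size_t n) { memcpy(dst, src, n); }
static void dme_wait(void)                                       { /* wait for copies */ }
static void gpu_run_passes(void *esram_tile, int passes) { (void)esram_tile; (void)passes; }

static char esram_slot[2][TILE_BYTES];   /* two slots carved out of the 32MB ESRAM */

void render_frame(const char *dram_src, char *dram_dst, int tile_count)
{
    dme_copy_async(esram_slot[0], dram_src, TILE_BYTES);   /* prime the first tile */
    dme_wait();

    for (int t = 0; t < tile_count; ++t) {
        int cur = t & 1;

        /* DME loads tile t+1 while the GPU runs all of its passes over tile t,
         * so the GPU always has the full ESRAM bandwidth to itself.            */
        if (t + 1 < tile_count)
            dme_copy_async(esram_slot[cur ^ 1],
                           dram_src + (size_t)(t + 1) * TILE_BYTES, TILE_BYTES);

        gpu_run_passes(esram_slot[cur], NUM_PASSES);

        /* DME writes the finished tile back out to main RAM / scan-out */
        dme_copy_async(dram_dst + (size_t)t * TILE_BYTES, esram_slot[cur], TILE_BYTES);
        dme_wait();   /* make sure both transfers land before the slot is reused */
    }
}
```

The linear/tiled conversion the DMEs are supposed to handle would simply fold into those two copies.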

The DMEs don't do stuff that would be done by shaders on PS4. As there is no ESRAM in PS4, there is no point in having DMEs there. Linear-to-tiled conversion is perhaps still usable, though; it would reduce memory wait times in a tiled renderer, as far as I understand them. They could also be used to defragment RAM, but I don't think that's very necessary in a console.

Furthermore, it doesn't even look too bleak compared to PS4. Granted, it's less powerful, but probably a bit more efficient. In the worst case, PS4 would be 1080p60 and Xbox 720p30. Though IMO, just the switch to 30fps would be enough. Moreover, there are some advanced interpolation techniques using motion maps that could be used to make 60fps out of 30 with very minimal overhead. And now, let's see. PS4 has 1.5 times the raw GPU power. 30fps would require half the power of 60 (I know it doesn't scale that linearly, but it's just an example). So you'd only need the equivalent of 9 CUs instead of 18 for that, which would leave Durango with 3 of its 12 CUs free - and those could be used for physics or other GPGPU work.
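
Back-of-envelope, with those (admittedly naive) linear-scaling assumptions spelled out:

```c
/* Naive back-of-envelope for the paragraph above: CU counts are from the rumours,
 * and the "half the frame rate = half the CUs" scaling is deliberately crude. */
#include <stdio.h>

int main(void)
{
    const int orbis_cus = 18, durango_cus = 12;

    const int cus_for_30fps = orbis_cus * 30 / 60;   /* 18 CUs at 60fps -> ~9 at 30fps */
    printf("CU-equivalents needed at 30fps: %d\n", cus_for_30fps);               /* 9 */
    printf("Durango CUs left for GPGPU:     %d\n", durango_cus - cus_for_30fps); /* 3 */
    return 0;
}
```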

So, I suppose, first party would look better on PS4, though probably not by much, while multiplatform titles would be generally equal. Xbox first party would probably look about the same as multiplatform, though.
 
Nah, you just refuse to admit that a GPU with as many transistors as an HD 7770 today is a GPU that could fit into a mobile SoC in 4 years using 14nm.

Maybe you're only considering area, but getting this level of performance is hard! Bandwidth is a huge problem, so you have to account for interposer memory to match that 128-bit GDDR5. (Or use something like the ESRAM in Durango, plus DDR4, but this sort of thing is more amenable to a console than to a general-purpose device.)

Then power use tends to scale down, but not as much as you'd want.
With these caveats, I think you are still not far off, but you'd see that on hardware similar to the current Surface Pro (expensive, "only" 3-4 hours of battery life) or ultrabooks.
 
[...] Moreover, there are some advanced interpolation techniques using motion maps that could be used to make 60fps out of 30 with very minimal overhead. [...]

Best new poster we've had in a while.

Welcome to B3d!

PS. The part about making 60fps out of 30 with motion-map interpolation is very, very intriguing. I know TVs have motion smoothing, but they generally tend to add latency.

Can it be better on consoles?
 
At this point, I don't see any point in continuing the discussion. You're either purposely ignoring my points or shifting to different topics that get away from the original intent of the discussion. I stand by my initial assertions: Durango will not be less than a 120W system and there will be no Durango tablet.

What does Durango being over 120W have to do with a Durango tablet? The realities of the current console market don't mean the next generation will be stuck with those same realities.

If the rumors are true, we're looking at consoles basically packed with PC hardware. In the PC space the range of performance and TDP for hardware is rather large, and it still accommodates single-SKU titles pretty well. Maybe not well enough for console standards, but we aren't talking about thousands of different configurations when it comes to consoles.

A 4-core CPU / 6-CU GPU tablet based on Durango playing 720p titles should be very feasible, as AMD APU parts with similar configurations are already targeted at the tablet market.

The way it looks right now, it's very possible that in the future we may see a coalescing of the console and PC markets. An MS console game may never play on a Sony console or vice versa, but if the underlying hardware continues to converge, a PC able to play any console game may end up a reality.
 
Granted, it's less powerful, but probably a bit more efficient. In the worst case, PS4 would be 1080p60 and Xbox 720p30.

IMO this is a situation that Microsoft must avoid, at least for multiplatform games.
 
Maybe you're only considering area, but getting this level of performance is hard! Bandwidth is a huge problem, so you have to account for interposer memory to match that 128-bit GDDR5.

Which is why I said it could become possible for Durango (smaller GPU, lower bandwidth on main memory), but a lot less possible for Orbis.

My original point was that Orbis was made to be a better performer while Durango was made to be shrinkable. Microsoft has no handheld option at the moment, and this would be killing two birds with a single console.

Then power use tends to scale down, but not as much as you'd want.
With these caveats, I think you are still not far off, but you'd see that on hardware similar to the current Surface Pro (expensive, "only" 3-4 hours of battery life) or ultrabooks.

Yes, my "safe bet" would be to have a Durango Surface similar to a Surface Pro.
Expensive, yes. But it'd be a device that runs the exact same software as the home console lineup, and maybe even a full-fledged Windows 8.
IMO, there's a lot of value in such a device. Every gamer with a larger wallet would be tempted by it. Even with an autonomy of "just" 4 hours in gaming mode.
 
After the lukewarm product that was Windows RT, which was supposed to be their showcase of how to do a Windows tablet, and after the disappointing product and launch that was Windows 8, they could go and complete a "hat trick" with the next Xbox. The only problem is that when things get tough in the console space, there is no second year to release a new and improved product; you are in for a bumpy ride of at least six years.

If that's what happens, if they push Kinect and social integration at the expense of a better gaming platform, I really hope they fail and learn a hard lesson, just like Sony did this generation.

If they hit it out of the park with Wii-like performance by going Kinect-focused with low-cost hardware, they will be smiling from ear to ear. I think it'll be priced pretty low, with locked-in subscriptions to get people onto Live, most importantly.

On the other hand, if sales are muted they can release an Xbox 3.1 with two GPUs a year or two out and say, "this is the real Xbox 720". Kind of makes you wonder whether those rumors of them simply calling it "Xbox" aren't made with this kind of situation in mind. And forcing devs to code to a higher-level API (I guess simply some DX11 subset), as Edge reports, buys them forward compatibility for this case.
 
Best new poster we've had in a while.

Welcome to B3d!

PS. The part about making 60fps out of 30 with motion-map interpolation is very, very intriguing. I know TVs have motion smoothing, but they generally tend to add latency.

Can it be better on consoles?

Thanks! :)

The interpolation would normally cause additional latency, but perhaps if a V-synced triple buffer of main frames is used, there would be no additional latency. Google "bidirectional reprojection"; the first result is even from Microsoft.
I don't have deep knowledge of modern 3D engines; most of this is picked up from sebbi's posts and subsequently googled.
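
Roughly, I mean something like this - a very stripped-down CPU version of motion-map interpolation at t = 0.5 between two frames. No occlusion handling, no filtering, nothing like the actual papers or a production implementation; it's just to show the principle:

```c
/* Simplified frame interpolation using a per-pixel motion map (frame A -> frame B).
 * Purely illustrative: real bidirectional reprojection handles occlusion,
 * disocclusion and filtering, none of which is done here. */
#include <stdint.h>

typedef struct { float x, y; } vec2;

static int clampi(int v, int lo, int hi) { return v < lo ? lo : (v > hi ? hi : v); }

/* Build the in-between frame at t = 0.5 from two rendered frames and the
 * per-pixel motion (in pixels) from frame A to frame B. Pixels are 0xRRGGBB. */
void interpolate_half(const uint32_t *frame_a, const uint32_t *frame_b,
                      const vec2 *motion_a_to_b, uint32_t *out, int w, int h)
{
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            vec2 mv = motion_a_to_b[y * w + x];

            /* walk half a motion vector back into A and forward into B */
            int ax = clampi(x - (int)(0.5f * mv.x), 0, w - 1);
            int ay = clampi(y - (int)(0.5f * mv.y), 0, h - 1);
            int bx = clampi(x + (int)(0.5f * mv.x), 0, w - 1);
            int by = clampi(y + (int)(0.5f * mv.y), 0, h - 1);

            uint32_t ca = frame_a[ay * w + ax];
            uint32_t cb = frame_b[by * w + bx];

            /* naive 50/50 blend per channel */
            uint32_t r = (((ca >> 16) & 0xff) + ((cb >> 16) & 0xff)) / 2;
            uint32_t g = (((ca >> 8)  & 0xff) + ((cb >> 8)  & 0xff)) / 2;
            uint32_t b = ((ca & 0xff) + (cb & 0xff)) / 2;
            out[y * w + x] = (r << 16) | (g << 8) | b;
        }
    }
}
```

Since the in-between frame needs both of its neighbours, it can only be shown once the later one is finished, which is exactly where the extra buffering (and the latency question) comes from.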

XpiderMX:
Some sacrifices are inevitable. Though 1080p30 + interpolation should suffice, IMHO.
 
The interpolation thing has already been done and shown by the SW:FU2 devs for testing. They could reproduce a 60fps-like effect from a 30fps game without the control-latency issues. It actually ran better on PS3, though, since they could code to the metal to better support their algorithms, while the 360 had some certification requirements preventing them from fine-tuning it as well.
 
Furthermore, it doesn't even look too bleak compared to PS4. Granted, it's less powerful, but probably a bit more efficient. In the worst case, PS4 would be 1080p60 and Xbox 720p30. Though IMO, just the switch to 30fps would be enough. Moreover, there are some advanced interpolation techniques using motion maps that could be used to make 60fps out of 30 with very minimal overhead. And now, let's see. PS4 has 1.5 times the raw GPU power.

From what we know, we have 14 CUs on Orbis for balanced graphics work and 4 for other tasks that could give only marginal help to graphics.

14 CUs vs 12 means +16%, and only on the GPU.

How can this 16% make Orbis four times faster than Durango (720p30 vs 1080p60)?
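
For reference, the raw pixel-rate gap behind that figure (just resolution times framerate, ignoring everything else) works out to about 4.5x:

```c
/* 1080p60 vs 720p30 in raw pixels per second - the source of the "~4x" claim. */
#include <stdio.h>

int main(void)
{
    const double hi = 1920.0 * 1080.0 * 60.0;   /* pixels per second at 1080p60 */
    const double lo = 1280.0 *  720.0 * 30.0;   /* pixels per second at 720p30  */
    printf("1080p60 / 720p30 pixel-rate ratio: %.2fx\n", hi / lo);   /* ~4.5x */
    return 0;
}
```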


So, I suppose, first party would look better on PS4, though probably not by much, while multiplatform titles would be generally equal. Xbox first party would probably look about the same as multiplatform, though.

This is more down to earth; it could be.
 
I decided to do some rough calculations to test out a theory.

According to VGLeaks, Orbis has 1.84 TF of GPU compute power and 0.1 TF (102.4 GF) of CPU compute power being serviced by 176 GB/s of memory bandwidth. Proportionally, if you take 176 GB/s divided by 1.94 TF (the total compute of GPU + CPU) you get about 90.7 GB/s per TF of compute.

Now, assuming Durango's CPU is the same as Orbis's, if you take Durango's 1.2 TF of GPU and add the 0.1 TF of CPU compute you get 1.3 TF of compute. To receive the same proportion of memory bandwidth to compute resources as Orbis would require 1.3 * 90.7, or about 117.9 GB/s of bandwidth.

Durango has about 170 GB/s of total bandwidth between the DDR3 and the ESRAM. Subtract 51.2 GB/s for the cost of the DMEs reading from one pool and writing to the other at their maximum speed of 25.6 GB/s each way, and you have..... 118.8 GB/s.
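
Or, spelling the same arithmetic out (all of these figures are the VGLeaks rumours, nothing official):

```c
/* Reproducing the back-of-envelope above with the rumoured VGLeaks numbers. */
#include <stdio.h>

int main(void)
{
    /* Orbis: 1.84 TF GPU + 0.1 TF CPU fed by 176 GB/s */
    double orbis_gbps_per_tf = 176.0 / (1.84 + 0.1);           /* ~90.7 GB/s per TF */

    /* Durango: 1.2 TF GPU + 0.1 TF CPU -> bandwidth needed for the same ratio */
    double durango_needed = (1.2 + 0.1) * orbis_gbps_per_tf;   /* ~117.9 GB/s */

    /* Durango total: ~170 GB/s across DDR3 + ESRAM, minus 2 x 25.6 GB/s
       for the DMEs reading from one pool and writing to the other. */
    double durango_left = 170.0 - 2.0 * 25.6;                  /* 118.8 GB/s */

    printf("GB/s per TF on Orbis:           %.1f\n", orbis_gbps_per_tf);
    printf("Durango needed for same ratio:  %.1f\n", durango_needed);
    printf("Durango left after DME traffic: %.1f\n", durango_left);
    return 0;
}
```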
 
From what we know, we have 14 CUs on Orbis for balanced graphics work and 4 for other tasks that could give only marginal help to graphics.

14 CUs vs 12 means +16%, and only on the GPU.

How can this 16% make Orbis four times faster than Durango (720p30 vs 1080p60)?
You are assuming that Durango can make up the difference of Orbis's +4 CUs without actually needing to use some of its own 12 CUs to do some of the stuff done on Orbis's +4 CUs.

Is that really likely, I mean really?

If you want to play that game, take anywhere between 2 and 4 CUs off the 12 and then do the math.
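
Doing that math for the sake of argument (the 2-4 reserved CUs are hypothetical, of course):

```c
/* If 2-4 of Durango's 12 CUs end up doing work that Orbis offloads to its
 * extra 4 CUs, the graphics-CU gap widens like this.                      */
#include <stdio.h>

int main(void)
{
    const double orbis_graphics_cus = 14.0;
    for (int reserved = 2; reserved <= 4; ++reserved) {
        double durango_graphics_cus = 12.0 - reserved;
        printf("Durango reserves %d CUs -> Orbis advantage: %.0f%%\n",
               reserved, (orbis_graphics_cus / durango_graphics_cus - 1.0) * 100.0);
    }
    return 0;   /* prints roughly 40%, 56% and 75% */
}
```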
 
IMO this is a situation that Microsoft must avoid, at least for multiplatform games.

I don't think it will be that bad. I'm still not convinced the average gamer notices 1080p over 720p, given the lack of complaints this gen about sub-HD games. I really don't think you're going to hear many complaints next generation outside of the "hardcore forum-going" gamers. Besides, if Durango is less powerful but development leads on it, you'll still see parity overall.

Of course this is all my opinion, with no factual data to corroborate any of it. I just don't think the difference between Durango and Orbis will be so large that MS is left hurting/struggling in any sense of the word.
 
Best new poster we've had in a while.

Welcome to B3d!

PS. The part about making 60fps out of 30 with motion-map interpolation is very, very intriguing. I know TVs have motion smoothing, but they generally tend to add latency.

Can it be better on consoles?

I don't think it can - you still need to buffer the next and the last frame to interpolate, right?
And I think people will also feel it in the games; a 60Hz game also controls better. And don't forget that TVs have a shitload of overhead too; I heard some even have 300ms in game mode :mad:

I think a big part of why CoD feels better to a lot of people is simply that it runs at 60Hz.
I'm pretty sure that CoD could rival BF3 in graphics if it dropped to 30fps, or at least look graphically more impressive to pixel junkies like most of us.

And I love my games to be responsive, and had hoped next gen would make 1080p@60fps the standard for gaming, now that I'm used to PC gaming.
 
From what we know, we have 14 CUs on Orbis for balanced graphics work and 4 for other tasks that could give only marginal help to graphics.

If the 14 + 4 CUs rumor is true, I suspect developers will continue to use the 4 CUs for graphics work. Even on PS3, the SPUs spend most of their time on graphics while also handling physics, AI, and other tasks that are now handled by dedicated hardware.

There should be plenty of graphics ideas and techniques they can try on these added compute units. Throwing all 18 at the same task may be overkill.
 