Middle Generation Console Upgrade Discussion [Scorpio, 4Pro]

Status
Not open for further replies.
In my opinion, static options should go, and flexible scaling on all options should be the target, with weighting as to what is most important:

Do I want 60fps at all times?
Do I want native res?
Do I want all the eye candy?

Forza seems to scale effects as well as resolution. It seems to me, as a layman, that if we can calculate the cost to render a scene, we must have some insight into the cost of effects, so we should be able to alter more than just resolution.

Scale from Xbox One to Scorpio and beyond, if we assume forward compatibility is now expected.
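A minimal sketch of the idea above: a per-frame budget drives both resolution and effect scaling, with the user's weighting deciding what gets cut first. All function names, thresholds, and quality tiers here are hypothetical, not any engine's actual code.

```python
# Hypothetical per-frame quality scaler: given a frame-time budget,
# scale resolution first (if framerate is weighted highest), then
# effect quality. Numbers and tiers are purely illustrative.

def scale_settings(frame_ms, budget_ms, res_scale, effects_level,
                   prefer_framerate=True):
    """Return adjusted (res_scale, effects_level) for the next frame."""
    if frame_ms > budget_ms:          # over budget: cut something
        if prefer_framerate and res_scale > 0.5:
            res_scale = max(0.5, res_scale - 0.1)   # drop render resolution
        elif effects_level > 0:
            effects_level -= 1                       # drop an effect tier
    elif frame_ms < 0.8 * budget_ms:  # comfortable headroom: restore quality
        if effects_level < 3:
            effects_level += 1
        elif res_scale < 1.0:
            res_scale = min(1.0, res_scale + 0.1)
    return res_scale, effects_level

# Example: a 20 ms frame against a 16.6 ms (60 fps) budget:
print(scale_settings(20.0, 16.6, 1.0, 3))  # resolution drops first
```

The point is simply that once you can measure frame cost, the same feedback loop that drives dynamic resolution can drive any other knob you give it.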

Easy to say when you're not the one developing or testing.
 
In my opinion, though, decent 4K adoption is some way off. Look at HD and Blu-ray, for one: we are just about getting to the point where there are multiple HD TVs in people's houses, and even Blu-ray isn't properly adopted. Some houses will have one or more Blu-ray drives, while others may have just one, never buy Blu-ray discs for it, and still buy DVDs.

It's good going forward, but going by the specs, Scorpio will just about manage 4K at roughly the medium settings of current Xbox One visuals. I personally think home consoles should get to the point of very-high or ultra settings at 1080p before we move up to 4K output. There's something to be said for planning ahead, but if the device is out for three, four or five years before we see proper 4K adoption, it will be due for replacement by then.
And miss out on VR in the console space, and on the niche 4K market.
As long as they're not selling at a loss, it achieves 4K or close to it at current console quality levels, and it's priced for people who want that level but don't want a PC, I don't see the issue.
It's a mid-gen console; aren't you arguing against mid-gen consoles in general rather than against Scorpio specifically?
As a mid-gen device it seems like it could tick most of the boxes to me.

As you said, it's 3-5 years until you think 4K hits mass adoption; by then it will be running at 1080p scaled to 4K, as it will then be the mainstream device. It's not being marketed as a mainstream device when it comes out; that is the XO slim.
 
Given the design goals I disagree. I also think that if the PS4 had ended up with 4GB, which was very possible, it would have been a very compromised machine overall, imo.
As I said, I'm in the minority and recognise that.

Edit: a better way to put it: if the XO had ended up with 4GB GDDR5, there's no way it could do all the things that it can and is planned to do.

I'm specifically looking at it in hindsight. I'm not faulting them for the decision to use ESRAM. But, ESRAM made the box weak and Kinect made it expensive. Put a toxic launch messaging cherry on top and you get the current result.

It is from this perspective that I think you have to consider the XBOne's ESRAM a bad thing.
 
In my opinion, though, decent 4K adoption is some way off. Look at HD and Blu-ray, for one: we are just about getting to the point where there are multiple HD TVs in people's houses, and even Blu-ray isn't properly adopted. Some houses will have one or more Blu-ray drives, while others may have just one, never buy Blu-ray discs for it, and still buy DVDs.

It's good going forward, but going by the specs, Scorpio will just about manage 4K at roughly the medium settings of current Xbox One visuals. I personally think home consoles should get to the point of very-high or ultra settings at 1080p before we move up to 4K output. There's something to be said for planning ahead, but if the device is out for three, four or five years before we see proper 4K adoption, it will be due for replacement by then.

Who cares about Blu-ray? You can get 1080p or higher content from Amazon, Netflix, Hulu and many other sources. People don't use discs much anymore.

You can get a 4K TV for a few hundred dollars more than a 1080p TV. People with aging 1080p TVs are going to start buying 4K TVs for the small premium, which will continue to shrink.

In 2017 there may not be a premium for 4K.
 
I'm specifically looking at it in hindsight. I'm not faulting them for the decision to use ESRAM. But, ESRAM made the box weak and Kinect made it expensive. Put a toxic launch messaging cherry on top and you get the current result.

It is from this perspective that I think you have to consider the XBOne's ESRAM a bad thing.
I'm not talking about Kinect or messaging; I was talking strictly about the design decision. So: that, or 4GB GDDR5. I think it was the right choice for them.
In hindsight... I still think it was the right decision, given what they knew during the development period.
If it was a machine that was developed and shipped over the course of a couple of months, then I might have a different opinion.

Like I said, I respect your opinion, even though I don't agree with it at all, haha
 
I'm not talking about Kinect or messaging; I was talking strictly about the design decision. So: that, or 4GB GDDR5. I think it was the right choice for them.
In hindsight... I still think it was the right decision, given what they knew during the development period.
If it was a machine that was developed and shipped over the course of a couple of months, then I might have a different opinion.

Like I said, I respect your opinion, even though I don't agree with it at all, haha

If you're restricting the knowledge to what they knew at the time it's not really hindsight. :p
 
MS wanted to make an always-on, always-silent device that had 8GB of memory and drew very little power while sitting around recording crap and spying on your family using Kinect.

That's completely at odds with the choices Sony made (that were entirely right for them). 256-bit GDDR5 was never an option given MS's design goals. Even hindsight doesn't change that.
 
If you're restricting the knowledge to what they knew at the time it's not really hindsight. :p
:D
My hindsight is what I know now: performance and visuals compare favourably, in my eyes, it's running Win10, etc. So as a design decision it was right, or at least not wrong.
 
"We can render at 60hz, we can render fully uncompressed pixels" - bald dude

As much crap as Chris received for this quote, I can tell you the guy is brilliant, and while he may not have chosen the best adjective, I believe he was referring to HDR support, where all current- and previous-gen games had to downsample HDR framebuffers to the RGB colour space for display. But as is always the case, people took a more literal interpretation of a blurb to discredit the info as phony marketing speak.
 
I think this is an pretty interesting part of that Eurogamer interview article with Phil Spencer:

"On a console to console experience, when we designed Scorpio and we said 4K console, we looked at games that are running at, let's say 1080p 60 on an Xbox One, and said we want that same game to be able to run at 4K 60 on a Scorpio. We looked at the design of the games we had on Xbox One today and said, if we increase the resolution and maintain the framerate we have, could we hit that?

I think framerate's more interesting than resolution in terms of competitive gaming, and we wanted to make sure teams were able to build the 4K version of their game at the same framerate they can hit, at whatever resolution: 900 or 1K or even 720 that they're hitting on this box. So, we thought specifically about that situation and talked to developers about it."

:mad:

So enough GPU to scale up the resolution without losing framerate, but no mention of a CPU that can hit higher framerates, even in VR.

... WiiPU.
 
My best guess for a 12 memory-chip system with 320 GB/s BW is currently:

12 x DDR4 2667 @ 64GB/s + 256 GB/s HBM2 = 12 chips on the mobo and 320 GB/s.

Basically, an X1 on steroids. DDR4 2667 will be mainstream next year, it won't need a stupid hot huge 384-bit GDDR5 bus, it avoids memory contention, Vega (that MS are supposedly waiting for) supports HBM2, and you don't dedicate too much of your SoC to either esram or external memory interface.
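Checking the arithmetic above: twelve DDR4-2667 chips at x16 each (the x16 width is my assumption) would form a 192-bit bus delivering roughly 64 GB/s, and one HBM2 stack adds the quoted 256 GB/s.

```python
# Sanity check of the proposed 12-chip memory layout.
# Figures come from the post; the x16 device width is assumed.

ddr4_mt_s = 2667          # mega-transfers per second per pin
bus_bits  = 12 * 16       # twelve x16 chips -> 192-bit bus
ddr4_gb_s = ddr4_mt_s * (bus_bits // 8) / 1000   # bytes per transfer x rate
hbm2_gb_s = 256           # one HBM2 stack, as quoted

total = ddr4_gb_s + hbm2_gb_s
print(round(ddr4_gb_s, 1), round(total))  # 64.0 320
```

So the split does land on 320 GB/s with exactly twelve discrete memory packages on the board.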

Time for a bold (hoho) statement:

Interface to 192-bit DDR4 + 1 stack of HBM1 should be less than half the area of a 384-bit GDDR5 interface, and deliver as much BW with far lower power expenditure (while also reducing contention and being more optimal for GPU access patterns).
 
Might have been missed in the other thread:

That was a very specific location to be showing for a representation of the internals, and I would guess that it's for the very specific reason of showing those things. It would be somewhat irregular to show that specific number of memory devices if the console has something different, because everyone knows that the second the press receives samples they will tear it down and compare those items.

 
And you know what, they probably will do it. I mean, if they're just going to take an Xbox One game and not change anything except the resolution, it will probably come pretty close to 4K on a machine that is 4.5x as powerful.

It takes much less than 4x the power to run a game at the same graphics settings at 4K as it does at 2K (1080p).

Compare the scores from the GTX 980 Ti at 4k to the GTX 680 at 2.5k. Yes it's only starting from 2.5k but you can project it down to 2k relatively easily enough with some room for error.

http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/5

For reference, the GTX 980 Ti is ~5.6 TFLOPs while the GTX 680 is ~3.1 TFLOPs.

That's less than 2x more powerful but achieves roughly the same performance at the same settings. So something less than 3x more powerful would be needed to go from 2k to 4k with the same settings using those architectures.
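The ratios in that comparison are easy to verify (TFLOP figures as cited above):

```python
# 4K is exactly 4x the pixels of 1080p, yet the cited cards cover a
# similar settings gap with well under 2x the raw compute.

pixels_4k   = 3840 * 2160
pixels_1080 = 1920 * 1080
print(pixels_4k / pixels_1080)              # 4.0 -> naive 4x cost

tflops_980ti = 5.6
tflops_680   = 3.1
print(round(tflops_980ti / tflops_680, 2))  # 1.81x in practice
```

In other words, a naive pixel count suggests 4x, but the benchmark comparison above shows roughly 1.8x covering most of the gap, which is where the "less than 3x" estimate comes from.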

But it's different architectures, etc. Well yes, but so is Polaris/Vega compared to the GCN 1.x iteration in the XBO. And even taking into account architectural improvements it's still not much more than 2x more powerful. Look at the scaling from GTX 680 to GTX 980 Ti at 2.5k res.

And yes, GTX 680 would scale a bit worse going from 2.5k to 4k, but that's unrelated to the TFLOPs. There are other considerations that affect 4k performance that are totally unrelated to TFLOPs. ROP count. Memory amount. Memory compression tech (reduces the bandwidth required is one benefit). Etc.

Keep in mind that the GTX 680 was designed with nothing related to 4K in mind, while the GTX 980 Ti at least gave some thought to what would be needed to render at 4K.

All we know about Project Scorpio is the TFLOP count and memory bandwidth. We know pretty much nothing else about it.

If all they wanted to do was the same graphics settings at 4K, they would only have needed about 4.6-4.8 TFLOPs to be comfortable, and then make other relevant changes to the chip to facilitate rendering at 4K.

Regards,
SB
 
You need to store all these uncompressed pixels somewhere :D
The guy gets flak for the statement, but he has to be talking about either the textures, or the game he's working on uses a compressed framebuffer currently (e.g. LUV/YCoCg). :???:
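For reference on the kind of framebuffer transform mentioned, here is a minimal sketch of the RGB-to-YCoCg conversion and its inverse. This is the standard textbook transform, not any particular engine's code; the round-trip is exact in floating point for the values shown.

```python
# RGB <-> YCoCg: separates luma (Y) from two chroma channels (Co, Cg),
# which lets chroma be stored at reduced precision or resolution.

def rgb_to_ycocg(r, g, b):
    y  =  r / 4 + g / 2 + b / 4
    co =  r / 2 - b / 2
    cg = -r / 4 + g / 2 - b / 4
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    r = y + co - cg
    g = y + cg
    b = y - co - cg
    return r, g, b

# The transform round-trips exactly (coefficients are powers of two):
print(ycocg_to_rgb(*rgb_to_ycocg(200, 120, 40)))  # (200.0, 120.0, 40.0)
```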
 

Clamsheeeeeeeeeeeeeeeeeeeeell!

 
The guy gets flak for the statement, but he has to be talking about either the textures, or the game he's working on uses a compressed framebuffer currently (e.g. LUV/YCoCg). :???:

Hey, at least we get to read misterXmedia super secret sauce conspiracy theories about uncompressed pixels for the next year and a half :yep2:
 
Who cares about Blu-ray? You can get 1080p or higher content from Amazon, Netflix, Hulu and many other sources. People don't use discs much anymore.

For the basic consumer buying the cheapest TV that has the latest buzzwords on the box, sure. For those who care about full-bitrate video and audio, streaming doesn't come close. It's this kind of thinking that will kill the market for high-capacity, high-bitrate media, and it really worries me.
 