Xbox One (Durango) Technical hardware investigation

Anyone think that MS has chosen to reveal the CPU clock at this stage because it knows for sure that the PS4 clocks are final??

50/50; still, even a 150 MHz bump doesn't close the gap a great deal. At least the lowest common denominator is now not quite as low... in terms of multiplatform development, that is 8)
 
Anyone think that MS has chosen to reveal the CPU clock at this stage because it knows for sure that the PS4 clocks are final??
Or they just believe it's too late for them to do anything about it...

Doesn't close the power gap?
CPU-wise it may have more power. :D
 
Anyone think that MS has chosen to reveal the CPU clock at this stage because it knows for sure that the PS4 clocks are final??

No, I think they're dribbling out these details that they think will be seen as good news in an effort to rehabilitate the Xbox One's image. They've been doing it for weeks, even formally reannouncing things that were originally announced months ago. Anything to create some positivity. It would probably be working better if they didn't keep having setbacks like cutting down the number of launch countries, cutting back the availability of voice controls, etc.
 
No, I think they're dribbling out these details that they think will be seen as good news in an effort to rehabilitate the Xbox One's image. They've been doing it for weeks, even formally reannouncing things that were originally announced months ago. Anything to create some positivity. It would probably be working better if they didn't keep having setbacks like cutting down the number of launch countries, cutting back the availability of voice controls, etc.

Yes, it's been like a sawtooth chart from a PR point of view.

But this late?
 
They think the dGPU is stacked on top of the original SoC
Lol, that's hilarious. How do these village idiots suggest a nuclear pile of heat like that would be cooled? Is there also a twin-stage phase-change compressor/radiator crammed into the xbone case? That would probably get the job done despite the thermal inefficiency of stacking RAM on top of two very hot dies, but such a setup might be difficult to get to run as whisper quiet as MS has claimed. ;) (Then again, MS also claimed the original launch 360 would run quiet, and we all know how true THAT was, so... STACKED DIES DOUBLE CONFIRMED!!! :LOL:)

Don't interpret this as supporting the dGPU thing, but that can actually be better argued as evidence for the dGPU.
Personally I would say not really, actually... I doubt it'd be worth the cost, complexity and inefficiency of adding in a dGPU that has only roughly the same performance as the APU's GPU.

Or they just believe it's too late for them to do anything about it...
Theoretically, Sony could bump clocks in a day-one/post-launch firmware update. They did it with the PSP, after all.
 
Anyone think that MS has chosen to reveal the CPU clock at this stage because it knows for sure that the PS4 clocks are final??

50/50; still, even a 150 MHz bump doesn't close the gap a great deal. At least the lowest common denominator is now not quite as low... in terms of multiplatform development, that is 8)

It's a 150 MHz bump in a totally different and important parameter though. The CPU is one of the two key blocks of a system, and CPU FLOPS are weighted far more heavily in importance.

That's why I had to laugh at gaffers spinning it as an "irrelevant 30 GFLOPS increase" or something. As if CPU power were additive to the GPU FLOPS. It's more like a ~10% increase in one half of the system.

Speaking of, what is the new GFLOPS figure for the CPU? 110 × 1.09? 120 GFLOPS?
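
Rough back-of-the-envelope sketch, assuming 8 Jaguar cores at 8 single-precision FLOPS per core per cycle (128-bit SIMD, no FMA) and a 1.6 GHz to 1.75 GHz bump; treat the exact figures as ballpark:

```python
# Back-of-the-envelope CPU GFLOPS for the clock bump (assumed Jaguar figures).
cores = 8
flops_per_cycle = 8        # assumed: 128-bit SIMD, 4-wide MUL + 4-wide ADD per cycle, no FMA
old_clock_ghz = 1.6
new_clock_ghz = 1.75

old_gflops = cores * flops_per_cycle * old_clock_ghz    # 102.4
new_gflops = cores * flops_per_cycle * new_clock_ghz    # 112.0
print(old_gflops, new_gflops, new_gflops / old_gflops)  # ~1.09x
```

So under those assumptions it's roughly 102 GFLOPS before and 112 GFLOPS after, a bit over a 9% bump on the CPU side.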

I think the clock upgrade only means they had enough reserves for the targeted 100W TDP.

BeyondTed thought they'd be able to clock higher because on-die memory accesses (to the ESRAM) take a lot less power than accesses to external RAM. Therefore he thought that within some constricted power envelope MS would be able to go a lot higher.

Don't know if 150/53 MHz really qualifies as a lot, and as Shifty said I don't know if we're on a scale where it matters, but it's a thought.
 
In The Division it's nothing but a fairly basic top-down view of the world, and a very basic one at that. Nothing in it even showed half of what a tablet is capable of running.

Last comment on this, I promise :) ...

I know that there are some here that believe it's the device that is rendering the surface, whereas I still firmly believe the Xbox One will do the rendering and stream to the device...

We'll probably never agree on this, BUT just for argument's sake I want to point to the VGLeaks post on "Smart Glass". I know it's a leak and not official, BUT it's pretty solid tech documentation that I believe has merit.


http://www.vgleaks.com/use-your-mobile-or-tablet-with-durango-xbox-companion/


Remote rendering & input

Unlike typical controllers, Companion features rich output through XAudio2 and a Direct3D surface. The title renders graphics and audio to the Companion just as it does to the main screen and audio system. This output is encoded as H.264 and transmitted over a Wi-Fi connection to the device, where it is decoded and displayed.
Companion captures touch, accelerometer, gyroscope, and text input and transmits them over a Wi-Fi connection to the title for processing.

I'm a dev that attends many Microsoft developer conferences, and the recent //build/ 2013 conference (4 months ago) went into Windows 8.1 and DirectX 11.2... I believe that remote rendering, low-latency/high-bandwidth APIs and PRTs are things that MS really wants to push over the next year+ on Windows, Xbox One and Windows Phone... That's my personal opinion of course!
 
50/50; still, even a 150 MHz bump doesn't close the gap a great deal. At least the lowest common denominator is now not quite as low... in terms of multiplatform development, that is 8)

Sure, so basically now the lowest common denominator with regards to...

GPU - Xbox One.
CPU - PS4.

And at the end of the day when multiplatform games come out, the majority of people won't be able to tell the difference between the platforms.

Regards,
SB
 
Last comment on this, I promise :) ...
And then I wade in! :mrgreen: Remote rendering is good for Remote Play. "Play your XB1 titles anywhere, on any device". Remote access is a standard technology now across devices and OSes. However, designing your game to render complex scenes for streaming to a subset of devices as a game enhancer is bad business practice from the POV of developers. See Wii U talk for why it's not a super convenient addition to a lot of games. You want to be optimising your Ryse or TitanFall to render your lovely HD graphics without worrying about designing a fancy overhead 3D view for a small percentage of your audience who may want to use it. There's enough trouble getting devs to target your official peripherals even without a high cost to the rest of the game. Hoping they'd spend significant resources rendering out to a mobile device would be foolhardy. Shifting the mobile app to the mobile hardware means your local copy is unaffected for all players, rather than being 10% diminished for all players in support of 1% of players who benefit.

Here the argument seems to look to a second GPU to power this remote device, like adding a second GPU in Wii U to drive the screen (which many of us suggested would be better served with a decent mobile chipset in the Wuublet instead). Here the logic falls apart completely. Who would design their hardware to have a second GPU to render to mobile devices as an optional feature that devs are unlikely to target in a big way - it's a significant expense for a feature that isn't a key USP for your platform? Who would build their console with such hardware and then not show it to anyone?? Who would then go to great lengths to hide the existence of the second GPU from everyone, creating a fake motherboard without the second GPU, writing developer documents that don't even mention the second GPU, let alone describe how it connects to the RAM and the rest of the system, and engaging in press releases and interviews without mentioning the second GPU? Why would someone designing such a system not just put a few extra CUs on the GPU??

By looking at the hardware, we know exactly what is in the box. We know what it's capable of rendering. We know devs can choose to render on the console and stream to another device if they want (it's just a video-over-IP protocol with an input stream, and with hardware encoding, this is easy to support), or they can choose to have the device do the rendering (it's just a mobile app with a communications protocol). There's no need, nor sense, in looking at a remote play feature and working back from that to the idea that there must be more hardware than currently described because the hardware in the console isn't enough to drive two displays.
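
For what it's worth, here's a minimal sketch of what that "video-over-IP protocol with an input stream" could look like at the socket level. This is purely illustrative Python under my own assumptions: encode_h264() is a placeholder and none of the names correspond to any actual Xbox One or SmartGlass API.

```python
# Illustrative sketch only: render on the console, stream compressed video to
# the companion device, and receive touch/sensor input back over Wi-Fi.
import socket
import struct

def encode_h264(frame_rgb: bytes) -> bytes:
    """Placeholder for a hardware H.264 encoder; a real one would shrink the frame heavily."""
    return frame_rgb

def stream_frames(frames, device_addr=("192.168.1.50", 9000)):
    """Send length-prefixed encoded frames to the companion device over TCP."""
    with socket.create_connection(device_addr) as conn:
        for frame in frames:
            blob = encode_h264(frame)
            conn.sendall(struct.pack("!I", len(blob)) + blob)

def companion_inputs(listen_port=9001):
    """Yield raw input events (touch, gyro, text) sent back by the companion."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", listen_port))
        while True:
            event, _addr = sock.recvfrom(1024)
            yield event
```

The point is just that nothing in this flow needs extra silicon beyond the hardware video encoder the console already has.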
 
Sure, so basically now the lowest common denominator with regards to...

GPU - Xbox One.
CPU - PS4.

And at the end of the day when multiplatform games come out, the majority of people won't be able to tell the difference between the platforms.

Regards,
SB

MS may have ended up with a slight CPU advantage anyway with those discrete audio processors. That may be worth as much as a CPU core in general, since most games have audio.
 
IMO, the increase in FLOP performance due to the upclock isn't as important as general single-threaded performance. Chances are the highly parallel and FLOP-heavy processing will already be run on the GPU.

This should help all the non-parallel code where non-SIMD processing is done (general integer, branching, control, etc.)
 
Are you suggesting - by using the phrase "the dGPU" - that there is, or at least might be, an additional, discrete GPU in xbone? If you want to answer that question with "yes", then I've got news for you.

There's no discrete GPU in xbone.

Why:
* We've seen the motherboard (prototype) with no trace of any dGPU.
* No mention of any dGPU by MS in their reveal presentation, or at E3.
* John Carmack - a man known for never bullshitting - has already said PS4 and xbone are very close, so unless you're saying the PS4 also has a dGPU then there can't be one in xbone.
Also:
* Cooling vents on the case don't match up with any other major heat-generating device other than the main APU and its single heatsink/fan.
I gotta go to work, but they say the dGPU is somewhere in the SoC (thanks to javisoft for the pic).


When did they say anything of the sort at Hot Chips?
I haven't seen the presentation myself because it hasn't been released, but that's the first I've heard of it.
SHAPE is in the SoC and also wasn't discussed; was that also under NDA?
Or did they just choose not to discuss it?
That NDA thing is just a massive leap.

As for the LinkedIn pages, I've seen everyone from developers to CEOs massage the truth on those things, so I wouldn't read too much into that either.

Is it possible? Maybe, but nothing that has been presented, even taken all together, leads to that conclusion.


Look in the ESRAM astrophysics spin-off thread.
Apparently Microsoft said that they couldn't reveal everything about the console until next month, so they didn't disclose some extra details because of NDA stuff. I can't elaborate on it now. Cheers.
 
Via an email I received from the Xbox indie program. Hmm, how do I quote from my phone??

We are also going to be sharing access to platform development white papers and forums in the coming weeks and months so that you can prepare to build your game with a clear understanding of the Xbox One platform.

So it will be interesting to see what all we get.

Now for some /crazy talk

dGPU is fun for talk but I just don't see how. ;)
 
Via an email I received from the Xbox indie program. Hmm, how do I quote from my phone??



So it will be interesting to see what all we get.

Now for some /crazy talk

dGPU is fun for talk but I just don't see how. ;)

Ooh yes, hope that stuff leaks onto the web so some of us hobby programmers can start reading into the API used.

Edit: Dual GPU sounds so cray cray...
 
I dunno; if you wanna go into conspiracy stuff, this old GAF post by Penello always struck me as pretty weird, especially the bolded part:

http://www.neogaf.com/forum/showpost.php?p=66980146&postcount=542
I do want to remind people – this interview was done six weeks ago, before E3. It's not like we just decided to talk about it. So it wasn't that I just decided to call up OXM and give them my opinion :)

I would like to pose this question to the audience. There are several months until the consoles launch, and any student of the industry will remember, specs change.

Given the rumored specs for both systems, can anyone conceive of a circumstance or decision one platform holder could make, where despite the theoretical performance benchmarks of the components, the box that appears “weaker” could actually be more powerful?

I believe the debate on this could give some light to why we don’t want to engage in a specification debate until both boxes are final and shipping.

It could just mean the ESRAM though. But it never made perfect sense to me.
 