Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
sure why not
Because HEVC encoding is very processor-intensive by all accounts, and you don't get realtime encodes while also spending 100% of your console resources on running a game. Those realtime software encoders you may point to do nothing but encode, dedicating a large slice of CPU and/or GPU silicon to the work. Here are some early x265 benchmarks - over 100 seconds to encode 500 frames on an i7. That's 5 fps encode speed. How is the XB1 supposed to run a game and a software encoder in realtime?
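The arithmetic behind that benchmark claim can be sketched out. The 500-frame and ~100-second figures are taken from the quoted benchmark; the 30 fps target is an assumption for realtime streaming:

```python
# Rough arithmetic from the quoted x265 benchmark: ~500 frames in ~100 s on an i7.
frames = 500
encode_seconds = 100.0

fps = frames / encode_seconds  # achieved software encode rate
print(f"encode speed: {fps:.1f} fps")

# Realtime 30 fps streaming leaves ~33 ms per frame; the benchmark
# suggests x265 needed ~200 ms per frame on a full desktop i7.
budget_ms = 1000.0 / 30
actual_ms = 1000.0 * encode_seconds / frames
print(f"budget: {budget_ms:.1f} ms/frame, measured: {actual_ms:.1f} ms/frame")
print(f"shortfall: {actual_ms / budget_ms:.1f}x too slow, before running a game")
```

Even before reserving anything for the game itself, the measured rate is a factor of six short of realtime on a much stronger CPU than the XB1's Jaguar cores.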
 
Nobody had an HEVC decoder when the APU was made.
AMD's first implementation is in Carrizo, which isn't out yet.
Kaveri has to use a software implementation (maybe using OpenCL or HSA), which still needs GPU and CPU resources. http://forums.anandtech.com/showthread.php?t=2387592
That doesn't mean the Xbox or PS4 can't have it.
You should be careful with absolutes. We do know that Windows 10 is coming to the Xbox, and we know it will stream games from the Xbox to Windows 10 devices in real time at 1080p@30fps.
Given how the Xbox already struggles for game performance, it's clearly not a software-based solution running while a game is active. I just don't see how the game would have spare cycles for something like that; if it did, those cycles should have been put towards games, yeah?

So either HEVC is the wrong label, or there is a possibility, however unlikely, that there is a hardware encoder.
 
Now that it's out in the wild finally, I can also say that the Xbox One is FL12_0. Lol. That was more or less the last piece of information I was looking for, and it was confirmed to me at //Build/. But I didn't think AMD would outright say something like this.

http://m.neogaf.com/showthread.php?t=1056976
 
So either HEVC is the wrong label, or there is a possibility, however unlikely, that there is a hardware encoder.
They could be using h.264 to stream like everyone else; h.264 is already supported by GCN's VCE. HEVC is extremely unlikely due to the timelines. It only got its first final draft in April 2013, mere months before XB1's silicon needed to be final. Unless AMD had been working on HEVC hardware off the WIP codec and had something ready to plop in at the last minute, it's unrealistic. Then there's the fact that AMD haven't bothered to add the same HEVC block to their other APUs, including laptop parts that would benefit from it for streaming. And nVidia didn't have an HEVC encoder until 2014 in the 970/980. So AMD created an HEVC encoder block in 2013 but didn't include it in their own GPUs until 2015. :???:

Best chance is maybe if it was an MS in-house design.
 
going off linkedin profiles of people suggesting they can

Beware this very thing: unless you are a person of exceptional honesty, I'm sure you've applied some embellishment to your own career on CVs. "Worked on s/w HEVC" != "produced a working encoder on one core"; it could also mean "wasted 6 months of my life proving water does not flow uphill".

There is not a hope in hell of producing a working realtime HEVC encoder on one Jaguar core. I'd doubt you could even get a decoder working on that modest budget, let alone the far more computationally expensive encoder.

If MS have the Carrizo HEVC block in their ASIC then they're good; otherwise the streaming is an all-h.264 affair. The HEVC support could well be a software decoder used by apps, which have more cores available to them than the system reserve does.
 
They could be using h.264 to stream like everyone else; h.264 is already supported by GCN's VCE. HEVC is extremely unlikely due to the timelines. It only got its first final draft in April 2013, mere months before XB1's silicon needed to be final. Unless AMD had been working on HEVC hardware off the WIP codec and had something ready to plop in at the last minute, it's unrealistic. Then there's the fact that AMD haven't bothered to add the same HEVC block to their other APUs, including laptop parts that would benefit from it for streaming. And nVidia didn't have an HEVC encoder until 2014 in the 970/980. So AMD created an HEVC encoder block in 2013 but didn't include it in their own GPUs until 2015. :???:

Best chance is maybe if it was an MS in-house design.
Agreed. MS did indicate that they went ahead of AMD and added their own customizations to their GPU, features that could later show up on AMD cards; the list of what was customized is still largely unknown. Given that this device was first pitched and developed as an all-in-one entertainment device, I don't think it's impossible, just highly unlikely. I'm just going to wait and see. We will know soon enough, though; given the trend of hardware and Xbox, IMO you are likely the closest to pinning the tail on the donkey.
 
No, that's impossible. Near final silicon should be available almost a year before launch. That's 2012.

MS and Sony are using AMD h264 encoder/decoder.
 
Again, the Xbox already struggles with CPU and GPU resources. It would not be possible to stream a game to the PC in software, so there must be some kind of hardware the games aren't using; 2% reserved GPU and 1.5 CPU cores are not enough.
Can the output of the known encoder block somehow get converted?
 
No, that's impossible. Near final silicon should be available almost a year before launch. That's 2012.

MS and Sony are using AMD h264 encoder/decoder.
You and I clearly have different dictionary definitions of 'impossible'.
Rosetta was launched in 2004 and landed, though abruptly, on the comet 10 years later.

What exactly makes this 'impossible', as in 0%? Unless you have a complete and clear view into what is being researched at AMD and/or MS, I don't see how you can use such absolute terms. You should know that I have many Apple devices that contained 802.11n wireless before the standard was complete (the final draft was published in 2009, after all revisions); my MacBooks are the 2,1 model from 2006.

I don't disagree that it is likely using h264. But not leaving even a small chance that you could be wrong is foolish.
 
What exactly makes this 'impossible', as in 0%? Unless you have a complete and clear view into what is being researched at AMD and/or MS, I don't see how you can use such absolute terms. You should know that I have many Apple devices that contained 802.11n wireless before the standard was complete (the final draft was published in 2009, after all revisions); my MacBooks are the 2,1 model from 2006.

I don't disagree that it is likely using h264. But not leaving even a small chance that you could be wrong is foolish.

I'd say that 0%, or close to 0%, is probably the right answer. I cannot recall the chipset, but at least one version of the Geforce or Radeon shipped with a broken h264 decode block that wound up as wasted silicon. The key thing with video standards is that in order to make a hardware encoder, the standard has to be very nailed down. Software encoders can easily be tweaked to take into account different I-frame intervals, quantisation parameters, etc., but a hardware encoder usually produces only one flavour of H.265. If you've made a h/w encoder and a last-minute change alters one of those variables, you are stuffed.

WiFi NICs are different because, frankly, most of the yet-to-be-agreed changes in the n standard related to the back-end management protocols; everyone had already agreed the waveforms and basic handshakes (i.e. use Cisco's extensions to a/b/g), as it was an evolution of the pre-existing 802.11 standard. By contrast, a year prior to release there was still debate about whether H.265 should focus on pure bandwidth optimisation or accept a bit more b/w usage for an easier downstream decode (important for H.265 decode h/w blocks).

Of course hey maybe there is some h/w encoding in there in which case well done MS you took a big gamble and it paid off.
 
I'd say that 0%, or close to 0%, is probably the right answer. I cannot recall the chipset, but at least one version of the Geforce or Radeon shipped with a broken h264 decode block that wound up as wasted silicon. The key thing with video standards is that in order to make a hardware encoder, the standard has to be very nailed down. Software encoders can easily be tweaked to take into account different I-frame intervals, quantisation parameters, etc., but a hardware encoder usually produces only one flavour of H.265. If you've made a h/w encoder and a last-minute change alters one of those variables, you are stuffed.

WiFi NICs are different because, frankly, most of the yet-to-be-agreed changes in the n standard related to the back-end management protocols; everyone had already agreed the waveforms and basic handshakes (i.e. use Cisco's extensions to a/b/g), as it was an evolution of the pre-existing 802.11 standard. By contrast, a year prior to release there was still debate about whether H.265 should focus on pure bandwidth optimisation or accept a bit more b/w usage for an easier downstream decode (important for H.265 decode h/w blocks).

Of course hey maybe there is some h/w encoding in there in which case well done MS you took a big gamble and it paid off.
Indeed, or...
the fixed-function HEVC encoder/decoder is coming in the next revision of the Xbox/PS4.
 
Indeed, or...
the fixed-function HEVC encoder/decoder is coming in the next revision of the Xbox/PS4.
Still, that would be a significant change to the silicon they're shipping today. How awkward would it be advertising:
"PS4/XB1 with 4K Streaming!!!*
*post 201x manufactured new consoles only"

It seems like the kind of thing that would needlessly annoy existing owners. A s/w codec can handle decoding H.265 for any streaming-service app like Amazon/Netflix/etc., and the known h264 h/w encode block is fine for encoding gameplay anyway, so why bother with a h/w H.265 encode or decode block at all? It's not like these are 4K gaming consoles, where the b/w optimisations of H.265 really start to pay off.
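That "pays off at 4K" point can be put in rough numbers. The bitrates below are illustrative assumptions (only the ratios matter), and the ~45% HEVC saving is the commonly cited figure rather than a measurement:

```python
# Illustrative streaming bitrates in Mbps; the absolute numbers are
# assumptions, only the relative savings matter for the argument.
h264_bitrate = {"1080p": 8.0, "4k": 32.0}  # bitrate roughly scales with pixel count
hevc_saving = 0.45  # commonly cited ~40-50% bitrate reduction for HEVC

for res, mbps in h264_bitrate.items():
    hevc_mbps = mbps * (1 - hevc_saving)
    print(f"{res}: H.264 {mbps:.1f} Mbps -> HEVC {hevc_mbps:.1f} Mbps "
          f"(saves {mbps - hevc_mbps:.1f} Mbps)")
```

At 1080p the absolute saving is a few Mbps; at 4K it is several times larger, which is where a dedicated HEVC block starts to earn its silicon.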
 
Still, that would be a significant change to the silicon they're shipping today. How awkward would it be advertising:
"PS4/XB1 with 4K Streaming!!!*
*post 201x manufactured new consoles only"

It seems like the kind of thing that would needlessly annoy existing owners. A s/w codec can handle decoding H.265 for any streaming-service app like Amazon/Netflix/etc., and the known h264 h/w encode block is fine for encoding gameplay anyway, so why bother with a h/w H.265 encode or decode block at all? It's not like these are 4K gaming consoles, where the b/w optimisations of H.265 really start to pay off.

I don't disagree.. once again, until it's out there in the flesh for everyone to see, we're left with speculation. Granted, I agree with your views on h264; MS released the Xbox without HDMI and then at a later date introduced models with HDMI. I don't see the purpose of doing more than what the Xbox has done already either. But I don't really know their road map.
 
Here are some early x265 benchmarks - over 100 seconds to encode 500 frames on an i7. That's 5 fps encode speed. How is the XB1 supposed to run a game and a software encoder in realtime?
This. Obviously it depends on the quality settings but every implementation of H.265 encode that I've tried has brought my i7 to its knees, even when using the more basic profiles.

If Xbox One is using HEVC to encode in realtime, it's only doing so with either a) highly optimised software that nobody else has access to (not impossible but unlikely) or b) specialised hardware; again not impossible, but also unlikely to have gone uncommented on before. Having said that, AMD (not Sony) confirmed the existence of TrueAudio in the PS4. It's not impossible that both Sony's and Microsoft's consoles have AMD hardware they can't reveal until AMD is ready.

So basically, anybody's guess ;)
 
I would say there's a 0% chance that it's not using h264.

I just don't believe that a 1.6GHz Atom tablet / 2-in-1 would be powerful enough to decode H.265 in software; it has h264 hardware decoding. Am I incorrect in believing this? Forgetting about killing the battery, etc.

If the Xbox could encode H.265, then I guess I can see reasons for supporting both, especially in the future.
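One rough way to frame the decode question above: software decode cost scales, to a first approximation, with pixel throughput, and HEVC costs more per pixel than h264. The resolutions and frame rates below are exact; treating pixel rate as a proxy for decode load is the simplifying assumption:

```python
# Decode workload scales roughly with pixels decoded per second.
# Pixel rate is only a crude proxy; HEVC also costs more per pixel than H.264.
def pixel_rate(width, height, fps):
    return width * height * fps

p1080_30 = pixel_rate(1920, 1080, 30)  # the quoted 1080p@30 streaming target
p4k_30 = pixel_rate(3840, 2160, 30)

print(f"1080p30: {p1080_30 / 1e6:.1f} Mpx/s, 4K30: {p4k_30 / 1e6:.1f} Mpx/s")
print(f"4K is {p4k_30 / p1080_30:.0f}x the pixel rate")
```

Even the 1080p30 stream is over 60 million pixels per second, every one of which a low-power Atom would have to handle in software without a dedicated HEVC decode block.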
 
The Hot Chips presentation for Xbox One had this:

[Image: Xbox-One-GPU-Architecture.png]

H.264 AVC/MVC

Nothing about HEVC.
 