> sure why not

Because HEVC encoding is, by all accounts, very processor intensive; you don't get realtime encodes while also spending 100% of your console's resources on running a game. The realtime software encoders you may point to are doing nothing but encoding, dedicating a large slice of CPU and/or GPU silicon to the work. Here are some early x265 benchmarks: over 100 seconds to encode 500 frames on an i7. That's 5 fps encode speed. How is XB1 supposed to run a game and a software encoder in realtime?
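To put numbers on that, here's a back-of-the-envelope sketch; the 500-frame/100-second figures come from the benchmark above, while the 60 fps target and the share of CPU left over during gameplay are my own illustrative assumptions:

```python
# Rough math on why ~5 fps software HEVC encoding can't keep up with gameplay.
# Frame count and timing are from the x265 benchmark quoted above; the target
# frame rate and leftover-CPU share are illustrative assumptions.

frames_encoded = 500
encode_seconds = 100.0                        # "over 100 seconds to encode 500 frames"
encode_fps = frames_encoded / encode_seconds  # ~5 fps with the whole CPU dedicated

target_fps = 60                               # a typical gameplay/stream target
print(f"dedicated-CPU encode speed: {encode_fps:.1f} fps")
print(f"speedup needed for {target_fps} fps: {target_fps / encode_fps:.0f}x")

cpu_share_while_gaming = 0.2                  # assume the game leaves ~20% of the CPU
print(f"plausible in-game encode speed: {encode_fps * cpu_share_while_gaming:.1f} fps")
```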
> No, xbox one does not have hardware support for HEVC. It's impossible.

Can you please qualify your assertions with some data? Unsupported assertions don't make for good conversation, as there are no points to debate.

> Can you please qualify your assertions with some data?

Nobody had an HEVC decoder when the APU was made. AMD's first implementation is in Carrizo, which isn't out yet. Kaveri has to use a software implementation (maybe using OpenCL or HSA), which still needs GPU and CPU resources. http://forums.anandtech.com/showthread.php?t=2387592

> Nobody had HEVC decoder when APU was made.

That doesn't mean Xbox or PS4 can't have it.
> So either HEVC is the wrong definition, or there is a possibility, despite how unlikely it is, that there is a hardware encoder, going off linkedin profiles of people suggesting they can.

They could be using h.264 to stream like everyone else; h.264 is already supported in GCN's VCE. HEVC is extremely unlikely due to the timelines: it only got its first final draft in April 2013, mere months before XB1's silicon needed to be final. Unless AMD had been working on HEVC hardware off the WIP codec and had something ready to plop in at the last minute, it's unrealistic. Then there's the fact that AMD haven't bothered to add the same HEVC block in their other APUs, including laptop parts that would benefit from it for streaming. And nVidia didn't have an HEVC encoder until 2014 in the 970/980. So AMD created an HEVC encoder block in 2013 but didn't include it in their own GPUs until 2015?
> They could be using h.264 to stream like everyone else. h.264 is already supported in GCN's VCE. [...]

Agreed. MS did indicate that they went ahead of AMD and completed their own customizations to their GPU, features that could later show up on AMD cards; the list of what was customized is still fairly unknown. Given that this device was first launched and developed as an all-in-one entertainment device, I don't think it's impossible, just highly unlikely. I'm just going to wait and see. We will know soon enough though; given the trend of hardware and Xbox, IMO you are likely the closest to pinning the tail on the donkey.

Best chance is maybe if it was an MS in-house design.
> No, that's impossible. Near final silicon should be available almost a year before launch. That's 2012.

You and I clearly have different dictionary definitions of 'impossible'.
MS and Sony are using AMD's h.264 encoder/decoder.
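On the PC side, the practical difference between using that fixed-function block and burning CPU is just which encoder you select; a minimal sketch (assuming a machine with an AMD GPU and an ffmpeg build with AMF support; file names are placeholders):

```python
# Sketch: hardware h.264 encode on AMD's VCE block (via ffmpeg's AMF wrapper)
# versus the libx264 software path. Assumes ffmpeg built with AMF support and
# an AMD GPU; the hardware path barely touches the CPU.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "gameplay.mp4",
    "-c:v", "h264_amf",   # fixed-function VCE encode; swap in "libx264" for software
    "vce_out.mp4",
], check=True)
```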
What exactly makes this 'impossible', as in 0%? Unless you have a complete and clear view into what is being researched at AMD and/or MS, I don't see how you can use such absolute terms. You should know that I have many Apple devices that contained 802.11n wireless before the standard was complete (the final standard was published in 2009, after all revisions); my MacBooks are 2,1 models from 2006.

I don't disagree that it is likely using h.264. But not leaving even a little percent chance that you could be wrong is foolish.
I'd say that 0%, or close to 0%, is probably the right answer. I can't recall the chipset, but at least one version of the GeForce or Radeon had a broken h.264 decode block that wound up as wasted silicon. The key thing with video standards is that to make a hardware encoder, the standard has to be very nailed down. Software encoders can easily be tweaked to take into account different iFrame intervals, quantisation variables, etc., but a hardware encoder usually produces only one flavour of h.265; if you've built a h/w block and a last-minute change alters one of those variables, you are stuffed.

WiFi NICs are different because, frankly, most of the yet-to-be-agreed changes in the n standard related to the backend management protocols; everyone had already agreed on the waveforms and basic handshakes (i.e. use Cisco's extensions to a/b/g), since it was an evolution of the pre-existing 802.11 standard. By contrast, a year prior to release there was still debate about the focus of h.265: pure bandwidth optimisation, or accepting a bit more b/w usage for an easier downstream decode (important for h.265 decode h/w blocks).

Of course, hey, maybe there is some h/w encoding in there, in which case well done MS, you took a big gamble and it paid off.
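To illustrate the flexibility gap: in a software encoder, the variables above are runtime options, so a late spec change is a flag edit rather than a silicon respin. A minimal sketch, assuming an ffmpeg build with libx265 on PATH (file names and values are arbitrary examples):

```python
# Sketch: iFrame interval (keyint) and quantiser (qp) are runtime knobs for a
# software encoder; a fixed-function block bakes one combination into silicon.
# Assumes ffmpeg built with libx265; file names and values are illustrative.
import subprocess

for keyint, qp in [(30, 24), (60, 28), (120, 32)]:  # three different "flavours" of h.265
    subprocess.run([
        "ffmpeg", "-y", "-i", "input.mp4",
        "-c:v", "libx265",
        "-x265-params", f"keyint={keyint}:qp={qp}",
        f"out_k{keyint}_q{qp}.mp4",
    ], check=True)
```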
> I'd say that 0% or close to 0% is probably the right answer. [...]

Indeed, or... the fixed-function HEVC encoders/decoders are coming for the next revision of Xbox/PS4.

> the fixed-function HEVC encoders/decoders are coming for the next revision of Xbox/PS4.

Still, though, that would be a significant change to the silicon they're shipping today. How awkward would it be advertising:

"PS4/XB1 with 4K Streaming!!!*
*post-201x manufactured new consoles only"

It seems like the kind of thing that would needlessly annoy existing owners. A s/w codec can handle decoding h.265 for any streaming-service app like Amazon/Netflix/etc., and the known h.264 h/w encode block is fine for encoding gameplay anyway, so why bother with a h/w h.265 encode or decode block at all? It's not like these are 4K gaming consoles where the b/w optimisations of h.265 really start to pay off.
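For a sense of scale on that last point, a rough illustration (ballpark streaming bitrates of my own choosing, not measurements):

```python
# Ballpark of where h.265's roughly-half-the-bitrate-of-h.264 rule of thumb pays off.
# Bitrates are rough streaming figures chosen for illustration, not measurements.
ballpark_h264_mbps = {"1080p": 8, "4K": 32}

for res, h264 in ballpark_h264_mbps.items():
    h265 = h264 / 2  # rule of thumb: comparable quality at about half the bitrate
    print(f"{res}: h.264 ~{h264} Mbps vs h.265 ~{h265:.0f} Mbps "
          f"(saves ~{h264 - h265:.0f} Mbps)")
```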
> Here's some early x265 benchmarks: over 100 seconds to encode 500 frames on i7. That's 5 fps encode speed. How is XB1 supposed to run a game and a software encoder in realtime?

This. Obviously it depends on the quality settings, but every implementation of H.265 encode that I've tried has brought my i7 to its knees, even when using the more basic profiles.
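If you want to reproduce that kind of test yourself, a minimal sketch (assuming an ffmpeg build with libx265 on PATH; clip.mp4 is any short test clip):

```python
# Sketch: time a software HEVC encode and report frames per second.
# Assumes ffmpeg built with libx265; clip.mp4 is a placeholder test clip.
import subprocess, time

FRAMES = 500  # match the 500-frame benchmark quoted above

start = time.time()
subprocess.run([
    "ffmpeg", "-y", "-i", "clip.mp4",
    "-frames:v", str(FRAMES),    # stop after 500 encoded frames
    "-c:v", "libx265",
    "-preset", "medium",         # heavier presets hit the CPU much harder
    "-f", "null", "-",           # discard the output; we only care about speed
], check=True)
elapsed = time.time() - start

print(f"{FRAMES} frames in {elapsed:.1f}s -> {FRAMES / elapsed:.1f} fps")
```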
> Indeed, or... the fixed-function HEVC encoders/decoders are coming for the next revision of Xbox/PS4.

Why would they alienate early adopters? They'll just use a software solution for the decoder.