Apple A8 and A8X

Discussion in 'Mobile Devices and SoCs' started by ltcommander.data, Sep 9, 2014.

  1. anexanhume

    anexanhume Veteran

    FWIW, Apple had been claiming a 2x CPU improvement every generation, from A4 to A5 and onward until now. The A5 did it with dual cores and OoO execution. The A6 did it with a big clock boost and a wider design. The A7 did it with an even wider design and an ISA jump. This time they simply couldn't go wider efficiently, and a clock boost would be out of character with their historical power/speed balance.

    It helps in terms of schedule by a few months at most, so I don't see it as a huge advantage. It will be interesting to see if they have the same GPU as seen in A7, though.
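    The claimed 2x-per-generation scaling compounds quickly; a minimal sketch (generation list and the flat 2x factor are assumptions taken from the post, not measurements):

```python
# Hypothetical illustration of the claimed ~2x-per-generation CPU scaling.
GENERATIONS = ["A4", "A5", "A6", "A7", "A8"]
FACTOR_PER_GEN = 2.0

def cumulative_speedup(n_steps, factor=FACTOR_PER_GEN):
    """Relative CPU performance after n generation steps, assuming a flat factor."""
    return factor ** n_steps

for i, chip in enumerate(GENERATIONS):
    print(f"{chip}: {cumulative_speedup(i):.0f}x vs A4")
```

    Under that assumption, four doublings would put the A8 at 16x the A4.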
     
  2. rpg.314

    rpg.314 Veteran

    If there is a 12" iPad Pro with an active digitizer out, I'd like it.
     
  3. rpg.314

    rpg.314 Veteran

    No.

    The SoC TDP remains the same; hence the GPU power budget remains roughly the same. Thus any power-efficiency gains convert directly into performance gains.
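    The argument can be sketched in two lines: at a fixed power budget, performance scales directly with efficiency. The numbers below are invented purely for illustration:

```python
# At a fixed GPU power budget, performance = budget * efficiency,
# so any efficiency gain converts 1:1 into a performance gain.
def gpu_performance(power_budget_w, efficiency_gflops_per_w):
    return power_budget_w * efficiency_gflops_per_w

BUDGET_W = 2.0  # assumed fixed GPU share of the SoC TDP (illustrative)

old = gpu_performance(BUDGET_W, efficiency_gflops_per_w=50.0)
new = gpu_performance(BUDGET_W, efficiency_gflops_per_w=75.0)  # +50% efficiency
print(f"Speedup at same budget: {new / old:.2f}x")  # -> 1.50x
```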
     
  4. rpg.314

    rpg.314 Veteran

    AFAIK, VoLTE should live entirely in the modem/transceiver/baseband and not in the apps processor, except as software support.

    I could be wrong about this.
     
  5. wco81

    wco81 Legend

    I wouldn't hold my breath about anything involving a pen from Apple.
     
  6. I'm hopeful the force-sensitive touchscreen tech from the iWatch will eventually make it into the iPad. Apple may not release their own stylus, but it would enable pressure-sensitive third-party pens. Seeing as the iWatch isn't even released yet, I'm guessing there are no details yet on how force sensitivity on the iWatch works? Is it like the nVidia Tegra Note?

    This discussion might be getting off-topic for this thread though.
     
  7. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

    You most likely aren't wrong; if I read it twice from different folks in the same thread it's hardly a coincidence. Oh and blah to the sw solution :roll:
     
  8. patsu

    patsu Legend

    Where do the H.265 encoder and decoder sit?
     
  9. iMacmatician

    iMacmatician Regular

    TechNews claims that the iPad "Pro" will have an A8X and the iPad Air 2 will use the A8.
     
  10. Lodix

    Lodix Newcomer

    It is just a decoder, and it is in the A8 SoC; I don't know where exactly.
     
  11. mavere

    mavere Newcomer

    Using HEVC for FaceTime pretty much guarantees hardware encoding + decoding.
     
  12. patsu

    patsu Legend

    Exactly my thoughts.

    See their model comparison page regarding H.265:
    https://www.apple.com/iphone/compare/
     
  13. Why would they say FaceTime over cellular? Unless the coding is software-based and they use it for bandwidth reduction only.
     
  14. mavere

    mavere Newcomer

    Probably because MPEG-4 Part 2 doesn't need in-loop deblocking or CABAC and uses much less power per pixel. (Edit: I just noticed patsu linked to the compare page, which doesn't detail the codec discrepancy between Wi-Fi and cellular.)

    I think it's more reasonable to assume that they use MPEG-4 when convenient (e.g. when bandwidth is plentiful and power-cheap). Otherwise, it's H.264 when paired with older devices and HEVC when paired with A8 devices. The modem can then relax and offset the encoder/decoder's increased power profile.
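    The heuristic described above could be sketched as follows. To be clear, the function, inputs, and thresholds are all invented for illustration; the actual FaceTime negotiation logic is not public:

```python
# Hypothetical codec-selection heuristic: cheap codec when bandwidth is
# plentiful, HEVC on constrained links when the peer supports it, H.264 fallback.
def pick_codec(peer_supports_hevc, bandwidth_kbps, on_cellular):
    if not on_cellular and bandwidth_kbps > 2000:
        return "MPEG-4 Part 2"  # lowest power per pixel when bandwidth is cheap
    if peer_supports_hevc:
        return "HEVC"           # best compression for constrained cellular links
    return "H.264"              # fallback when paired with older devices

print(pick_codec(peer_supports_hevc=True, bandwidth_kbps=500, on_cellular=True))
```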

    Though even ignoring which codec they use and don't use, realtime software encoding of a brand-new codec, with limited time for computational and rate-distortion optimizations, is ridiculously demanding on a mobile platform.
     
  15. davygee

    davygee Newcomer

    I wonder when we will hear how Geekbench and GFXBench results differ when the benchmarks are tailored to run using Metal rather than OpenGL. It will be interesting to see if the framerates go up considerably.
     
  16. tangey

    tangey Veteran

    The whole point of these tests is to test performance with standard APIs.

    There is no way they are going to do Metal versions of these, IMO.
     
  17. davygee

    davygee Newcomer

    I know the point is to compare based on standard APIs across multiple platforms; I just thought it would be interesting to see how the performance changes using Metal compared to OpenGL. I see that Geekbench has been developing tools to check it out.

    https://twitter.com/jfpoole/status/513850325649072129
     
  18. tangey

    tangey Veteran

    I totally agree that it would be interesting to see how Metal improves performance over GL ES 3.0 when doing the same tasks, which would give a good indication of the performance improvements that developers might see when designing for Metal.

    However, GFXBench isn't about comparing relative API performance; it's about comparing devices running the same applications using the same APIs. I know next to nothing about graphics programming, but the little I've read suggests that it is far from straightforward to switch from ES 3.0 to Metal, and it certainly wouldn't be on the benchmark developers' radar, IMO.
     
  19. davygee

    davygee Newcomer

    Yeah, maybe not, but I don't think it will be long before we see real-world differences in games using Metal. We can already see the changes in games like Asphalt and Plunder Pirates, and these were reworked to use Metal rather than OpenGL in only a few months.

    What will be interesting is whether we start seeing an influx of Metal-enhanced games, and whether that gives iOS gaming an edge over Android gaming.
     
  20. pcchen

    pcchen Moderator Moderator Veteran Subscriber

    If you use game engines (such as Unity 3D) then it can be relatively straightforward to switch to Metal, if the game engine supports it.
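    The reason engine users get this nearly for free is that the engine abstracts the graphics backend, so a game selects a renderer rather than rewriting its draw calls. A minimal sketch of that idea (class and method names are invented for illustration, not Unity's actual API):

```python
# Toy renderer-backend abstraction: the game code talks to Renderer,
# and the engine picks the concrete backend at startup.
class Renderer:
    def draw(self, scene):
        raise NotImplementedError

class OpenGLES3Renderer(Renderer):
    def draw(self, scene):
        return f"GLES3 drawing {scene}"

class MetalRenderer(Renderer):
    def draw(self, scene):
        return f"Metal drawing {scene}"

def make_renderer(metal_available):
    """Select the backend once; game code above this line never changes."""
    return MetalRenderer() if metal_available else OpenGLES3Renderer()

renderer = make_renderer(metal_available=True)
print(renderer.draw("frame0"))  # -> Metal drawing frame0
```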
     