Apple is an existential threat to the PC

Discussion in 'PC Industry' started by MfA, Apr 3, 2018.

  1. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    Yep, based on IPC alone, Apple seems years ahead. The single-threaded tests appear limited by a 3.2GHz peak clock, versus 4.5GHz or more for Intel/AMD.
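    Sketching the arithmetic behind that comparison (all numbers here are illustrative assumptions, not measurements):

```python
# Single-thread throughput modelled as IPC x clock (illustrative numbers only).
def perf(ipc, ghz):
    """Relative single-thread performance."""
    return ipc * ghz

wide_slow = perf(7.0, 3.2)    # hypothetical wide core at a 3.2GHz peak
narrow_fast = perf(5.0, 4.5)  # hypothetical narrower core at 4.5GHz

print(wide_slow, narrow_fast)  # 22.4 vs 22.5: similar result, very different designs
```

    The point is only that a large IPC advantage can offset a large clock deficit; the specific IPC figures above are made up.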
     
    Laurent06 likes this.
  2. Laurent06

    Veteran

    Joined:
    Dec 14, 2007
    Messages:
    1,091
    Likes Received:
    489
    That's another advantage they have, but again that's not enough to explain their lead :) They also have one of the best, if not the best, microarchitectures, and a rather large silicon area. Their lead can't be reduced to a few things; it's multidimensional, and that's what makes it so impressive IMHO.
     
  3. Laurent06

    Veteran

    Joined:
    Dec 14, 2007
    Messages:
    1,091
    Likes Received:
    489
    That's right, some people started making silly claims. But there are as many fanboys as there are haters. It's always funny to see x86 lovers trying to find excuses; for many years they were saying it was only a phone chip optimized to run Geekbench, that Apple would never kick out Intel, and that they'd better choose AMD. Some are still trying to make it look like it's nothing. As always, the truth is between those extremes, though I find those in denial funnier to read.
     
    Scott_Arm likes this.
  4. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,929
    Likes Received:
    5,528
    Location:
    Pennsylvania
    Apple also has the big advantage of not having to deal with decades of legacy CPU compatibility: an SoC that can be tuned for their OS and vice versa. It's something few platform companies can possibly achieve.
     
    Entropy, milk and Laurent06 like this.
  5. Laurent06

    Veteran

    Joined:
    Dec 14, 2007
    Messages:
    1,091
    Likes Received:
    489
    Yes, another advantage. They could get rid of 32-bit application code within a few years, while Intel/AMD still have to support their 16-bit ISA and Windows still has to support IA32.
     
  6. pipo

    Veteran

    Joined:
    Jun 8, 2005
    Messages:
    2,628
    Likes Received:
    30
    It's a strategic choice. Apple regularly forces devs (and partners and customers, for that matter) to move forward. It might be harder for other companies to pull off, but still, it's a choice.
     
  7. DSoup

    DSoup Series Soup
    Legend Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    16,775
    Likes Received:
    12,690
    Location:
    London, UK
    There are a fair few games with native Apple Silicon code, the biggest probably being World of Warcraft, but most are not what you would call graphically-demanding games or games with Windows versions.

    They did, in 2019: macOS Catalina dropped support for 32-bit apps. There are obviously ways to run them if you really need to, through virtualisation of a 32-bit macOS system.
     
    #667 DSoup, Oct 26, 2021
    Last edited: Oct 26, 2021
  8. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    I think that is a luxury only Apple can afford atm, since they are fully vertically integrated and able to supply their own SoC. Technically, there is nothing stopping AMD and Nvidia from taking a big-die/conservative-clock approach with a much better performance/power ratio, but it wouldn't be economically viable for them.
     
    PSman1700 likes this.
  9. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,661
    Likes Received:
    1,114
    Comparing IPC is pointless when the M1 is clearly designed around a much lower operating frequency.

    Cheers
     
    Laurent06 likes this.
  10. Wesker

    Regular

    Joined:
    May 3, 2008
    Messages:
    299
    Likes Received:
    186
    Location:
    Oxford, UK
    iPhones, like Macs, don't have the marketshare lead of Android. Apple isn't interested in selling <$299 phones to the mass market, which inevitably makes up the vast majority of the market.

    Where the iPhone dominates is in the enthusiast, premium, and flagship markets. As such, if you broke down smartphone revenue or profit share by manufacturer, Apple dominates everyone else.

    ...OK...

    I suggest you look at the benchmark numbers again for M1 and M1 Pro/Max.

    The "special use" cases you refer to are encoding/decoding, editing, compiling, rendering, simulations... You know, the kind of stuff that people who rely on their computer to earn an income do.

    The reality is that Apple Silicon is changing the paradigm: It dominates in the vast majority of casual use and productivity tasks, and really only struggles with games (which are clearly just bad ports or run in x86 emulation). Gaming is becoming the "special use" case where PCs retain an advantage.

    The M1 MacBook Air pretty much decimated the important $900-1100 price range. Its performance per watt, display and build quality, and battery life were unrivalled in that class. Unless you absolutely required Windows, there was basically no point in purchasing a Windows laptop.

    Windows laptops with a 3080M aren't exactly chump change.
    A 14" MacBook Pro with a top-bin M1 Max (and requisite 32GB of RAM) is $3300. You're also getting a laptop with arguably one of the best displays (mini-LED) and sound systems in the industry. Not to mention battery life (17 hours video playback, 11 hours web surfing), 7.4GB/s SSD, and TB4 I/O.

    I'm not sure where you're finding competing devices with a 3080M for $660, $825, or even $1100 as you suggested.
     
  11. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    It turns out that tile-based rendering architectures weren't a good fit for high-end graphics performance all along. The outcome could've been far worse if games had started using indirect rendering to push higher geometry density, because sorting earlier in the graphics pipeline, as tile-based GPUs do, would've had significant negative consequences. Most high-end graphics developers don't care about optimizing their render passes, or will never deal with it, so it's no surprise the M1 Max does poorly at higher resolutions, especially with deferred renderers, where the cost of G-buffers is proportional to the resolution. Not optimizing render passes effectively throws away the advantage of on-chip tile memory, and the problem is only going to get more dreadful as time goes on, since developers are planning to balloon the size of the G-buffer by storing more attributes to specialize/optimize their RT shaders ...

    IMR architectures have been the industry standard for high-end graphics for years now, because it's easier for developers to extract more performance out of them in this case ...
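    The resolution scaling mentioned above can be made concrete with a back-of-the-envelope calculation (the attachment count and format are assumptions for illustration):

```python
# Rough G-buffer footprint: resolution x attachments x bytes per pixel.
def gbuffer_mb(width, height, attachments=4, bytes_per_attachment=8):
    # e.g. four RGBA16F render targets at 8 bytes per pixel each (assumed layout)
    return width * height * attachments * bytes_per_attachment / 2**20

print(gbuffer_mb(1920, 1080))  # ~63 MB at 1080p
print(gbuffer_mb(3840, 2160))  # ~253 MB at 4K -- 4x the memory traffic
```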
     
    Malo and PSman1700 like this.
  12. Clukos

    Clukos Bloodborne 2 when?
    Veteran

    Joined:
    Jun 25, 2014
    Messages:
    4,688
    Likes Received:
    4,353
    TBDR is still a very good fit for deferred rendering, as you can avoid writing the G-buffer out to system memory and sampling it back entirely. But it practically requires a different rendering path, so it's unlikely we'll see such optimisations in most games ported to Mac.
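    A rough model of why keeping the G-buffer on chip matters (the sizes and traffic figures are illustrative, not measured):

```python
# Estimated off-chip DRAM traffic for the G-buffer in one deferred frame.
def gbuffer_dram_mb(width, height, attachments=4, bytes_each=8, on_chip=False):
    size_mb = width * height * attachments * bytes_each / 2**20
    # Immediate-mode path: write the G-buffer out, then read it back for lighting.
    # TBDR path with memoryless attachments: it never leaves on-chip tile memory.
    return 0.0 if on_chip else 2 * size_mb

print(gbuffer_dram_mb(3840, 2160))                # ~506 MB per frame off-chip
print(gbuffer_dram_mb(3840, 2160, on_chip=True))  # 0.0 when kept in tile memory
```

    This ignores everything except the G-buffer itself, but it shows why a port that doesn't restructure its render passes forfeits the tile-memory advantage.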
     
  13. Nebuchadnezzar

    Legend

    Joined:
    Feb 10, 2002
    Messages:
    1,060
    Likes Received:
    328
    Location:
    Luxembourg
    You don't know that. Apple purposefully avoids top frequency and higher voltages just to keep efficiency up.
     
    Gubbi and PSman1700 like this.
  14. PSman1700

    Legend

    Joined:
    Mar 22, 2019
    Messages:
    7,118
    Likes Received:
    3,088
    Don't want to quote the whole post, but oh wow. Well, let's agree then: Apple silicon (and software) is the best, and Apple PCs are where it's at. So when do you expect the turnover, as per the topic, where Apple PCs will have the market share that Windows/MS PCs have now (for gaming as well)? With the release of the M1 laptop last year, the thing being on another level compared to similarly priced Windows laptops, why aren't people going for it instead of Windows-based laptops?

    To clarify (as per your post), I personally wouldn't mind Apple instead of Microsoft. If the latter evaporates into thin air (Windows devices and probably consoles), then I'd guess developers will port and focus their games on Apple PCs/hardware. In fact, I'd be happy to have monstrous performance at a fraction of the power draw, which also means smaller PCs, or just a laptop would suffice.
    It doesn't matter what's powering my boxes, be it Apple or something else; whoever has the performance is king.
     
  15. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    Milan Zen server processors top out at around 4GHz "boost", it seems, with 64-core processors having a maximum boost of 3675MHz (7713):

    Epyc - Wikipedia

    so "clearly" is not looking like a justifiable excuse.
     
    Gubbi likes this.
  16. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    [IMG]
    This is an example of something that has been optimised for the M1.

    Unfortunately, almost no games are optimised for macOS and Metal, but productivity applications really shine and show the power of these SoCs.

    Performance is the same whether plugged in or not, which is impressive. Can't say that for any of the competing products.

    The potential is certainly there, but no game developer has gone the extra mile yet.
     
    #676 Pressure, Oct 27, 2021
    Last edited: Oct 27, 2021
    Wesker likes this.
  17. DSoup

    DSoup Series Soup
    Legend Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    16,775
    Likes Received:
    12,690
    Location:
    London, UK
    A lot of games are written in common engines, like Unreal Engine and Unity, that do a very good job of optimising for Apple's Metal API. World of Warcraft was pretty much the first big game to support Metal, and other Mac-friendly developers like Creative Assembly (Total War, Alien Isolation), Paradox (Stellaris, Crusader Kings, Hearts of Iron, Europa Universalis IV, Surviving Mars), and Firaxis (Civilization) embraced native Metal graphics engines years ago.

    What do people think games are using to drive graphics on Mac? The deprecated OpenGL 4.1 driver code from July 2010? The first Metal API was introduced seven years ago. :yep2:
     
    Pressure likes this.
  18. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,397
    Logo: pen down, move, rotate turtle.
     
  19. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,661
    Likes Received:
    1,114
    But I do know that. The L1 cache's size and latency are a very clear indicator that the M1 cannot clock much higher.

    As you say, this is a purposeful design choice.
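    One way to make the cache argument concrete (the access time below is an assumed figure, not a measured one):

```python
import math

# A fixed L1 array access time costs more cycles as the clock rises.
def l1_cycles(access_ns, clock_ghz):
    return math.ceil(access_ns * clock_ghz)

# Assume a large L1 needs ~0.9 ns per access (illustrative assumption):
print(l1_cycles(0.9, 3.2))  # 3 cycles at 3.2 GHz
print(l1_cycles(0.9, 5.0))  # 5 cycles at 5.0 GHz -- same array, higher latency
```

    So a big, low-cycle-latency L1 is only cheap if you commit to a modest clock, which is why the cache configuration hints at the intended frequency target.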

    For active logic, dynamic power scales with frequency times voltage squared, and since voltage must rise with frequency, power rises roughly with the cube of frequency. So building a slower and much wider CPU core increases efficiency while (at least!) maintaining performance. It also costs more, because silicon is not free. Some structures depend less on switching time and more on wire delays; these, like the ROB, schedulers, and caches, can be enlarged without hitting the cycle-time limit.
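    The cube relationship follows from the usual dynamic-power model, sketched here with illustrative numbers:

```python
# Dynamic power ~ C * V^2 * f; if voltage must scale roughly with frequency,
# power grows roughly as f**3 (simplified model; ignores leakage, C changes, etc.).
def relative_power(freq_ratio):
    voltage_ratio = freq_ratio  # crude assumption: V scales linearly with f
    return voltage_ratio**2 * freq_ratio

print(relative_power(4.9 / 3.2))  # ~3.6x the power for ~1.5x the clock
```

    Real voltage/frequency curves are not linear, but the model shows why trading clock speed for width pays off so heavily in perf/W.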

    AMD's Zen/2/3 are very clearly engineered for high-end desktop. Intel has three large market segments: mobile, datacenter, and desktop; the latter two demand fast processors. I'm kind of surprised Intel hasn't done a mobile-centric design à la M1, with the kind of market share and resources they have. The big/little thing they have going on is stillborn IMO.

    Cheers
     
  20. Gubbi

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,661
    Likes Received:
    1,114
    You can probably find a low-power mobile U SKU that does even less. It doesn't change the fact that the same microarchitecture does 4.9GHz on an inferior process node.

    Cheers
     