Apple is an existential threat to the PC

Compiling is not something that scales linearly. From the benchmark results here:

https://github.com/devMEremenko/XcodeBenchmark

You can see that even comparing the M1 Max to the M1 (8+2 vs 4+4 cores), the M1 Max is only about 40% faster, and both are monolithic chips.
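
As a rough illustration of why it doesn't scale linearly, here's a minimal Amdahl's-law sketch. The 85% parallel fraction is an assumed number for illustration, not something measured from XcodeBenchmark, and it ignores the efficiency cores entirely:

# Minimal Amdahl's law sketch: speedup when only part of a build parallelizes.
# The 0.85 parallel fraction is an assumption for illustration, not a measurement,
# and the M1's efficiency cores are ignored.

def amdahl_speedup(parallel_fraction, n_cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

m1 = amdahl_speedup(0.85, 4)      # 4 performance cores
m1_max = amdahl_speedup(0.85, 8)  # 8 performance cores
print(f"M1 Max vs M1: {m1_max / m1:.2f}x")  # ~1.41x, i.e. roughly the ~40% gap above

Once the serial parts (linking, code signing, single-threaded project steps) dominate, adding cores buys less and less.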

Interesting, I was under the impression that this was going to scale just about perfectly, but it seems that's not really possible for everything. Especially gaming, which isn't that simple to overcome, I think.


They sure are. Raw performance, though? It's competing with a 12900/3070 in synthetic benchmarks, and it gets outperformed quite often, and by quite a lot.

Regardless, I think the performance coming out of that little 3.5-liter box, which uses less power than the 3090 alone, is impressive.
They are both tools to get work done, so use whichever suits your needs best.

That's what I have been saying all along: this Apple hardware isn't replacing, as per the topic, NV, AMD etc. hardware; it's replacing Apple's Intel offerings. The 'both get work done, use whichever suits your needs' point has been true all along, and it hasn't really changed now.

I have no clue why Apple would even compare the M1 Ultra to the 3090 to begin with.

Same here, but they did.

Anyway, efficiency is a true leap over whatever they had before and over competing x86 solutions, especially for those who can put these to work where they perform best.
 
I also don’t think the 3090 is faster at exporting large RAW images, or video for that matter. On the other hand, I don’t expect the M1 Ultra to be competitive in gaming, Blender, or Octane.

Even if Apple made a discrete GPU with four times the GPU resources of a single M1 Max die, they would still be uncompetitive in gaming.

Apple's strategy with gaming on the Mac is laughable. For starters, they need to get serious and adopt Vulkan.
 
Apple's strategy with gaming on the Mac is laughable. For starters, they need to get serious and adopt Vulkan.
Don't hold your breath. I imagine that for Apple, gaming on a Mac is a bit of a chicken-and-egg scenario. Apple might be willing to support non-Metal APIs if there were more evidence of gaming being a significant and relevant Mac market, and for gaming to become a significant and relevant Mac market, there needs to be wider support of non-Metal APIs. I think that DirectX (no chance) or Proton would be a better software investment at this point. The issue there is that Microsoft probably wouldn't want to facilitate DirectX anywhere other than Windows, and Steam is doing all the running on Proton, so why should Apple bother? That's from Apple's perspective.

I game on Mac a lot and the situation is a real head-scratcher.
 
Don't hold your breath. I imagine that for Apple, gaming on a Mac is a bit of a chicken-and-egg scenario. Apple might be willing to support non-Metal APIs if there were more evidence of gaming being a significant and relevant Mac market, and for gaming to become a significant and relevant Mac market, there needs to be wider support of non-Metal APIs. I think that DirectX (no chance) or Proton would be a better software investment at this point. The issue there is that Microsoft probably wouldn't want to facilitate DirectX anywhere other than Windows, and Steam is doing all the running on Proton, so why should Apple bother? That's from Apple's perspective.

I game on Mac a lot and the situation is a real head-scratcher.
Totally agree. I’ve long given up on gaming on my Macs.

Here’s hoping that Valve can shake things up and challenge the supremacy of DX.
 
Totally agree. I’ve long given up on gaming on my Macs.

Mostly what I play on a computer (not console) are RTS and grand strategy games; fortunately Paradox and Creative Assembly have released almost everything on Mac. Stellaris is easily my biggest time sink, with Crusader Kings (II, now III) and Total War close behind. Also well supported are games like EU IV, and sim-like games like Cities: Skylines. Then there are endless variations of these genres, like Rimworld, Northgard, Prison Architect, The Long Dark, Parkitect, Surviving Mars, The Universe, Wasteland. It feels rare to find something that doesn't have a decent Mac version, but I have an ASUS Zephyrus G14 for those few occasions. And for most of those it's not about the graphics API; it's because they use .NET or some weird engine they developed themselves, like They Are Billions.
 
Don't hold your breath. I imagine that for Apple, gaming on a Mac is a bit of a chicken-and-egg scenario.
I think it's more a question of how many scraps they want to leave.

If Apple released an M1 Pro/Max loss-leader Apple TV with console marketing, they could push adoption of Metal much faster.
 
I just received my Mac Studio base model, so I’m hoping it will run my Dell UP3218K.
Do you have a Thunderbolt 4 to DP adapter? I know the DP 1.4 protocol supports 8K resolution; however, I can't find any documentation on Apple's site regarding support for individual monitors with resolutions beyond 6K... The only reason I can think of that it wouldn't work is some software limitation.
 
Swedish article on the Studio M1 Ultra

https://www.sweclockers.com/nyhet/3...bwtLe9QL3rdBNUq5Ptdzo1sF6n3KGPatRBuaHLiKO3-Xc

It is in Swedish, but the test graphs should be understandable.
Looking at the Geekbench and Tomb Raider scores (which is a pretty thin comparison, granted), the Max appears to perform in the same space as the 3060 Ti. The 3060 Ti also comes at a whopping power discount compared to the 3090, easily a third less or even better depending on the situation. And again, if we're talking about video transcoding, the power difference becomes even more remarkable, as the main compute of the GPU itself doesn't have to spin up for the NVENC stuff to be active.
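
For the power point, some back-of-the-envelope arithmetic using Nvidia's nominal board-power figures (spec-sheet numbers; actual draw varies by workload):

# Nominal total board power per Nvidia's specs; real-world draw varies by load.
RTX_3090_TBP_W = 350
RTX_3060_TI_TBP_W = 200

saving = (RTX_3090_TBP_W - RTX_3060_TI_TBP_W) / RTX_3090_TBP_W
print(f"{saving:.0%} lower board power")  # ~43%, i.e. "a third less or even better"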
 
Do you have a Thunderbolt 4 to DP adapter? I know the DP 1.4 protocol supports 8K resolution; however, I can't find any documentation on Apple's site regarding support for individual monitors with resolutions beyond 6K... The only reason I can think of that it wouldn't work is some software limitation.

I do, but it uses two cables, each driving 3840x4320@60Hz. Technically they’re USB-C to DP cables.

The monitor doesn’t support DSC, so the alternative is 7680x4320@30Hz over a single cable. At least it could do that on my older Mac Pro with a Vega FE.
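
Some rough bandwidth arithmetic (assuming 8 bits per colour and ignoring blanking overhead) shows why it works out that way without DSC:

# Back-of-the-envelope DisplayPort 1.4 check, no DSC. Assumes 24 bits per pixel
# and ignores blanking overhead, so real requirements are slightly higher.

DP14_PAYLOAD_GBPS = 32.4 * 8 / 10  # 4 lanes x 8.1 Gbit/s, minus 8b/10b overhead ≈ 25.92

def video_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

print(video_gbps(3840, 4320, 60))  # ~23.9 Gbit/s per half of the panel -> one cable each
print(video_gbps(7680, 4320, 60))  # ~47.8 Gbit/s -> too much for a single DP 1.4 link
print(video_gbps(7680, 4320, 30))  # ~23.9 Gbit/s -> the whole panel fits one cable at 30 Hz
print(DP14_PAYLOAD_GBPS)           # ~25.92 Gbit/s usable per DP 1.4 link

So 8K@60 needs either two links or DSC, and since the UP3218K has no DSC, a single cable tops out at 30 Hz.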
 
Looking at the Geekbench and Tomb Raider scores (which is a pretty thin comparison, granted), the Max appears to perform in the same space as the 3060 Ti. The 3060 Ti also comes at a whopping power discount compared to the 3090, easily a third less or even better depending on the situation. And again, if we're talking about video transcoding, the power difference becomes even more remarkable, as the main compute of the GPU itself doesn't have to spin up for the NVENC stuff to be active.
Tomb Raider runs under Rosetta 2, so not sure there is a viable reason to compare it to Windows performance.
 
Tomb Raider runs under Rosetta 2, so not sure there is a viable reason to compare it to Windows performance.
Yeah, that's a pretty crap comparison.

I really don't know how useful Geekbench is between the Mac and Windows platforms, either.
 
I really don't know how useful Geekbench is between the Mac and Windows platforms, either.
Geekbench uses Metal, so it's the best showcase for Apple SoCs right now. Even so, the M1 Ultra scales badly compared to the M1 Max, achieving only 41% better performance, and it gets beaten by the 3070 on CUDA, which managed to be 50% faster than the M1 Ultra.

https://arstechnica.com/gadgets/2022/03/mac-studio-review-a-nearly-perfect-workhorse-mac/2/#h1

Also, in Blender "without RT", the RTX 3090 is 4.9 times faster than the M1 Ultra. The scaling from M1 Max to Ultra is only 60%.
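
Taking those figures at face value, the scaling efficiency of the doubled-up die works out to something like this (just arithmetic on the numbers quoted above; the Ultra has twice the GPU of a Max, so perfect scaling would be 2.0x):

# Scaling efficiency of M1 Ultra vs M1 Max, using the figures quoted above.
# Perfect scaling for a doubled GPU would be a 2.0x speedup.

results = {
    "Geekbench Metal": 1.41,   # Ultra is 41% faster than Max
    "Blender (no RT)": 1.60,   # Ultra is 60% faster than Max
}

for name, speedup in results.items():
    print(f"{name}: {speedup / 2.0:.0%} of ideal 2x scaling")
# Geekbench Metal: ~70% of ideal
# Blender (no RT): 80% of ideal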

 