Apple is an existential threat to the PC

You must be the last person alive using Objective-C! Also, just for info, there has never been a requirement to use Xcode for macOS/OS X development. And almost every major cross-platform IDE has supported iOS for years. ¯\_(ツ)_/¯

You appear to have made your own life difficult, a bit like trying to eat soup with chopsticks! :LOL:
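
For anyone curious what "no Xcode needed" actually looks like in practice, here's a trivial sketch (Swift rather than Obj-C, and the file name and commands are just my own example), assuming only the command-line tools are installed:

```swift
// hello.swift — minimal sketch, assuming only the command-line tools are
// installed (xcode-select --install), not the Xcode IDE. Build and run with:
//   swiftc hello.swift -o hello && ./hello
// (clang works the same way for a plain Objective-C file.)
import Foundation

print("Running on \(ProcessInfo.processInfo.operatingSystemVersionString)")
```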
This was a long time ago, > 10 years. I minimized my Obj-C stuff to just the bare bones needed to get something running.
From memory, back then Xcode was pretty much required; perhaps in theory you didn't need it if you were willing to jump through a lot of hoops and make your life hard.
 
AnandTech's investigation of the M1 Pro and M1 Max is out.
Worth a read. A surprise or two.

Biggest surprise for me was the GPU never going above 90 GB/s while the CPU goes all the way up to 243 GB/s (whaaaat?!).
Previous Apple SoCs even had half their memory channels exclusive to the GPU, but now the CPU pushes more than twice the GPU's bandwidth?
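
For reference, CPU-side numbers like that are usually probed with a streaming copy test. A rough sketch of the idea (not AnandTech's actual methodology; the 1 GiB buffer size and 10 iterations are arbitrary choices of mine):

```swift
import Foundation

// Rough STREAM-style copy test, just to illustrate how CPU memory bandwidth
// figures are typically measured. Buffer size and iteration count are arbitrary.
let bytes = 1 << 30                                   // 1 GiB per buffer
let src = UnsafeMutableRawPointer.allocate(byteCount: bytes, alignment: 16384)
let dst = UnsafeMutableRawPointer.allocate(byteCount: bytes, alignment: 16384)
memset(src, 1, bytes)                                 // touch pages so they're actually mapped
memset(dst, 0, bytes)

let iterations = 10
let start = Date()
for _ in 0..<iterations {
    memcpy(dst, src, bytes)                           // one read stream + one write stream
}
let elapsed = Date().timeIntervalSince(start)

// Each copy moves 2 * bytes of traffic (read + write).
let gbPerSec = Double(2 * bytes * iterations) / elapsed / 1e9
print(String(format: "~%.1f GB/s copy bandwidth", gbPerSec))

src.deallocate()
dst.deallocate()
```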


@Nebuchadnezzar isn't there a GPU-based ETH miner for the Apple M1 for you to try with the M1 Max? Those are usually some serious GPU bandwidth pushers.
 
AnandTech's investigation of the M1 Pro and M1 Max is out.
Worth a read. A surprise or two.

Interesting comment here, especially considering I saw more than a few "yeah well see how it performs unplugged" comments:

[screenshot of the quoted comment]
 
Very underwhelming gaming performance. You get 3080M performance in synthetic benchmarks for over double the price, and not even half that performance in practice.
 
Synthetics are optimised for TBDR (by Apple) so the M1 Max shines there. If you don’t optimise for the architecture the perf cliffs are very steep. CPU performance is insane though.
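
To make "optimising for TBDR" a bit more concrete: one of the classic wins on Apple GPUs is keeping transient render targets in on-chip tile memory instead of DRAM. A rough Swift/Metal sketch of that pattern (my own illustrative function, not a claim about what those synthetic benchmarks actually do):

```swift
import Metal

// Sketch of one common TBDR-friendly pattern on Apple GPUs: a transient
// attachment (e.g. a depth buffer used only inside this pass) is kept in
// on-chip tile memory so it never touches DRAM bandwidth.
func makeTransientDepthPass(device: MTLDevice,
                            colorTarget: MTLTexture) -> MTLRenderPassDescriptor? {
    let depthDesc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .depth32Float,
        width: colorTarget.width,
        height: colorTarget.height,
        mipmapped: false)
    depthDesc.usage = .renderTarget
    depthDesc.storageMode = .memoryless          // tile memory only, no DRAM backing

    guard let depthTexture = device.makeTexture(descriptor: depthDesc) else { return nil }

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = colorTarget
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].storeAction = .store

    pass.depthAttachment.texture = depthTexture
    pass.depthAttachment.loadAction = .clear
    pass.depthAttachment.storeAction = .dontCare  // required for memoryless targets
    pass.depthAttachment.clearDepth = 1.0
    return pass
}
```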
 
That's the thing I can't understand: the M1 has been out nearly a year. Surely there's at least one game title built for it? (OK, I answered this later.)
I could understand it before with x86, as you could always dual-boot into Windows to play games, but now you can't; you have to run them in an emulator, which is gonna give an inaccurate picture of the machine's performance.

found this
https://www.applegamingwiki.com/wiki/Home

Can't AnandTech or someone do some benchmarking with some of these games? Native vs Rosetta vs Windows.
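
If someone does run that comparison, one sanity check worth doing is confirming whether the process is actually running natively or being translated. A small Swift sketch using Apple's documented "sysctl.proc_translated" key (the helper name is my own):

```swift
import Foundation

// True if the current process is being translated by Rosetta 2.
// Uses Apple's documented "sysctl.proc_translated" sysctl; the key doesn't
// exist on Intel Macs or pre-Rosetta systems, in which case this returns false.
func isTranslatedByRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let ok = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0
    return ok && translated == 1
}

print(isTranslatedByRosetta() ? "Running under Rosetta 2 (x86_64 translated)"
                              : "Running natively")
```

For a universal binary you can also force the x86_64 slice with `arch -x86_64 <command>` when comparing, as far as I know.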

Keep in mind that whatever performance they show with native M1 titles will most likely see large gains in the future, as it's a new architecture, unlike x86, which already has years of improvements behind it.
 
I believe Divinity Original Sin 2 is optimised for the M1, would be an interesting comparison.
 
I believe Divinity Original Sin 2 is optimised for the M1, would be an interesting comparison.
Yes, it would be a good game, but according to that page I linked in my last post it's not native, only Rosetta. That's interesting in itself, but it's like towing a trailer with an F1 car: it gives you an absolute minimum speed, but it's not gonna tell you how fast the car will go without the trailer.
 
World of Warcraft is native on M1 and it has reasonable complexity of models and scenes, although it's probably not the easiest game to benchmark.
I ran World of Warcraft on my M1 MacBook Pro once, just to test whether it works, and it ran reasonably well.
 
The games are all running under x86 emulation. Until macOS games get native M1 support you won't get great gaming performance. MacBooks really aren't gaming devices.

True, and reading the comments section on the AnandTech article, it really seems AT has a strong bias toward Apple (or at least Andrei, as many say). These new Mac products are great for content creators and the like (just like the previous Macs were), with great performance for certain tasks.
 
True, and reading the comments section on the AnandTech article, it really seems AT has a strong bias toward Apple (or at least Andrei, as many say). These new Mac products are great for content creators and the like (just like the previous Macs were), with great performance for certain tasks.
I hate iPhones and I don't use any Mac whatsoever, so the Apple bias claim is always weird. The silicon's performance and characteristics speak for themselves; I find it rather strange that people dismiss what Apple is doing here, and the industry is very much in agreement.
 
I hate iPhones and I don't use any Mac whatsoever, so the Apple bias claim is always weird. The silicon's performance and characteristics speak for themselves; I find it rather strange that people dismiss what Apple is doing here, and the industry is very much in agreement.

I like iPhones/iPads; they have the best aviation apps and the longest software support without the need and hassle of resorting to custom ROMs. Yes, they're very restricted, but for a phone that doesn't really matter to me. Apple PCs, on the other hand, while I don't hate them, are nothing I would use over a Windows PC/laptop. It seems the general public thinks the same: iPhones are immensely popular, Macs not so much.

As for people dismissing what Apple is doing, I'm not one of them, at least; I think what they achieve in synthetic benchmarks and special use cases (video encoding/decoding, streaming, etc.) is very impressive. But as per this topic (Apple PCs taking over Windows PCs), I just don't think it's all that impressive in a different light: you pay 3, what, 4 times more to equal a Windows laptop's performance? Probably 5 times. And then you're matching 3080M laptop performance in benchmarks. In real-world use, why would the average Joe/Windows user opt for an M1 Pro/Max Mac other than lower power draw?
These Macs are for people who have DSLR cameras and software as pricey as the M1 Max machine itself, and for them it's a nice upgrade from the Intel Macs.

On the performance: yeah, CPU/GPU performance is a huge leap from Intel/AMD/NV, and these notebooks perform much better than a PS5/XSX. They're also on a bleeding-edge 5nm node and an entirely different architecture. As many comment on the AT article, it's not magic ;)
 
On the performance: yeah, CPU/GPU performance is a huge leap from Intel/AMD/NV, and these notebooks perform much better than a PS5/XSX. They're also on a bleeding-edge 5nm node and an entirely different architecture. As many comment on the AT article, it's not magic ;)
Nothing is ever magic. But thinking the advanced node is the only reason why Apple lead is so large is a serious mistake.
 
Nothing is ever magic. But thinking the advanced node is the only reason why Apple lead is so large is a serious mistake.

Obviously not the only reason, and they do have industry-leading engineers, but especially with the M1 Max we're now seeing how, the more parallel execution units you have, the more complicated, power-consuming and area-expensive it gets to develop a fabric that maintains the necessary internal bandwidth and coherence to make things scale up.
At some point many people thought, "if the M1 can do this, then a bigger M1 will break the market and bring alien technology leaps". Well, the bigger M1 is here, and other than Apple's very focused workloads (where the GPU is basically treated as an FP coprocessor), it's really not as perfect as some thought, despite being a >400mm^2 5nm behemoth clocked at what is probably the best possible point for an ideal power-performance ratio and using many channels of the most expensive smartphone memory out there.
 