Apple is an existential threat to the PC


His old Intel Mac Pro was faster than the new M1 Ultra. When it came to ProRes, the M1 was faster, and as the reviewer notes, that's due to the media accelerators. Thanks for the review mkbhd!
A Mac Pro will come and could double the Ultra's performance; I wonder what the price will be. By then it would certainly be competing with pro-grade Intel/NVIDIA etc. hardware.
 
For video encoding I use Handbrake, for no reason other than that I know how to use it and that it supports Apple's hardware encoders, general GPU encoders, any eGPU hardware encoders, and of course the CPU. Any device that can rub two bits together, Handbrake can use to encode video; it's built on open-source libraries and has great command-line functionality. :yes:
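
For anyone curious what that command-line use looks like, here is a minimal sketch driving HandBrakeCLI from Python. The file names are made up; `vt_h265` is HandBrake's name for Apple's VideoToolbox hardware encoder on macOS, and `x265` would select the open-source CPU encoder instead.

```python
import subprocess

def encode(src: str, dst: str, encoder: str = "vt_h265", quality: int = 28) -> None:
    """Encode a video with HandBrakeCLI at a constant quality level.

    `vt_h265` uses Apple's hardware encoder; pass `x265` to fall back
    to CPU encoding via the open-source library.
    """
    subprocess.run(
        ["HandBrakeCLI", "-i", src, "-o", dst, "-e", encoder, "-q", str(quality)],
        check=True,
    )

# Hypothetical file names, purely for illustration.
encode("input.mov", "output.mp4")
```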

Do people who do a lot of video encoding buy expensive hardware?

Do they upgrade when new codecs (like AV1) become more common?
 
Geekbench uses Metal, so it's the best showcase for Apple SoCs right now. Even so, the M1 Ultra scales badly compared to the M1 Max, achieving only 41% better performance, and it gets beaten by the 3070 on CUDA, which manages to be 50% faster than the M1 Ultra.

https://arstechnica.com/gadgets/2022/03/mac-studio-review-a-nearly-perfect-workhorse-mac/2/#h1

Also, in Blender "without RT", the RTX 3090 is 4.9 times faster than the M1 Ultra. The scaling from M1 Max to Ultra is only 60%.
Has there been a study comparing how AMD or NVIDIA GPUs scale in these benchmarks?
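
To put those scaling numbers on a common footing: the M1 Ultra has twice the GPU cores of the M1 Max, so perfect scaling would be a 2.0x speedup. A tiny Python sketch of the efficiency arithmetic implied by the figures above:

```python
# Scaling efficiency = measured speedup / ideal speedup.
# The M1 Ultra has 2x the GPU cores of the M1 Max, so the ideal is 2.0x.
def scaling_efficiency(speedup: float, ideal: float = 2.0) -> float:
    return speedup / ideal

print(scaling_efficiency(1.41))  # Geekbench Metal: +41% -> ~0.71 (71% efficiency)
print(scaling_efficiency(1.60))  # Blender without RT: +60% -> 0.80 (80% efficiency)
```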
 
Provided render scenes fit into the VRAM of a single GPU, Blender will scale at practically 100% across multiple GPUs. Most renderers will.
Note I'm not talking specifically about renderers, or about multiple GPUs each with their own VRAM, but rather about performance vs. the number of compute units on the same card.

I'm wondering what prevents scaling on the M1 Ultra. Poor drivers? Blender being badly optimized? Some HW limitation? RAM bandwidth seems to be plenty.
 
Multiple GPUs have their own memory pools; the Ultra does not. Also, even with TB/s of interconnect, those GPU cores are still more separated, with higher latency. If Apple tries to get smart and present the two dies as one GPU, it could simply scale worse than letting Blender treat them as two GPUs.

Frame-parallel rendering for offline renderers that don't try to reuse previous frames is so simple it's going to be hard to beat.
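
A minimal sketch of that frame-parallel idea, with a hypothetical `render_frame` stand-in: because each frame is rendered independently, one worker per GPU scales nearly linearly with no cross-device communication.

```python
from multiprocessing import Pool

NUM_GPUS = 2  # one worker process per GPU

def render_frame(frame: int):
    # Hypothetical: a real renderer would bind this worker's GPU,
    # e.g. device = frame % NUM_GPUS, and render the frame there.
    return frame  # stand-in for the rendered image

if __name__ == "__main__":
    with Pool(processes=NUM_GPUS) as pool:
        # Frames are independent, so there is nothing to synchronise:
        # each worker just pulls the next frame number off the queue.
        images = pool.map(render_frame, range(240))
```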
 
That's what I thought, and it's why I'm asking whether there's a single-GPU scaling analysis somewhere. If not, I'll try to put something together based on available benchmark results.
 
Thing is, the 3090 doesn't have to scale; a single card outperforms the Ultra by quite a large margin. In the even higher range, there are other products available (A100).
 
Well it is limited by its VRAM size, so there is that.
It's such an incredibly painful bottleneck when you hit VRAM limitations. If you don't have a library that can do batched work or something equivalent to reduce VRAM pressure, you're going to have to do the job differently. Such a PITA.
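
A sketch of the batching workaround being described, with made-up sizes: estimate a per-item footprint, split the work into chunks that fit the VRAM budget, and process them one chunk at a time.

```python
def batches(items, item_bytes: int, vram_budget_bytes: int):
    """Yield chunks of `items` whose combined footprint fits in VRAM.

    All sizes here are illustrative; real estimates would come from
    the library in use or from profiling.
    """
    per_batch = max(1, vram_budget_bytes // item_bytes)
    for i in range(0, len(items), per_batch):
        yield items[i : i + per_batch]

# e.g. 10,000 work items at ~48 MB each against a 24 GB card:
for batch in batches(list(range(10_000)), 48 * 2**20, 24 * 2**30):
    pass  # process(batch) on the GPU, one chunk at a time
```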
 
Blender is just generally badly optimised for macOS. Apple first joined the Blender Development Fund 6 months ago.

Ignore the clickbait title but this is an interesting look:

Tl;dw: Big GPU utilisation issues. Apple rates the GPU at about 100-110 W, but in the benchmarks where the M1 Ultra woefully underperforms, GPU utilisation is low (about 20-30%), the GPU consumes only about 30-40 W, and it is often underclocking. In apps and benchmarks where performance does scale, the M1 Ultra consumes its full 100-110 W power budget and runs at high clocks.
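
If you want to sanity-check those numbers on your own Apple Silicon Mac, macOS ships a `powermetrics` tool that can sample GPU power and frequency (it needs root, and the exact output format varies by macOS release, so treat this as a sketch):

```python
import subprocess

# Take one sample of the GPU power/frequency counters.
out = subprocess.run(
    ["sudo", "powermetrics", "--samplers", "gpu_power", "-n", "1"],
    capture_output=True, text=True, check=True,
).stdout

# Crude filter; the report is plain text and its layout varies by release.
for line in out.splitlines():
    if "GPU" in line:
        print(line.strip())
```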

Pretty strong sign that apps and benchmarks need to be written to scale up with Apple's scalable GPU architecture. A familiar tale: Performance is being hampered by poor software. Thankfully, software can be updated and rewritten.

It also fits Andrei F.'s description of why Geekbench failed to scale across the M1/Pro/Max. No surprise that the scores didn't scale for the Ultra either.


EDIT: I tried listening to that LTT video, but had to stop after it became way too cringe. Listening to Linus trying to understand Apple and its decisions is about as pleasant as nails on a chalkboard.

"[Apple] puts real work and money into makings things worse and it makes me angry."

This man is either an absolute f***ing moron, or he is playing his audience along masterfully.
 
This man is either an absolute f***ing moron, or he is playing his audience along masterfully.
Linus plays to his audience; the only time he ever backs down and admits he was wrong is when his own audience pushes back. His videos are popular because he can feign outrage or disappointment about something, so when he has to cover products where he can't do that, he'll conjure something else. It doesn't matter if it's accurate or not.
 
Curious whether applications need to be updated or whether a system update will come out and fix the issues. If it requires explicit development on the application side, that'll be a problem. Also curious whether there's a new Xcode SDK version for the Mac Studio.
 
What kind of apps are going to maximize utilization of all those GPU cores? I don't have a strong GPU in my Intel iMac, but I saw little or no difference with Lightroom.

Also, single-core CPU performance isn't much better than even a standard M1?
 
That's all you had to say. There's just something about all of their videos that is extremely off-putting to me.

It's made for a YouTube audience, and I was never into Linus's stuff. I have to be really into the subject matter to bother (like the Steam Deck). But they aren't wrong about everything they said.
 
Curious whether applications need to be updated or whether a system update will come out and fix the issues. If it requires explicit development on the application side, that'll be a problem. Also curious whether there's a new Xcode SDK version for the Mac Studio.

Apple’s own Final Cut Pro X benchmarks for the M1 Ultra are based on an unreleased beta build. :LOL: Bizarre.
This, combined with the undercooked software for the Studio Display, makes me believe that Apple released these too early.

What kind of apps are going to maximize utilization of all those GPU cores? I don't have a strong GPU in my Intel iMac, but I saw little or no difference with Lightroom.

Also, single-core CPU performance isn't much better than even a standard M1?

I recall that even the DaVinci Resolve devs said they're only just getting started on extracting performance from the M1, and that was after the big Metal update they released recently. A lot more optimisation is yet to come.

The GPU has been neglected in macOS for quite some time, and the deprecation of OpenCL and OpenGL on the platform hasn't exactly helped. Apple's commitment to their own GPU arch and Metal should hopefully be a sign of good things to come.

The M1 and M1 Ultra share the same CPU architecture. I think they even clock the same (3.2 GHz on the P cores, IIRC). So no surprise that ST performance is identical.

It's made for a YouTube audience, and I was never into Linus's stuff. I have to be really into the subject matter to bother (like the Steam Deck). But they aren't wrong about everything they said.

Once in a while they provide really good content, but more often than not their stuff is terrible, gimmicky, and presented in an overly opinionated manner. It's the same problem I have with a lot of newspaper outlets nowadays: I'm looking for a journalistic piece, not an opinion piece.
 