Apple M1 SoC

PSman1700

Legend
Didn't know where to put this thread; the current M1 is a mobile SoC, right? Didn't find any existing topic on Apple's M1 series.

Anyway, new LTT video.

 
'Nother M1 review. Conclusion was he liked the M1 more, personally.


Though in tests they were very close, that's not using the 3070, as pointed out in the comments section.

''Haizalio
8 days ago
I love this style of comparison. However, I see that the Blender test @ 13:36 seems unfair.
If someone is using Blender on the ProArt laptop, they would of course render with the RTX 3070 GPU using Optix rather than on their CPU.
According to aggregated data in Blender Open Data's website for the BMW Scene, here's the difference:

M1 Pro - CPU rendering: 211.36s ( ~3m 30s - Tallies with video)
R9 5900HX - CPU rendering: 190.59s (~3m 11s - Tallies with video)
RTX 3070 Laptop - Optix rendering: 17.8s.

That's over 10 times faster for a device $600 cheaper.
This trend is comparable for the Classroom scene. About 10 times faster as well.

Maybe this should have been mentioned? Or maybe I missed something out.
You can check this out yourself at Blender Open Data's website.''
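As a quick sanity check of the commenter's numbers (all taken straight from the quote above), the "over 10 times faster" claim holds up:

# Render times from Blender Open Data, BMW scene, as quoted above.
m1_pro_cpu = 211.36      # seconds, M1 Pro, CPU rendering
r9_5900hx_cpu = 190.59   # seconds, R9 5900HX, CPU rendering
rtx_3070_optix = 17.8    # seconds, RTX 3070 Laptop, OptiX rendering

print(m1_pro_cpu / rtx_3070_optix)     # ~11.9x the M1 Pro's CPU time
print(r9_5900hx_cpu / rtx_3070_optix)  # ~10.7x its own CPU time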
 
So the guy says it's unfair and goes on to compare the x86 PC running on its GPU versus the M1 running on its CPU, even though Blender now runs on the M1 GPU? At least the original comparison is fair, as it's CPU vs CPU.
 
The new iPad Air has the M1 and USB4 (Type-C). What I don't understand: do iPads with USB-C support Alternate Mode for DP or not?

https://www.macrumors.com/2022/03/10/studio-display-1440p-res-4th-gen-ipad-air/
https://www.macrumors.com/2022/03/09/studio-display-incompatible-ipad-air-4th-gen/

Apple says that iPads with USB-C support native DP, but there seems to be some kind of dependence on (USB) bandwidth?
From Macrumors:
"The ‌iPad Pro‌ models supported by the Studio Display feature USB-C with 10Gbps throughput (also known as USB 3.1 Gen 2), whereas the fourth-generation ‌iPad Air‌ and ‌iPad mini‌ 6 include a USB 3.1 Gen 1 5Gbps USB-C connection. This connectivity standard supports a single external display with up to 4K resolution at 30Hz."
 
''So the guy says it's unfair and goes on to compare the x86 PC running on its GPU versus the M1 running on its CPU, even though Blender now runs on the M1 GPU? At least the original comparison is fair, as it's CPU vs CPU.''

Reading it, I believe he is saying that the Windows laptop also has a GPU in it and the total cost of the laptop is $600 less than the Mac. If you buy the cheaper Windows PC, you get the additional graphics card, which runs it 10 times faster than the Mac.
 
Yeah, my head aligns with eastmen's statement here.

If you're just strictly doing CPU benchmarks, then I completely understand the perspective of a purely CPU vs CPU showdown. However, if we're talking about a complete product (which is how I interpret the review: laptop v laptop) then you should be using the full capabilities of the whole product. Said another way: why would someone who paid for a capable GPU for this work decide to NOT use it? There very well could be reasons why, and if they were relevant to this review, then it would've been nice to hear those reasons.

Absent other supporting details, this specific Blender result seems contrived.
 

Yea, so the question is whether this is a CPU review or a laptop review. If it's a CPU review, then it's valid to only compare the CPUs. If it's a laptop review, then it's obvious that the Mac falls behind because it lacks the high-end GPU.
 
What I was saying is that the guy is running CPU+GPU on the Windows laptop, no issue there. But he fails to do the same on the Mac, using only the CPU while there was a Blender version with CPU+GPU support. That's the issue I had, at least if I remember correctly.
 
Agreed. If Blender also provides a GPU-accelerated mode on the Mac, then my statement above should apply equally: why wouldn't someone use a more efficient option if it's available? Honestly it makes the review even more suspect IMO.
 
Agreed. And there was another review that showed good results for the M1 but was using an older Blender version on the Intel laptop. Another suspicious review in the other direction...

I think the Blender version with Metal support is now officially out, so I hope we'll see more fair comparisons in the coming weeks.
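For anyone wanting to rerun the comparison GPU vs GPU, here's a minimal sketch using Blender's Python API (bpy). It assumes Blender 3.1+, the first release with the Metal backend for Cycles, and the .blend file name is just a placeholder:

# Minimal sketch: pin Cycles to the platform's GPU backend before a
# benchmark run, e.g.: blender -b scene.blend -P this_script.py -f 1
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
# Use "METAL" on Apple Silicon, "OPTIX" on NVIDIA RTX hardware.
prefs.compute_device_type = "METAL"
prefs.get_devices()          # refresh the detected device list
for device in prefs.devices:
    device.use = True        # enable every detected compute device

bpy.context.scene.cycles.device = "GPU"  # render on the GPU, not the CPU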
 
''What I was saying is that the guy is running CPU+GPU on the Windows laptop, no issue there. But he fails to do the same on the Mac, using only the CPU while there was a Blender version with CPU+GPU support. That's the issue I had, at least if I remember correctly.''

I dunno, I am just going based on the comments here. Does the Mac also have a GPU that is supported in Blender? If so, then yea, they should compare equally. But the Mac would have to be much faster to justify the $600 price difference to me. Aside from that, I wouldn't buy a laptop for work unless I was doing work while traveling. So spending $3 grand or whatever on a laptop is a huge joke for me, but that is why I don't really comment on laptop threads.
 
''I dunno, I am just going based on the comments here. Does the Mac also have a GPU that is supported in Blender? If so, then yea, they should compare equally.''
Yep, it has. It was in beta back in December and was officially released some days ago.

''But the Mac would have to be much faster to justify the $600 price difference to me. Aside from that, I wouldn't buy a laptop for work unless I was doing work while traveling. So spending $3 grand or whatever on a laptop is a huge joke for me, but that is why I don't really comment on laptop threads.''
I definitely agree! I know people who game on a laptop; I'd never do that either.

That being said, I have no choice but to use a laptop for work when working from home, for security reasons. But I didn't pay for it :)
 
So was this review before or after it was released officially?

Gaming on a laptop is fine if you're traveling. To have one as your main rig is odd to me. Had a friend like that and offered to build him a PC. This was before the current crazy prices, but he was floored by the difference in price and performance. One of his biggest blockers was the size of the case, but I made him a small build and it was the size of a shoebox. Half the cost and almost 3 times the performance.
 
It was before the official release but after the beta release. That's why I wrote above that I was hoping to see more fair comparisons now that the official release is out.

Maybe a new review will come out then. But like I said, you still have $600 of additional price that the performance has to overcome.
 
I'm certainly not disputing that. I just dislike unfair performance comparisons; part of my day job is to help make sure the CPUs we design have good performance, and doing the wrong comparisons is the best way to do a very poor job.
 

I think at the time it was valid. If you're using Blender for your business or job, you're not going to want to use a beta version that could introduce issues. You'd be using an official release. Hopefully they go back and update the review, or do a new review now that the official version is out.
 
''I'm certainly not disputing that. I just dislike unfair performance comparisons; part of my day job is to help make sure the CPUs we design have good performance, and doing the wrong comparisons is the best way to do a very poor job.''

Both platforms/architectures are going to benefit from different software... Benchmarks are quite useless in my eyes though (goes for both sides).
 