Apple is an existential threat to the PC

Apple really only care about the parts of the industry who will pay, i.e. creators.

Compared to gamers that's truly a tiny user base. They care about the parts of the consumer base which will pay, i.e. normies.

Creators are only useful for the halo effect, but even then there have been long stretches where Apple ignored them entirely ... what are they going to do, not buy Apple?
 

Apple sell more high-end laptops (probably to content creators) than Nvidia sell high-end graphics cards. I guess that's why Apple continue to develop pro software like Logic, Final Cut and Motion, and also why they felt compelled to develop the Afterburner card for the Mac Pro - which the new M1 Macs far exceed in terms of performance.

Pros are willing to spend a lot on hardware and software because they can pass the cost on to their customers.
 
I did some digging on this and clearly Lightroom Classic is not using the resources as well as the newer code base in Lightroom CC.

Plenty of code out there won't be optimised to make the best use of both CPU and GPU for some tasks, because traditionally it hasn't been worth the hit of waiting for data to transfer back and forth between the two RAM pools while those two components do what they do best. Shared RAM pools change that. There was an interesting series of tweets by Affinity Photo's lead on this, because their tech stack is already built to take advantage of it.


You're best off clicking through to the Twitter link and reading the series of tweets, which explains the problem. It'll be interesting to see how long it takes Adobe to adapt their code to leverage what the M1 Pro/Max chips can do.
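For anyone wondering what "shared RAM pools" actually buys you, here's a rough Metal sketch of my own (purely illustrative - not from the Affinity thread, and the buffer size and values are made up). With unified memory, a buffer created as .storageModeShared is one allocation that both the CPU and GPU read directly, so there's no staging copy between system RAM and VRAM before either side can work on it:

```swift
import Metal

// Rough sketch, not production code: one allocation visible to both CPU and GPU.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device found")
}

var samples = [Float](repeating: 1.0, count: 1_000_000)

// .storageModeShared = single RAM pool; on a discrete GPU you'd typically use
// .storageModePrivate and pay for a blit copy from a staging buffer instead.
let buffer = device.makeBuffer(bytes: &samples,
                               length: samples.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU can poke values and a compute kernel bound to `buffer` sees them
// directly; no host-to-device transfer sits between the two.
let ptr = buffer.contents().bindMemory(to: Float.self, capacity: samples.count)
ptr[0] = 42.0
```

That's the change that makes mixed CPU/GPU pipelines (like the ones image editors use) suddenly worth writing - the transfer penalty described above largely disappears.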
 
Imagine if the PS5 was running Apple hardware, or if it was an Apple console. With 200+ watts to play with, and at that size, the performance would have been quite something else.
Of course, for a product that doesn't do specific content creation they could leave out the ProRes etc. accelerators.
 

What would the cost be though? How could you price such a console, or rather, how many cutbacks would be necessary for it to reach a similar cost to a PS5? I would love to see that discussion happen, but I'm not informed enough on these matters to kickstart it myself...
 

If it's Apple, price doesn't matter. Look at other Apple products.
 
In theory Apple could make a very viable gaming console with their existing hardware and software ecosystem. They just won't. The Apple TV will be the closest thing they have to a console, but it will target mobile-style games.

I don't see them as a threat either. But as per the topic, Apple could take over everything if they wanted...

Another comparison video. Quite interesting; really like the Mac's screen and audio capabilities for a laptop. Also impressive is the GE66's SSD/IO performance - didn't know laptops were pushing 7+ GB/s on the NVMe side of things. Both are unplugged during the benchmarks (I think?).

 

I've been 'borrowing' one of work's 14" MBP (M1 Max) laptops at home for the past week and it's a very impressive piece of equipment - speaking as somebody who has long used Macs, so I'm used to the extra effort Apple put into their computers, i.e. providing good quality screens and decent(ish) audio. I'm not sure I'm quite ready to spring that much money on a new laptop for myself, but it feels like very few compromises were made.

Stupid fast, lasts forever, and even things like plugging in an external monitor are just instant. The most intensive work I personally do is transcoding ripped Blu-ray disc movies to H.265 (HEVC), and it's stupid fast compared to my Ryzen 9 5900HS, or any of my desktops, including hardware-accelerated Nvidia 3080 transcodes.

I'm really curious what they can deliver in the next-gen MacBook Air.
 
I'm really curious how the Mac mini could perform as a live-streaming device for OBS. Seems like video encoding is a real strength. Not sure what's available for USB capture cards and whether that would make it feasible.
 
Just use something like the Blackmagic Design ATEM Mini for $295 or the Blackmagic Design ATEM Mini Pro for $495. You can also pair it with an ATEM Streaming Bridge if needed for more locations.

I don't really have a need for it. I just think it's an interesting use case. Streamers build these really insane PCs to use as their capture PCs, because they do CPU encoding for best quality. The M1 devices seem to really excel at video encoding, and I'm wondering if they could actually end up being an economical option with lower power consumption. Have a feeling the issue will strictly be native support for the M1 architecture.
 

I agree - the M1 Max has two dedicated 10-bit HEVC encoders, and if software could leverage those along with the CPU and GPU cores to boost overall encoding performance, it could deliver really bonkers throughput for people who encode a lot of video. Right now I can have Handbrake encode video in the background and really not notice any impact on performance for browsing, Netflix and office tasks - something I've never really been able to do before.
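For what it's worth, those dedicated encoders are reached through VideoToolbox rather than talked to directly, so whether an app benefits comes down to whether it asks for them. A rough Swift sketch of my own (the frame size and Main10 profile are just example values, and this is nowhere near a full encode loop):

```swift
import VideoToolbox

// Rough sketch: request a hardware-accelerated 10-bit HEVC encoder session.
var session: VTCompressionSession?

let encoderSpec: [CFString: Any] = [
    // Ask VideoToolbox for the dedicated media engine rather than a software encoder.
    kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder: true
]

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 3840, height: 2160,          // example frame size
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: encoderSpec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,                // encoded frames would come back via an output handler
    refcon: nil,
    compressionSessionOut: &session
)

if status == noErr, let session = session {
    // Main10 profile = 10-bit HEVC, matching what the hardware encoders support.
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_HEVC_Main10_AutoLevel)
}
```

If I remember right, Handbrake's "H.265 (VideoToolbox)" encoder option goes through the same machinery, which is why hardware passes barely load the CPU - the trade-off being less rate-control flexibility than software x265.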
 
It seems I may be misunderstanding you, but those ATEM devices do multi-camera live streaming on the device itself, directly to your streaming platform of choice.
 

A family member got the M1 Max (32-core) yesterday, and the thing just oozes quality, from the screen, the keyboard and the trackpad to, especially, the audio system - easily the best audio I've ever heard from any laptop. How they got that much audio quality and volume into a small laptop is something other manufacturers really need to take note of. It's a whole-day on-battery PC: you could charge it overnight, take it to work, use it all day, go home and use it again until you charge it overnight once more.
From an energy perspective I can see why companies would invest in these; employees wouldn't even have to bring their chargers in most cases.

I don't do any video encoding or content creation etc., but it's a fast machine, hardly gets warm and is silent most of the time. An impressive machine, and something Intel/AMD/Nvidia etc. really should be working to match.
It's not for me (Windows user, gamer), but for work environments it's a must-have; the larger the company, the larger the savings on energy consumption alone.
The screen, by the way, is a notch brighter than the GE66's (my gaming laptop), not as saturated, but probably more accurate. Used CrystalMark for the first time; they're both ultra fast in I/O speeds.
 

Does it handle all of the overlays that are common for live streaming? Honestly not sure why most streamers use capture cards and encode on the CPU. I'm assuming OBS does things that require re-encoding.

Edit: Yeah, it doesn't seem like streamers could do animated or highly customized overlays. If you feed the ATEM Mini into OBS, then OBS has to re-encode. I actually think a lot of streamers could do without all of the bullshit they throw on screen, but if they want to use StreamElements or something for alerts, they need software like OBS. Assuming the M1 chip gets full support, it could make for a pretty compelling streaming device. There is the problem of audio VSTs as well; I think they need to be recompiled for arm64.
 

It does, but you still need a computer running the control software to load the assets in.
 

It's not like Athena (now Evo) ultrabooks weren't already making the screen the dominant power consumer for office work. Even without EUV, using little power while idling 99% of the time isn't a huge achievement hardware-wise; it just required a central authority to stop all the random fuck-ups piling up in normal PC hardware development.

The only part Intel doesn't have authority over would be Windows, and Windows randomly fucking up and causing power consumption spikes is the biggest problem ... not the hardware.

PS. Microsoft does seem to understand the importance of low power consumption, but I get the feeling they always put most of the blame on third-party apps and rarely look at their own internal and increasing complexity in Windows. They add more and more hurdles to stop third-party stuff from screwing things up in the background, while more and more of their own background processes screw up.
 