Apple (PowerVR) TBDR GPU-architecture speculation thread

Apple has been fairly active in taking advantage of the GPU as a computational resource, but they have also been adding closely coupled coprocessors to their SoCs to address particular needs. It’s one of the areas of interest in their transition to their own silicon. Will they simply make wider SoCs for the Macs, or will they add new functionality that doesn’t fit the constraints and use cases of phones? Nobody knows.
 
I know, I'm very naive, but Apple will never release an Apple Silicon based Mac (iMac, MacBook/Pro, Mac mini, etc.) that is slower in games than Intel-based Macs. OK, it's not hard to beat a Mac mini with Intel UHD Graphics 630, but newer Macs have Intel Iris Plus Graphics or dedicated Polaris/Vega/Navi graphics cards.
 
I fully expect Apple to eventually join the dark side (immediate mode) like the others (AMD/Intel/Nvidia) did a decade ago if they care about high-end next-generation graphics, unless they're totally content with iOS-quality graphics or with this soon-to-be-last-generation quality of graphics. Before this generation started, desktop vendors found out that running duplicated vertex shader invocations and sorting primitives per tile wasn't a sane idea for their use case; it's still the case for the current generation and likely for the next generation as well, judging by UE5's demo of micropolygon rendering ...

Most developers also rely on the common patterns and behaviour of immediate-mode GPUs, so AAA games are still going to be optimized around them. Apple doesn't have a choice, since consoles and all desktop vendors have converged on this path. High-end GPU designs are practically rigged against tile-based GPUs because many developers are biased against them too ...
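For readers unfamiliar with the "sorting primitives per-tile" step being criticized above, here is a toy Python sketch of the tile-binning stage in a tile-based GPU. The tile size, data layout, and conservative bounding-box test are illustrative assumptions, not any vendor's actual scheme; the point it demonstrates is that a primitive touching several tiles gets recorded (and later revisited) once per tile, the duplication immediate-mode designs avoid:

```python
# Toy sketch of tile binning (illustrative only, not a real GPU's algorithm).
# The screen is divided into fixed tiles; each triangle is appended to every
# tile its bounding box overlaps, so primitive work is revisited per tile.

TILE = 32  # assumed tile size in pixels

def bin_triangles(triangles, width, height):
    """triangles: list of ((x0,y0),(x1,y1),(x2,y2)) in pixel coordinates."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for tri in triangles:
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Conservative bounding-box overlap test, clamped to the screen.
        tx0 = max(0, int(min(xs)) // TILE)
        tx1 = min(tiles_x - 1, int(max(xs)) // TILE)
        ty0 = max(0, int(min(ys)) // TILE)
        ty1 = min(tiles_y - 1, int(max(ys)) // TILE)
        for ty in range(ty0, ty1 + 1):
            for tx in range(tx0, tx1 + 1):
                bins[(tx, ty)].append(tri)  # same primitive stored per tile
    return bins

bins = bin_triangles([((0, 0), (60, 0), (0, 60))], 128, 128)
# One triangle lands in all four tiles its bounding box covers.
print(sum(len(v) for v in bins.values()))  # -> 4
```

A real binner would clip against the triangle's edges rather than just its bounding box, but the duplication across tiles remains either way.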
 
Unreal is going there and that means a significant chunk of the game industry is.
Not a chunk that necessarily contains Apple/iOS, but I suspect Unreal Engine will want to remain viable middleware on that platform. It represents, as you put it, a fair chunk of gaming revenue, and at the end of the day the gaming industry is a business. Publishers decide these things, not tech.
From now on Apple aligns, or rather co-evolves, their macOS graphics with their mobile graphics, whatever that may mean in the future.
 
Is it likely that Apple intends to switch over to internal silicon for GPUs across the board, though? I can definitely see them leveraging their own designs for iGPUs, which all Apple products will ship with, and extending that design as far as is practical, maybe reaching a decent low- to mid-range more or less unchanged. It'd certainly maximise their ROI. But at some point optimising for higher performance could very likely necessitate an offshoot design with a consequently minimal ROI. Of course Apple is wont to do as Apple is wont to do; maybe they'll just swallow that cost thanks to the higher margins elsewhere. Still, I see it as a bit of a wasted effort on their part when the likes of AMD can offer a ready solution for higher-power dGPU needs for, presumably, far less.
 
Is it likely that Apple intends to switch over to internal silicon for GPUs across the board, though? I can definitely see them leveraging their own designs for iGPUs, which all Apple products will ship with, and extending that design as far as is practical, maybe reaching a decent low- to mid-range more or less unchanged. It'd certainly maximise their ROI. But at some point optimising for higher performance could very likely necessitate an offshoot design with a consequently minimal ROI. Of course Apple is wont to do as Apple is wont to do; maybe they'll just swallow that cost thanks to the higher margins elsewhere. Still, I see it as a bit of a wasted effort on their part when the likes of AMD can offer a ready solution for higher-power dGPU needs for, presumably, far less.

Does Apple really need anything but an iGPU? If we look at something like the next Xbox, Apple could likely create a massively faster SoC on 5nm. Even on 7nm Apple could likely double the Xbox's performance by doubling chip size and allowing a higher power budget (better clocks). This hypothetical 7nm product would be a 16-core/32-thread CPU and a 25+ TFLOPS GPU. This is of course assuming Apple CPU and GPU designs have roughly similar transistor count per unit of performance as AMD's SoCs. Might be a good or bad assumption.

If the Mac Pro model requires more power, then stuff more than one of those SoCs onto the motherboard and call it done. The new Mac Pro already has an expansion board for faster video editing (the optional Afterburner card). Apple could offload some tasks to special chips that come as expansion boards. An iGPU might allow for some great benefits, as communication between CPU and GPU wouldn't need to go over a slow external bus. Only SoC-to-SoC communication would then require PCIe or similar.
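The 25+ TFLOPS figure above can be sanity-checked with the standard peak-FLOPS formula. The Xbox Series X numbers (52 CUs, 64 FP32 lanes per CU, 1.825 GHz, two ops per lane per clock via FMA) are public; the simple 2x scaling factor is the post's own assumption:

```python
# Sanity check of the "double the Xbox perf -> 25+ TFLOPS" estimate.
# Xbox Series X GPU figures are public; the 2x scaling is the post's assumption.

def tflops(compute_units, lanes_per_cu, clock_ghz, ops_per_lane=2):
    # ops_per_lane=2 counts a fused multiply-add as two FP32 operations.
    return compute_units * lanes_per_cu * ops_per_lane * clock_ghz / 1000.0

xbox = tflops(52, 64, 1.825)        # ~12.1 TFLOPS FP32
doubled = 2 * xbox                  # the post's doubled-die scenario
print(round(xbox, 1), round(doubled, 1))  # -> 12.1 24.3
```

So "25+ TFLOPS" lines up with roughly doubling the Series X, under the stated (and debatable) assumption that perf scales linearly with die area and clocks.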
 
Does Apple really need anything but an iGPU? If we look at something like the next Xbox, Apple could likely create a massively faster SoC on 5nm. Even on 7nm Apple could likely double the Xbox's performance by doubling chip size and allowing a higher power budget (better clocks). This hypothetical 7nm product would be a 16-core/32-thread CPU and a 25+ TFLOPS GPU.

In some cases I think they will be forced to at least offer performance options, as they do today. And for the high end I remain sceptical of their ability to do so, for the simple reason that what you're suggesting (and you're certainly not alone in suggesting it) as a performance-scaling option hasn't been credibly done by anyone with an existing low-end architecture to produce a high-performance GPU. And I fail to see how Apple would magically make that approach work. It seems to me that designing dedicated high-performance graphics silicon is a resource-intensive, expensive, and difficult undertaking regardless of prior work.

As you allude to, it certainly raises the question of whether or not Apple needs or wants to compete on the same playing field as the rest of the GPU-producing industry, and whether the software industry sees merit in supporting whatever way forward they choose. But that's sort of a larger question I think is even more difficult for anyone but Apple to answer. And maybe not even them.
 
I would expect Apple to be different from other companies, as they control the OS, software, and hardware. It's a different business if you only build one or the other and have to get the other half from the open market.

If Apple sees video editing as being important, they will just use the Afterburner card instead of the CPU/GPU (they already do). If Apple has a need for neural networks, they will likely upsize the IP block they already use in mobile and use that in Mac Pros instead of the GPU. The GPU is great, but Apple is in a position where they can purpose-build solutions, as they control both software and hardware. Apple doesn't necessarily need to repurpose the GPU to do something that a purpose-built accelerator would do better. Depending on what one uses the Mac Pro for, one buys the right expansion cards from Apple.
 
Apple doesn't necessarily need to repurpose the GPU to do something that a purpose-built accelerator would do better. Depending on what one uses the Mac Pro for, one buys the right expansion cards from Apple.

I'm not arguing that that's not true - and this line of discussion is veering away from the more technical topic and into corporate politics - but Apple would do well to make them either an obvious alternative around which to build a production pipeline (which is a massive investment), or edge themselves into existing ones by being better than the alternative and somewhat compatible with the rest of the pipeline. I'm not sure they are at a point where an all-Apple ecosystem would provide such value. Yet. Especially not to such a substantial market that software providers would automatically jump on board. Without them, Apple would have to rely entirely on their own products for that sales pitch, making it much harder in my opinion. But again, this is both speculation and off-topic.
 
They don't control anything 3rd party.
3rd-party stuff, which is necessary to make AAPL toys into something besides the world's most expensive lineup of Facebook machines.

Control is the wrong word, my mistake. I meant control only in the context of Apple hardware, Apple OS, and Apple applications.

An example of Apple's power reaching out is the usage of these Afterburner cards for video editing on the new Mac Pros. No need to put in a GPU for that use case.

https://www.apple.com/shop/product/MW682AM/A/apple-afterburner-card

Another example of Apple's reach outside is that developers have to use the Metal API to get stuff done. Apple can hide their special hardware behind their own Metal API and not worry about Vulkan, OpenCL, DX, ...
 
Unreal is going there and that means a significant chunk of the game industry is.

Doesn't mean anything, since Unreal Engine has "low-end" builds with fewer features. Until Apple's in-house GPUs are capable of running Lumen/Nanite or the other advanced features included in the console/PC backends, they are not worth entertaining for usage in high-end graphics ...
 
Example of apple power reaching out is the usage of these afterburner cards for video editing on new mac pro's.
Only for a very specific AAPL-approved codec the likes of RED slaves don't even touch.
tl;dr Apple needs a proper not-a-joke GPU.
 
What I was trying to claim is that it's possible to build a very fast 5nm SoC, but one must be willing to make it big and give it good cooling and power delivery. It's completely possible to supplement such a SoC with special accelerators and/or insert 8+ SoCs into one machine. Apple could do their own AI/video-editing/... accelerators if, from Apple's point of view, they are better than trying to stuff everything into the GPU. Apple could also bring in a bigger GPU as an accelerator.

When one only has a hammer, everything looks like a nail. Apple has the luxury of using more tools than just a hammer. It's going to be interesting to see what they do, considering they are not bound by constraints like other companies are. Apple already has Adobe & co. on board: whatever approach Apple takes will be supported. One thing not yet mentioned is chiplets. Who knows if Apple is doing SoCs, CPU + dGPU, or possibly chiplets with CPU/GPU/... dies in them.

Why I'm thinking SoCs is that they would give nice scaling from the smallest Apple device to the largest. In essence, software development is very similar whether it's an iPhone, tablet, laptop, iMac, or Mac Pro, and the accelerators are where divergence could be introduced. A bigger GPU could be one of those accelerators. Software cost is giant and will always continue to be there; having a solution that makes software cost less is very valuable. Hardware design cost might also be lower, as the design would be a "more of the same in a bigger chip" type of problem.
 
What I was trying to claim is that it's possible to build a very fast 5nm SoC, but one must be willing to make it big and give it good cooling and power delivery. It's completely possible to supplement such a SoC with special accelerators and/or insert 8+ SoCs into one machine. Apple could do their own AI/video-editing/... accelerators if, from Apple's point of view, they are better than trying to stuff everything into the GPU. Apple could also bring in a bigger GPU as an accelerator.

When one only has a hammer, everything looks like a nail. Apple has the luxury of using more tools than just a hammer. It's going to be interesting to see what they do, considering they are not bound by constraints like other companies are. Apple already has Adobe & co. on board: whatever approach Apple takes will be supported. One thing not yet mentioned is chiplets. Who knows if Apple is doing SoCs, CPU + dGPU, or possibly chiplets with CPU/GPU/... dies in them.

Why I'm thinking SoCs is that they would give nice scaling from the smallest Apple device to the largest. In essence, software development is very similar whether it's an iPhone, tablet, laptop, iMac, or Mac Pro, and the accelerators are where divergence could be introduced. A bigger GPU could be one of those accelerators. Software cost is giant and will always continue to be there; having a solution that makes software cost less is very valuable. Hardware design cost might also be lower, as the design would be a "more of the same in a bigger chip" type of problem.

They already sell a plug-in video accelerator for the new Mac Pro. Right now it only works for ProRes RAW, but apparently other formats like REDCODE RAW are being added.

I'm not actually sure what other accelerators they'd even add, though. Go too deep down the trade-off pipeline and you get stuck in a narrow hole where the fancy hardware you just bought only works for one exact program that works one exact way, and you can never update how it works or it'll break compatibility.

In fact, with Arcturus coming out, maybe next year, I'm not sure the video accelerator is even that good an idea. Apparently the Radeon Pro Vega II already does pretty well with rendering 8K video, and if Arcturus, say, doubles that performance, then how useful really is the accelerator?
 
In fact, with Arcturus coming out, maybe next year, I'm not sure the video accelerator is even that good an idea. Apparently the Radeon Pro Vega II already does pretty well with rendering 8K video, and if Arcturus, say, doubles that performance, then how useful really is the accelerator?
Arcturus can't do 3D, has no display engines, and is not a GPU at all.
 
MfA may well have a point that Apple (and eventually the industry) may re-imagine the GPU as a more general compute device. It would certainly fit their overall vision for the Mac. On the other hand from now on their GPUs are tied to the evolution in mobile technology, where efficiency is the overriding concern.
It’s not clear to me how that will ultimately play out, but at least in the short term, that means that we will see TBDR graphics outside of mobile space for the first time in 20 years or so. Which, for someone like me who has always had a soft spot for the approach, counts as fun. :)
Macs have never targeted gamers. They have been about making computers more accessible as tools for creators, aiming to remove having to learn computer arcana as a barrier (while, in its second wave, building on a foundation, NeXT, that was designed explicitly for programmers). And while Jobs is dead, and Ive is finally gone, that basic idea of what a Mac is about seems to remain, making even more sense now that their iPhones, iPads, and Apple TVs better serve media consumption and anyone who doesn't want to care about "file systems".

Gaming is a lot more than the AAA games that publishers put out on consoles, and sometimes port to PCs. Apple won’t chase those, they will come if the expected ROI for the publishers is sufficient. If they stay off the platform, that just means that the money Mac or iOS users spend on entertainment will go elsewhere.

For those of us who are interested in the inner workings of computers, Apple's new SoCs are likely to be interesting in terms of memory subsystems, closely attached coprocessors and thus on-chip/package communication, multi-core organisation, and yes, GPU capabilities. It’s an opportunity for something other than just extrapolating PC evolution, and that in itself is, at least for me, exciting.
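To make the efficiency argument for TBDR concrete: with overdraw, an immediate-mode renderer sends every shaded fragment out to the framebuffer in DRAM, while a TBDR resolves each tile in on-chip memory and writes the final result out once. A rough back-of-envelope comparison; the resolution, bytes per pixel, and overdraw factor are assumed round numbers, not measurements:

```python
# Back-of-envelope framebuffer traffic per frame, immediate mode vs TBDR.
# Assumed numbers: 4K frame, 4 bytes/pixel color, average overdraw of 3.
W, H, BPP, OVERDRAW = 3840, 2160, 4, 3

pixels = W * H
# Immediate mode: every shaded fragment writes to DRAM (blending/depth
# traffic is ignored here, which only widens the gap).
imr_bytes = pixels * BPP * OVERDRAW
# TBDR: overdraw is resolved in on-chip tile memory; DRAM sees one write.
tbdr_bytes = pixels * BPP

print(imr_bytes // 2**20, tbdr_bytes // 2**20)  # MiB per frame -> 94 31
```

Multiply by 60+ frames per second and the savings are a meaningful slice of a mobile SoC's memory-bandwidth and power budget, which is one reason the approach has survived in phones.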
 
ImgTech got their RT from buying Caustic back in 2010, and introduced their Wizard stuff (the PowerVR GR6500 was the first, I believe) in 2014.
Ironically, Caustic was founded by ex-Apple folks. I wonder if they have wound up back at the mothership by now. I doubt this is what Apple licensed in this recent round since they never showed interest in the RTRT stuff back when they licensed ImgTech GPU solutions, but who knows?

https://www.viewmagic.com/team

Not really ;)

IMHO the recent license from Apple was merely to cover legalities because they're still using parts of Imagination's GPU IP. My gut feeling tells me that Apple has not licensed Albiorix yet, and if they ever do, it will most likely be a future generation of it which integrates ray tracing into its pipeline.

This could also explain why Imagination/PowerVR was looking again at high-performance GPUs a little while back.

https://semiaccurate.com/2020/04/04/chinese-government-takes-control-of-imagination-technologies/

The link might be misleading. Either way, since Apple, Samsung, Qualcomm, and others these days develop their own or semi-custom GPU IP, GPU IP is completely dead for consumer markets, and there's hardly much left for Imagination to survive on besides the automotive market.

Charlie was probably on the right track with his write-up above in the beginning, but must have lost track somewhere down the line. IMHO China doesn't want to absorb IMG or use it for [insert any absurd conspiracy theory you can think of...], but most likely has something in mind that goes along your chain of thought above.
 
IMHO the recent license from Apple was merely to cover legalities because they're still using parts of Imagination's GPU IP. My gut feeling tells me that Apple has not licensed Albiorix yet, and if they ever do, it will most likely be a future generation of it which integrates ray tracing into its pipeline.
They're unlikely to license any particular microarchitecture, which is what the recent press release talked about given its lack of "multi-use" language; it's probably again a wide-ranging architecture and IP license, an extension of what they've already had for half a decade. Also note the people saying Apple broke up with them: those press releases have been scrubbed from the web, and as far as I know the royalties never stopped flowing (till the new deal). Apple played dumb in regards to the "clean-room GPU" claim, indicating again that it never existed.
Either way since Apple, Samsung, Qualcomm and others develop these days their own or semi custom GPU IP, GPU IP is completely dead for consumer markets and there's hardly much left for Imagination to survive like the automotive market.
There's still a sizeable market outside of those; for example, they're still leading the TV SoC market.
 