New Qualcomm silicon for new Surface May 2024

Surprisingly here, the Adreno X1 uses a rather large wavefront size. Depending on the mode, Qualcomm uses either 64 or 128 lane wide waves, with Qualcomm telling us that they typically use 128-wide wavefronts for 16bit operations such as fragment shaders, while 64-wide wavefronts are used for 32bit operations (e.g. pixel shaders).
You'd think that the author of the article wouldn't make glaring mistakes like this since they've been following graphics technology longer than a good chunk of us here ...
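For what it's worth, the wave width a GPU exposes isn't something you have to take from slides; you can query it. A minimal sketch, assuming a C++ toolchain with Vulkan 1.3 headers and a working loader (and, going by the article, an Adreno X1 driver should report a 64-128 subgroup range here, though I haven't verified that on hardware):

Code:
#include <cstdio>
#include <vulkan/vulkan.h>

int main() {
    // Create a bare Vulkan 1.3 instance (no layers, no extensions).
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_3;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    // Grab the first physical device the loader reports.
    uint32_t count = 1;
    VkPhysicalDevice gpu = VK_NULL_HANDLE;
    vkEnumeratePhysicalDevices(instance, &count, &gpu);
    if (count == 0) return 1;

    // Query the min/max subgroup (wave) size via subgroup size control.
    VkPhysicalDeviceSubgroupSizeControlProperties subgroup{
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SUBGROUP_SIZE_CONTROL_PROPERTIES};
    VkPhysicalDeviceProperties2 props{
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_PROPERTIES_2, &subgroup};
    vkGetPhysicalDeviceProperties2(gpu, &props);

    printf("%s: subgroup size %u-%u\n", props.properties.deviceName,
           subgroup.minSubgroupSize, subgroup.maxSubgroupSize);

    vkDestroyInstance(instance, nullptr);
    return 0;
}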
 
it is basically due to cloud and AI, with their corresponding bubbles; it is something that denotes health in PC gaming
It doesn't matter if it's a bubble or not. The problem is that the money isn't coming from PC, so it means nothing to the health of PC.

You can paint AI over everything and the ARM cores can be fantastic, but WinRT is still a technological disaster and Microsoft still can't fully commit to ditching it. Sure, all the high-flyer teams at Microsoft are allowed to use fluentui webshit, but the penal teams still get forced to futilely dogfood WinUI 3 and make Windows worse for the users. The money doesn't help shit.

At least NVIDIA mostly uses the same cores across product classes, so PC piggybacks there. But there are some perverse incentives there too. Are they going to try to be the saviours of PC if it potentially hurts their AI revenue?
 
A few days ago, a report by Reuters disclosed that major system integrators such as Microsoft, Acer, and ASUS sat down with Qualcomm's CEO Cristiano Amon, with the primary discussion involving the ongoing lawsuit between ARM and Qualcomm, a dispute that has now been running for more than two years. While the lawsuit hasn't concluded yet, if ARM wins the case, then Qualcomm's CPUs on the market could be in huge trouble.
...
With this lawsuit hopefully concluding at some point, things could go sideways for Qualcomm if ARM wins its case, since a win could potentially halt sales of the firm's Snapdragon X Elite SKUs, affecting the many system integrators that have already built lineups around the new processors, which have driven significant ARM adoption.

It is said that the Qualcomm-ARM dispute has been a primary concern for manufacturers. In light of this, companies are now leaning towards an NVIDIA-MediaTek AI PC chip solution for the markets, with Microsoft and others in line for immediate adoption. We'll have to wait and see how the situation pans out.
 
From what I remember the main points are:

1) ARM gives out licenses for specific segments when companies want to make something, so server, laptop/whatever - all different licenses. Agree to the license and you agree to use it for that purpose and that purpose only

2) Nuvia had a licensing agreement to make server CPUs

3) Qualcomm buys Nuvia for the IP, and since they have other licensing agreements, they say they can use Nuvia's IP for whatever they want - phones, laptops, PCs, etc. They think their newly owned IP now falls under their prior agreements

4) ARM says the licenses are non-transferable, so they can't do that; it is in violation of the original licensing agreement. That IP was not made for consumer products, so it can't be used for consumer products

5) Spirited discussion ends in upcoming court battle

So it's either ARM doesn't know how to write non-transferable licensing agreements (or they don't hold up in court), or Qualcomm can't read contracts and spent a billion on IP they can't use. Either way, someone's messed up big time
 
For the Nuvia license to be relevant it doesn't have to be non-transferable, it has to be infective. It can't just give Nuvia a limited right to distribute; it has to confer on ARM some rights over the derived parts of the Nuvia design beyond Nuvia merely needing a license to distribute.

Nuvia had a license to distribute a derivative design. Qualcomm has a license to distribute a derivative design. Transferability of the license is irrelevant.
 
The “40% faster than M3” is total BS;

it is regarding the highest-end 80-watt Qualcomm model with active cooling, compared to the M3 with passive cooling after running for a while and having throttled down.

Only in that specific scenario, in a probably very specific benchmark, are they able to claim up to 40% faster than the M3.

I am pretty sure the M3, even in the iPad, would destroy the 80-watt Qualcomm most of the time. We will see tomorrow when some embargoes lift
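To put a number on why the baseline matters (the sustained fraction below is a purely hypothetical placeholder, not a measurement): if a passively cooled M3 only sustains ~70% of its peak score, beating that throttled figure by 40% works out to roughly parity with the same chip unthrottled.

Code:
#include <cstdio>

int main() {
    // All values are illustrative placeholders, not benchmark results.
    const double m3_peak   = 1.00;  // normalized unthrottled M3 score
    const double sustained = 0.70;  // hypothetical fraction a passive M3 sustains
    const double claim     = 1.40;  // "up to 40% faster" vs that throttled baseline

    const double qc_score = claim * (m3_peak * sustained);
    printf("Qualcomm vs unthrottled M3: %.2fx\n", qc_score / m3_peak);  // ~0.98x
    return 0;
}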
 
The “40% faster than M3” is total BS;

it is regarding the highest-end 80-watt Qualcomm model with active cooling, compared to the M3 with passive cooling after running for a while and having throttled down.

Only in that specific scenario, in a probably very specific benchmark, are they able to claim up to 40% faster than the M3.

I am pretty sure the M3, even in the iPad, would destroy the 80-watt Qualcomm most of the time. We will see tomorrow when some embargoes lift
There is no M3 iPad though 😅

The M4 iPad Pro already matches the Qualcomm part in multicore (GB6) and is vastly faster in single-core (upwards of 800-1,000 points higher).
 
What Blu-ray drive can you recommend for the iPad with Mx?

These days, are Blu-ray drives in heavy use? And wouldn't UHD Blu-ray be what people actually want, if they want an optical drive at all?

So it can run Crysis?

Can you post some Handbrake results on iPad?
Have these been ported to iPad?

Could it run a Crysis-like game, just not like a high-end PC with a $500+ graphics card?

I agree that transcoding large video streams would not be ideal on an iPad, but that would probably have more to do with storage and I/O than with the CPU, though the limited RAM of the iPad SoC would probably be a factor as well.
 
These days, are Blu-ray drives in heavy use? And wouldn't UHD Blu-ray be what people actually want, if they want an optical drive at all?

Have these been ported to iPad?

Could it run a Crysis-like game, just not like a high-end PC with a $500+ graphics card?

I agree that transcoding large video streams would not be ideal on an iPad, but that would probably have more to do with storage and I/O than with the CPU, though the limited RAM of the iPad SoC would probably be a factor as well.

I own some HD DVDs and Blu-rays and an LG HD DVD/Blu-ray drive, so I can rip them on my Surface Pro X, encode them and watch them on my SPX and Surface Duo. And I can play BioShock (and Crysis) on it.

Who cares about iPad's numbers in Geekbench, if it cannot run Crysis? It's a crippled iPadOS device without a calculator.

And macOS... Sry, no.
 
Forget Handbrake; the M3 iPad can handle multiple 8K ProRes streams simultaneously. Handbrake is a legacy app which is not even properly multithreaded
It has hardware blocks for ProRes. Even if that were not true, near-lossless is easy mode ... there is no proper multithreading for high-ratio compression of a single video; it's just a mess of compromises.

PS: and obviously hardware encoding sucks; if you are professionally doing offline encoding to AVC and you're not using x264, you're not good at your job.
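If anyone wants to see what those hardware blocks look like from the software side, here's a minimal sketch, assuming macOS with the VideoToolbox framework (compile as C++, link -framework VideoToolbox -framework CoreFoundation); it just lists the encoders the OS exposes, and on Apple Silicon that list includes the hardware ProRes and HEVC entries:

Code:
#include <CoreFoundation/CoreFoundation.h>
#include <VideoToolbox/VideoToolbox.h>
#include <cstdio>

int main() {
    // Ask VideoToolbox for every video encoder registered with the system.
    CFArrayRef encoders = nullptr;
    if (VTCopyVideoEncoderList(nullptr, &encoders) != noErr || encoders == nullptr)
        return 1;

    for (CFIndex i = 0; i < CFArrayGetCount(encoders); ++i) {
        auto info = (CFDictionaryRef)CFArrayGetValueAtIndex(encoders, i);
        auto name = (CFStringRef)CFDictionaryGetValue(info, kVTVideoEncoderList_DisplayName);
        char buf[256];
        if (name && CFStringGetCString(name, buf, sizeof buf, kCFStringEncodingUTF8))
            printf("%s\n", buf);  // e.g. the Apple ProRes / HEVC hardware encoders
    }
    CFRelease(encoders);
    return 0;
}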
 
I own some HD DVDs and Blu-rays and an LG HD DVD/Blu-ray drive, so I can rip them on my Surface Pro X, encode them and watch them on my SPX and Surface Duo. And I can play BioShock (and Crysis) on it.

Who cares about iPad's numbers in Geekbench, if it cannot run Crysis? It's a crippled iPadOS device without a calculator.

And macOS... Sry, no.
I don't think it matters, from your perspective, whether you could do those things on the iPad 😅

We are simply calling out Qualcomm for their "marketing" performance numbers and pointing to a product that doesn't even have active cooling yet can beat them at much better performance per watt.

But the M4 (like all other Apple Silicon SoCs) is not limited to the iPad in principle, although for now it is. You can obviously do all those things in macOS.
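Just to spell out the performance-per-watt point with explicitly made-up placeholder numbers (the scores and wattages below are illustrative, not measurements): even if a fanless tablet merely ties an actively cooled 80-watt part on the raw score, the efficiency gap is huge.

Code:
#include <cstdio>

int main() {
    // Purely illustrative placeholders, not benchmark results.
    const double fanless_score = 14000.0, fanless_watts = 15.0;  // hypothetical passive tablet
    const double active_score  = 14000.0, active_watts  = 80.0;  // hypothetical 80 W laptop part

    const double fanless_ppw = fanless_score / fanless_watts;
    const double active_ppw  = active_score / active_watts;
    printf("perf/W advantage (fanless vs active): %.1fx\n", fanless_ppw / active_ppw);  // ~5.3x
    return 0;
}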
 
I own some HD DVDs and Blu-rays and an LG HD DVD/Blu-ray drive, so I can rip them on my Surface Pro X, encode them and watch them on my SPX and Surface Duo. And I can play BioShock (and Crysis) on it.

Who cares about iPad's numbers in Geekbench, if it cannot run Crysis? It's a crippled iPadOS device without a calculator.

And macOS... Sry, no.
But these are not typical uses.

Every year, fewer and fewer people bother with optical media.

The issue isn't whether you like the OS or not. Niche functions that few people bother with are not going to be the criteria people rely on to make purchasing decisions.
 
It has hardware blocks for ProRes. Even if that were not true, near-lossless is easy mode ... there is no proper multithreading for high-ratio compression of a single video; it's just a mess of compromises.

PS: and obviously hardware encoding sucks; if you are professionally doing offline encoding to AVC and you're not using x264, you're not good at your job.
Well, if you have to rely on Handbrake then you probably don't even have a job xD

But for real, people who can afford an iPad Pro probably don't need to resort to piracy
 
I don't think it matters, from your perspective, whether you could do those things on the iPad 😅

We are simply calling out Qualcomm for their "marketing" performance numbers and pointing to a product that doesn't even have active cooling yet can beat them at much better performance per watt.

But the M4 (like all other Apple Silicon SoCs) is not limited to the iPad in principle, although for now it is. You can obviously do all those things in macOS.
Does an M4 make an SSD replacement faster on a Mac?
 