The Intel Execution in [2022]

All Gen 9 graphics have been moved to legacy driver support. That's 6th through 10th gen Core CPUs, along with the related Atom, Pentium and Celeron parts.

As a side note, that is a 1.1 GB driver download! It's like 2 Windows XPs lol.
 
According to Intel's report, there was a 25 percent decline in consumer chip sales last quarter, and the total PC market has also shrunk by 10 percent this year.
...
During the earnings call, Intel CEO Pat Gelsinger said that some of Intel's largest customers are “reducing inventory levels at a rate not seen in the last decade”. Due to the shrinking PC market, Intel's revenue was down by 22 percent in Q2 and there was a huge decline in profit compared to Q2 2021.
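For a quick sanity check on that 22 percent figure, it follows directly from the publicly reported top-line numbers (roughly $19.6 billion in Q2 2021 vs. roughly $15.3 billion in Q2 2022; treat these as ballpark values):

Code:
# Rough year-over-year revenue decline check.
# Figures are approximate, taken from Intel's publicly reported Q2 results.
q2_2021_revenue = 19.6  # billions USD, Q2 2021 (approx.)
q2_2022_revenue = 15.3  # billions USD, Q2 2022 (approx.)
decline_pct = (1 - q2_2022_revenue / q2_2021_revenue) * 100
print(f"YoY revenue decline: {decline_pct:.1f}%")  # ~22%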
 
Looks like Intel is confirming its plan to offset lower volume by increasing prices, and it believes demand will recover even with the price increases.


David Zinsner -- Chief Financial Officer


Thanks, C.J. So when you look at the fourth quarter, first of all, we are, at this point, given the inventory burns in both CCG and DCAI, shipping it below the rate of consumption for those markets. And so there is a natural recovery that occurs that we would expect as we progress through the rest of the year once inventory is in a good place. So we do feel we're at the bottom here in terms of revenue in the Q2, Q3 time frame, and Q4 would recover just based on that alone.


I'd say the other thing is we've got a good set of products coming out over the course of the second half of the year. And I think we're kind of operating with wind at our sails in terms of product offerings in all of our businesses. And then third, we are increasing pricing, and pricing generally takes effect in the fourth quarter. We've done a fair amount of time.

David Zinsner -- Chief Financial Officer

On CCG and DCAI, I think the inventory correction in CCG was definitely more pronounced. And so you get this -- the benefit of the shipping back to the consumption level is more pronounced when things recover. That's No. 1.

We also will see more pricing improvement in CCG than DCAI. They're both -- we're adjusting pricing, but the pricing is more significant in CCG. And so that also gives CCG a lift in the later part of the year.

CCG is the Client Computing Group. Whether this applies mostly to volume (OEM) customers or will also affect retail pricing isn't clear.
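To put Zinsner's "shipping below the rate of consumption" point in concrete terms: while customers burn off excess inventory, Intel ships less than the end market actually consumes, and once that excess is gone, shipments snap back toward consumption even if demand itself doesn't grow. A toy sketch with purely made-up numbers (not Intel data):

Code:
# Toy inventory-burn model. All numbers are hypothetical, purely to
# illustrate why shipments recover once customer inventory normalizes.
consumption = 100.0      # units the end market consumes per quarter (hypothetical)
excess_inventory = 60.0  # excess units sitting at customers (hypothetical)
burn_per_quarter = 30.0  # excess worked off per quarter (hypothetical)

quarter = 0
while excess_inventory > 0:
    quarter += 1
    burn = min(burn_per_quarter, excess_inventory)
    excess_inventory -= burn
    shipments = consumption - burn  # Intel ships below consumption
    print(f"Q{quarter}: shipments {shipments:.0f} vs consumption {consumption:.0f}")
print(f"Q{quarter + 1}: shipments {consumption:.0f} (back in line -> 'natural recovery')")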
 
TrendForce's Intel Meteor Lake report seems to have caught the company's attention. Intel's PR firm sent out an unsolicited note this afternoon reiterating that Meteor Lake will be delivered in 2023, and manufacturing remains on schedule
— Ryan Smith (@RyanSmithAT) August 4, 2022
 
You would think the business analysts at Intel would have known it wouldn't be the work of an afternoon to break into the GPU market, and accepted the fact they'd probably have to throw years and billions of dollars at it. Why give up now when they are finally getting some products out? Yes, the products are (very) underwhelming, but again, did anyone really expect otherwise? Wouldn't it take at least a generation or three to start competing with Nvidia/AMD?
 
Also worth pointing out is that all GPU R&D comes out of AXG expenses, while the IGPs that R&D produces bring in money for CCG instead. So the $3.5 billion isn't actually the whole truth.
 
The point is that it's unlikely that they will ever be able to compete in gaming, and they should look into pushing into data center compute only. This is a reasonable suggestion, considering that the whole idea of making GPUs again was for that, not for gaming.
 
They all seem to ignore the fact that a big portion, if not the majority, of AXG expenses is architectural development, which they simply can't abandon.
Sure, there would be a little less R&D if you just continued in the IGP world, but not that much - Intel has kept at the forefront of technologies even though they've only been in the IGP world.
Also, it's not unlikely that they'll be able to compete in gaming - they got damn close already on the first try (A380), and when we get 3rd-party reviews of the rest we'll see how they do. Based on the previews it's not a catastrophe, at least.
 
They all seem to ignore the fact that a big portion, if not the majority, of AXG expenses is architectural development, which they simply can't abandon.
Sure, there would be a little less R&D if you just continued in the IGP world, but not that much - Intel has kept at the forefront of technologies even though they've only been in the IGP world.
I dunno about that. Seems to me that IGP R&D would be orders of magnitude less than what Nvidia/AMD spend on GPU R&D.

Also, it's not unlikely that they'll be able to compete in gaming - they got damn close already on the first try (A380), and when we get 3rd-party reviews of the rest we'll see how they do. Based on the previews it's not a catastrophe, at least.
They did?
The A380 is slower than the RX 6400 - and that's a 7.2-billion-transistor chip vs a 5.4-billion-transistor chip on the same process.
It's even worse against Nvidia, I suppose.
This doesn't look like they "got close" to me at all.
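For what it's worth, the transistor-count point can be framed as rough perf per transistor. The relative-performance number below is a placeholder (assume the A380 lands at about 0.9x of an RX 6400 on average; swap in whatever a given review actually measures):

Code:
# Rough perf-per-transistor comparison. Transistor counts are the ones quoted
# above (7.2B vs 5.4B); the relative performance value is a placeholder.
a380_transistors = 7.2    # Arc A380 (ACM-G11), billions of transistors
rx6400_transistors = 5.4  # Radeon RX 6400 (Navi 24), billions of transistors
a380_relative_perf = 0.9  # hypothetical: A380 at ~90% of RX 6400 performance

a380_perf_per_bt = a380_relative_perf / a380_transistors
rx6400_perf_per_bt = 1.0 / rx6400_transistors
print(f"A380:    {a380_perf_per_bt:.3f} relative perf per billion transistors")
print(f"RX 6400: {rx6400_perf_per_bt:.3f} relative perf per billion transistors")
print(f"A380 efficiency vs RX 6400: {a380_perf_per_bt / rx6400_perf_per_bt:.0%}")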
 
Limit it to DX12/Vulkan and it already looks better. Intel is likely to gain more from drivers in the near future than the competition, since they're still at such an early stage (and by more I mean a lot more).
Wasn't everything relevant with RT now in your books anyway? It loses less FPS from RT than AMD does, and NVIDIA doesn't have RT in the same category to compare against. Its media features are the best there are.
Yes, I would say they got close.
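(On the RT point, the usual way to compare is the relative FPS drop when RT is turned on rather than absolute FPS. The numbers below are hypothetical placeholders, only to show the calculation, not benchmark results.)

Code:
# Relative RT hit = 1 - (FPS with RT / FPS without RT).
# FPS numbers are hypothetical, just to illustrate the comparison.
def rt_hit(fps_raster: float, fps_rt: float) -> float:
    return 1.0 - fps_rt / fps_raster

cards = {
    "Card A (hypothetical)": (60.0, 42.0),  # (raster FPS, RT FPS)
    "Card B (hypothetical)": (65.0, 36.0),
}
for name, (raster, rt) in cards.items():
    print(f"{name}: loses {rt_hit(raster, rt):.0%} of its FPS with RT enabled")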
 
Limit it to DX12/Vulkan and it already looks better
Why would I do that? I don't limit myself to any API when I choose which games I play.

Intel is likely to gain more from drivers in the near future than the competition, since they're still at such an early stage (and by more I mean a lot more)
The competition will switch to a new generation over the same period, though, and I doubt that Intel will be able to get on par with that through drivers.

Wasn't everything relevant with RT now in your books anyway?
Not in the RX 6400 performance range.
 
@DegustatoR you're not even at Homer Simpson level of commitment: if at first you don't succeed, try again, then give up!
Given the complete failure NV1 was, Nvidia should have just exited the market entirely as well.

Your fear of marketed threats to your precious is showing.
 
We are talking about the recommendation from JPR (Jon Peddie Research). And what's showing here is your own fanboyism, nothing else.
 
"Driver factor" is an inherent part of the product. You can't "remove it", it would make the product unusable.
It depends on what the question is. If the question is "how good is Intel at making GPU products", then it's relevant. If the question is "how good is Intel at designing GPUs", then it's arguably not relevant, or relevant only insofar as the hardware design imposes limitations on driver development.

If none of Intel's planned GPUs can be competitive even with perfect drivers, due to sub-optimal design, then that provides a strong case for cutting their losses and leaving the PC GPU space. But if Intel can be competitive (e.g. with Battlemage) with sufficient investment in their driver team, then the situation is not as clear-cut.
 