Intel Launches Comet Lake (10th Gen Core Series)

Deleted member 7537
I haven't seen any posts about the new line of chips. The chips are already out, as are reviews. Main takeaways:
- Intel has greatly improved thermal performance.
- The 10900K consumes A LOT of power.
- Intel keeps the absolute lead in gaming and in some productivity tasks.
- The 10600K (6C/12T) is a very good value proposition.
- You'll need a new socket, which will also support 11th gen.

10600K reviews:
https://www.guru3d.com/articles-pages/intel-core-i5-10600k-processor-review,1.html
https://www.techradar.com/reviews/intel-core-i5-10600k

10900K reviews:
https://www.tomshardware.com/reviews/intel-core-i9-10900k-cpu-review
https://www.anandtech.com/show/15785/the-intel-comet-lake-review-skylake-we-go-again/15

[Benchmark charts attached]
 
Just going to leave this here. You also need a new motherboard and better cooling, and it's still stuck at PCIe 3.0. You can currently get the Ryzen 9 3900X for $390 at Micro Center. Curiously, the review lacks any thermal measurements.

[Benchmark chart attached]
 
Yikes, nearly 2x the power rating of the equivalent-spec AMD part? Am I reading that right, comparing Intel's 10600 and AMD's 3600?

And double yikes, still stuck on PCIe 3.0 for Intel. When were they planning on PCIe 5.0 for consumers, next year?

Those AMD prices that AT used seem way higher than current retail costs, as Pressure pointed out. What are they using, launch prices from 6 to 9 months ago?
 

Not to mention the 3600 is currently $160. Not to derail the thread.
 
By the way, if you spend some money on high-quality DDR4 memory, the 10600K will give you 10900K results in gaming.


As for the rest of the comments: Skylake is a dying arch. I myself have a 6600K bought 5 YEARS ago. That's too long, and it shows how badly Intel dropped the ball with 10nm, and also how far behind AMD was, as they have only just been able to match Skylake in gaming performance.
 
The sobering realization is that an AMD 3300X with fast memory is pretty much up there as well. PC hardware stagnation, typical game code, and the fact that GPUs limit performance too make spending a lot of money on CPU upgrades rather pointless as far as gaming is concerned.
(I run an Intel 3770 at 4.2 GHz on all cores, conditions under which it draws 75 easy-to-cool watts, and I've got 24 GB of fast, low-latency DDR3. There is no product available that will net me more than 50% in gaming, and since the GPU introduces limitations as well, there's no rational point in upgrading my game box. It's been 8 years.)
From a tech enthusiast's point of view, this stagnation is terrible. From a gaming point of view, it's a good thing: I can enjoy whatever interesting new games get published and spend my money on things other than PC hardware components.
 

Even though I can still play most games, I do notice the lack of threads. For example, in Destiny 2 I'm hitting 100% CPU usage most of the time, and when I fight a lot of enemies at once I get a lot of FPS drops. I've been waiting, hoping Zen 3 would be good enough, but seeing it only match Skylake performance (it has higher IPC but can't hit Intel's clocks) disappointed me. I'm hoping AMD will be able to offer a substantial jump with Zen 4; 20% would be good enough for me.

Edit: In Destiny 2, I'm hitting 100% on all cores just doing some public events.
[CPU usage screenshot attached]

Edit 2: Not getting the i7 6700K was a mistake, btw. I'll probably go for 10 cores next time; I don't want to change my CPU again until the PS6 comes around.
 
Now, I don't want to stop anyone from shopping their way to happiness, but how on God's green earth do you manage to get CPU-limited like that in Destiny 2? Do you have a 240 Hz monitor, or what's going on? Reviews tend to show 200 fps or more in CPU-limited scenarios, and I never saw drops below 60 fps when I played it (I used my old 2560x1440 IPS monitor for this game).

More cores might come into play as this next console generation moves on. They are a static target, though, so anything you buy that is sufficient is likely to stay sufficient for the rest of the decade.
 

I play at 1080p @ 144 Hz, but the CPU hits 100% at 110-120 fps depending on the scenario. BTW, I've overclocked the CPU to 4.3 GHz.

I had the same issues playing BF1 and BFV.
 
Isn't Intel planning to bring 10nm+ CPUs to the desktop anytime soon?

Looks like they've put themselves in a corner where they only have 2 choices:
- Higher core count using 14nm+++++ Skylake cores with old I/O and iGPU
- Low core count (up to 4 cores max?) using the new 10nm+ cores with new I/O and iGPU

What I'm getting from looking at core counts on the newer architectures is that Intel still isn't confident enough to make larger chips on 10nm at scale.
Otherwise they could, e.g., repurpose part of the Ice Lake-SP cores for a prosumer platform to fight Threadripper.



By the way, if you spend some money on high-quality DDR4 memory, the 10600K will give you 10900K results in pre-Gen9 gaming.

Fixed that for you.
Games are about to be developed with 3.5 GHz 8-core/16-thread CPUs with current-gen IPC in mind. A 6-core/12-thread model can only compensate for that in instructions per unit of time if it's running at around 4.7 GHz (3.5 × 8/6 ≈ 4.67), assuming L2 cache doesn't become a bottleneck, and with a PC OS's higher overhead we'd probably need around 5 GHz sustained instead.
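As a sanity check on that arithmetic, here's a back-of-the-envelope sketch (assuming throughput scales linearly with cores × clock, which is optimistic):

Code:
# Clock a 6C/12T CPU needs to match an 8C/16T console CPU at 3.5 GHz,
# assuming throughput scales linearly with cores x clock (optimistic).
console_cores, console_clock_ghz = 8, 3.5
pc_cores = 6
required_ghz = console_clock_ghz * console_cores / pc_cores
print(f"{required_ghz:.2f} GHz")  # ~4.67 GHz, before PC OS overhead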

Unless someone intends to upgrade their CPU (and probably motherboard too, knowing Intel) next year, purchasing a 6-core CPU is just a mistake nowadays.
Sure, it might be good value today, but it's terrible future-proofing nonetheless. And so are the 4-core Ryzen 3000s, BTW, but at least those are so cheap that it could make sense to save some money now to compensate for a lower-priced 8-10 core model next year.
 
I play at 1080p @ 144 Hz, but the CPU hits 100% at 110-120 fps depending on the scenario. BTW, I've overclocked the CPU to 4.3 GHz.

I had the same issues playing BF1 and BFV.
Then it makes sense!
I actually went the extra mile here and downloaded the game again (all 87 GB...) to check my memory. And true enough, I never dropped below 100 fps, and yours should be a tad faster. I played at 60 fps at the time and never saw frame drops. I can see that it would be a different ball game at 144 Hz. Unless you run with Adaptive Sync/G-Sync, where I don't think it makes a lot of difference, actually.
Running at very high frame rates could be tricky in the future, where 3.5 GHz Zen 2 is the baseline. (And I do believe that adding cores beyond 8 isn't going to make one hell of a lot of difference; actual speed per thread will be the metric I look out for.) The next-generation consoles will have pretty good CPU capabilities. Getting a factor of 2 increase in per-thread performance could be a long wait.

One might actually have to adjust settings (gasp!) to achieve higher frame rates.
 
Isn't Intel planning to bring 10nm+ CPUs to the desktop anytime soon?

No, and I don't expect them to release any 10nm product on the desktop. They will probably skip it and move from 14nm directly to 7nm.

However, next year they are moving away from Skylake to Willow Cove, though still on a 14nm node. Some rumours point to a 30% IPC increase over Skylake, but I don't know if they will be able to match the same clocks.

Then it makes sense!
I actually went the extra mile here and downloaded the game again (all 87 GB...) to check my memory. And true enough, I never dropped below 100 fps, and yours should be a tad faster. I played at 60 fps at the time and never saw frame drops. I can see that it would be a different ball game at 144 Hz. Unless you run with Adaptive Sync/G-Sync, where I don't think it makes a lot of difference, actually.
Running at very high frame rates could be tricky in the future, where 3.5 GHz Zen 2 is the baseline. (And I do believe that adding cores beyond 8 isn't going to make one hell of a lot of difference; actual speed per thread will be the metric I look out for.) The next-generation consoles will have pretty good CPU capabilities. Getting a factor of 2 increase in per-thread performance could be a long wait.

One might actually have to adjust settings (gasp!) to achieve higher frame rates.

Wow, you didn't need to download the game, lol. I think Destiny's engine scales very well with additional cores/threads; I've seen some posts on Reddit from people with issues on a 6600K who moved to a 7700K and noticed a substantial improvement, going from cores being fully utilized to 60-70% max utilization.
 
Unless someone intends to upgrade their CPU (and probably motherboard too, knowing Intel) next year, purchasing a 6-core CPU is just a mistake nowadays.
Sure, it might be good value today, but it's terrible future-proofing nonetheless. And so are the 4-core Ryzen 3000s, BTW, but at least those are so cheap that it could make sense to save some money now to compensate for a lower-priced 8-10 core model next year.
The R5 3600 is generally under $200, and I think it will last at least a couple of years, given that many games will still be designed to run on the Xbone/PS4. It doesn't seem like a bad purchase at that price. Even in PS5/Xbox Series X exclusives I don't think a 6-core Ryzen would be terrible. Console OS overhead is no joke; I think the OS actually reserves an entire core. Typical background CPU usage while gaming in Windows is negligible in my experience, and the desktop Ryzens have a lot more L3, IIRC.
 
And higher clocks. An R5 3600 will definitely be on par with the console CPUs.
The issue is rather if, for whatever reason, you want significantly higher frame rates, say 144 fps in CPU-bound scenarios. Then, if you believe that per-thread performance will increase at the same rate as it did last decade, you're in for a wait of ten years or so. The only reasonable approach is to leverage the advantage PC gamers have historically enjoyed: that you can tweak game parameters to improve responsiveness.
To what extent this is possible will depend on the individual game.
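To put a rough number on that wait: assuming, purely for illustration, per-thread performance grows about 7% per year (in the ballpark of the last decade), doubling takes roughly ten years:

Code:
import math

annual_gain = 0.07  # assumed yearly per-thread gain (illustrative)
years_to_double = math.log(2) / math.log(1 + annual_gain)
print(f"{years_to_double:.1f} years")  # ~10.2 years to double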
 
Well, assuming games make better use of multiple threads next gen, you can get a 16-core Ryzen today that could theoretically more than double your performance. Core counts, clocks, and IPC will probably continue to rise now that there is legitimate competition from AMD.
 
Check out gaming benchmarks for the 3300X. Relatively few codes fulfil the demands: doing a lot of work on very little data (otherwise you're bandwidth limited), preferably little enough that you are effectively core resident so threads don't thrash each other's cache content; achieving almost perfect load balance, or you're wasting cores and/or creating stalls when threads have to wait for the slowest to finish before you can sync; or limiting yourself to scenarios that don't need to sync at all. Maybe game code is simplistic enough, but it doesn't seem so in the general case, or you'd see eight-core CPUs do twice as well as four-core ones, and 16-core twice as well as eight-core. That's just not what we observe, implying that the more cores you add, the less relevant they will be. (Graphics is something else.)

I have every possible doubt that "I want twice the frame rate, so I'll use a CPU with twice the number of cores" will be an effective strategy going forward. When the baseline is the new consoles, you're already struggling against Amdahl's law.
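For concreteness, here is what Amdahl's law predicts with an assumed, purely illustrative serial fraction of 25% of the work per frame:

Code:
def amdahl_speedup(cores, serial_fraction):
    # Amdahl's law: speedup = 1 / (s + (1 - s) / N)
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

s = 0.25  # assumed serial fraction per frame (illustrative)
for n in (4, 8, 16):
    print(f"{n} cores: {amdahl_speedup(n, s):.2f}x")
# 4 cores: 2.29x, 8 cores: 2.91x, 16 cores: 3.37x
# 4x the cores buys only ~1.5x the throughput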
 
I doubt it as well, but if they want more performance, it's going to have to come from more cores. Single-thread performance isn't likely to increase by much in the near future. It's not like the Xbone/PS4, which had terrible CPUs at release.
 
It's easy to be misled by the tech sites. They are totally dependent on advertising and shopping, so it's not surprising that they hype the new shiny. But as far as games are concerned, the return on investment in cores is terrible.
Looking at, for instance, TechPowerUp's tests of the latest CPUs (link to 10600K test), where they set the 10600K to 100% using an RTX 2080 Ti graphics card, we find at 1280x720:
3300X - 85.3%
3600X - 87.6%
3700X - 89.6%
3900X - 92.4%

Note that the scaling by core count is exaggerated by the higher-end models having higher boost clocks (and power consumption). At iso-clocks the differences would have been effectively zero, as the scaling we see here is below clock scaling (!). As it would be, of course, the instant the GPU started to affect performance at all. (A 2080 Ti at 1280x720 is hardly a realistic scenario for anyone.)
The benefit of increasing core count for gaming is completely negligible.
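Putting numbers on it with the figures quoted above (a quick sketch using the TechPowerUp relative-performance results):

Code:
# Relative 720p gaming performance (10600K = 100%) vs. core count,
# using the TechPowerUp numbers quoted above.
results = {"3300X": (4, 85.3), "3600X": (6, 87.6),
           "3700X": (8, 89.6), "3900X": (12, 92.4)}
base_cores, base_perf = results["3300X"]
for name, (cores, perf) in results.items():
    print(f"{name}: {cores / base_cores:.1f}x cores -> "
          f"{perf / base_perf:.3f}x performance")
# 3900X: 3.0x the cores of the 3300X for only ~1.08x the frame rate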
I expect it will be nigh on impossible for me to buy a 5nm CPU with fewer than eight cores in a couple of years, because, as you point out, single-thread performance isn't likely to evolve particularly fast in this segment of processors, which simply means that this aspect of system building is becoming a complete non-issue as we have moved from two to four and now further up in "standard" core count.

I see forum warriors blaming "lazy developers" for this, but that's just ignorance talking. Of course the entire industry isn't incompetent, and they have had really good reason to push multithreading as far as they could with the atrocious CPUs of the current console generation. The above is the result of their efforts.
 
Seems like an atypical use case for such high-end processors.
I don't mind that they showed it, but they need to come up with better benchmarks than dropping the resolution all the way down to showcase gaming performance.
If you want to inform a gamer's purchasing decision, reducing to 720p seems unrealistic (at least not for spending that amount on a CPU). For me, I would like to know how a CPU performs when:
  • The game is set to 1080p to achieve the frame rates one would expect, 120-240 max
  • Live-streaming said game to Twitch, with video and mic etc. all on
  • Discord is on
  • Music is on
Then you get a realistic view of how the different CPUs handle a heavier gamer load, or at least a heavier Twitch-streamer load.

I need some benchmarks on Call of Duty shader-compile optimization times ;) lol, god I hate patch days.

Would also appreciate some Unreal and Unity compile times ;)
 