About the Intel Core i9-13900K and Core i5-13600K processors
All figures below are listed as Core i9-13900K / Core i5-13600K:
- Total Cores: 24 / 14
- Performance Cores: 8 / 6
- Efficient Cores: 16 / 8
- Total Threads: 32 / 20
- Max Turbo Frequency: 5.80GHz / 5.10GHz
- Performance-core Base Frequency: 3.00GHz / 3.50GHz
- Efficient-core Base Frequency: 2.20GHz / 2.60GHz
- Maximum Turbo Power: 253W / 181W
- Processor Base Power: 125W / 125W
- Total L3 Cache: 36MB / 24MB
- Total L2 Cache: 32MB / 20MB
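The thread counts in the spec list above follow directly from Intel's hybrid design: performance cores are hyper-threaded (two threads each), while efficient cores are not (one thread each). A minimal sketch of that arithmetic:

```python
# Threads on Intel's hybrid chips: P-cores run 2 threads each
# (Hyper-Threading), E-cores run 1 thread each.
def total_threads(p_cores: int, e_cores: int) -> int:
    return p_cores * 2 + e_cores

print(total_threads(8, 16))  # Core i9-13900K -> 32
print(total_threads(6, 8))   # Core i5-13600K -> 20
```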
Intel also has a Core i7-13700K processor ($410) that sits between the Core i9 ($589) and Core i5 ($320). It has eight performance cores and eight efficient cores with a max turbo frequency of 5.40GHz.
For our tests, we used the following PC configurations:
- Graphics card: Nvidia GeForce RTX 3080 Ti / RTX 4090
- Motherboard: MSI MPG Z790 Carbon WiFi / Gigabyte Z790 Aorus Elite AX
- Memory: Corsair Dominator Platinum RGB 32GB (2 x 16GB) DDR5-5200
- Storage: Sabrent Rocket 1TB M.2 NVMe PCIe 4.0
- Power: Corsair RM1000
- Cooling: Corsair H150i Pro RGB
What we like
Massive framerate jump over the previous generation
Intel’s gains in gaming performance have remained almost stagnant for its last three processor generations. In my experience testing the company’s 10th, 11th, and 12th-gen chips, the 10th and 11th generations pulled almost the same frame rates (within one to four frames) in the same games, while the 12th generation gained between five and 20 frames. However, Intel has increased its 13th-gen chips’ maximum clock speed from 5.2GHz to 5.8GHz, which increased in-game frame rates by a whopping 30-50 frames per second (fps) when using the same graphics card.
When we tested Intel’s Core i9-12900K, Shadow of the Tomb Raider reached 151 fps at 1080p with the highest graphics preset. But this time around, the Core i9-13900K cranked out over 200 fps in the same game at the same settings. This massive jump in performance extended to all the usual games we test, too. We saw the smallest fps increase with Cyberpunk 2077 (just under 30 fps), but that’s still a big increase for such a graphically intense game.
In some games, the Core i5-13600K actually saw larger generational fps jumps than the Core i9-13900K, but more on that later.
Intel’s 13th-gen chips are also faster in gaming performance compared to AMD’s Ryzen 7000 chips. The Core i9 pulled about 10-40 frames more than the Ryzen 9 7950X, depending on the resolution and game. (The most gains were at 1080p, of course.) The Core i5 had a smaller, yet still significant framerate lead over both the Ryzen 9 7950X and Ryzen 9 7900X, with nearly identical framerates to AMD’s Ryzen 7 5800X3D.
Competitive multi-core performance
Intel’s 13th-gen processors are highly competitive with AMD’s newest 7000-series chips when it comes to multi-core performance, and Intel outshines its own 12th-gen chips here, too. Compared to the AMD Ryzen 9 7950X, the Core i9-13900K posts a nearly identical single-core score in Geekbench 5 (around 2250) but a significantly higher multi-core score (24664 versus 22338).
The Core i9 also blows past the Ryzen 9 7950X in single-core performance in our Cinebench test, scoring nearly 300 more points (2304 versus 2029), but it fell just short of the Ryzen 9 7950X in multi-core (35948 versus 36974).
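To put those benchmark scores in perspective, a quick helper (our own illustration, not part of either benchmark suite) converts a pair of scores into a percentage lead:

```python
def pct_lead(winner: float, loser: float) -> float:
    """How much higher the first score is than the second, in percent."""
    return (winner / loser - 1) * 100

# Geekbench 5 multi-core: Core i9-13900K's lead over the Ryzen 9 7950X
print(round(pct_lead(24664, 22338), 1))  # -> 10.4
# Cinebench multi-core: Ryzen 9 7950X's lead over the Core i9-13900K
print(round(pct_lead(36974, 35948), 1))  # -> 2.9
```

In other words, Intel's multi-core win in Geekbench is roughly a 10% lead, while AMD's multi-core win in Cinebench is under 3%.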
This tracked with what we saw in our practical benchmarks, like 3D rendering, which relies on multi-core performance. Rendering a scene of a BMW in Blender, the 13th-gen Core i9 completed that task in one minute and four seconds, while the 12th-gen Core i9 took one minute and 33 seconds and the Ryzen 9 7950X took one minute.
When it came to running a path tracing test, the results were mixed. The Core i9-13900K fell behind the Ryzen 9 7950X by almost two minutes (11 minutes and 42 seconds versus nine minutes and 59 seconds, respectively). However, Intel’s 13th-gen chip was six minutes and 30 seconds faster in the same task compared to the last-gen Core i9-12900K.
The slower path tracing time is a small concession for those significant frame rate increases, and if you’re buying one of these 13th-gen chips just for gaming, you lose nothing. You should also take into account that AMD’s 7000-series processors are having issues with Windows 11 at the time of this writing; the latest OS update appears to be compromising their performance, whereas that issue doesn’t seem to affect Intel’s 13th-gen chips.
The Core i5-13600K is also exactly on par with AMD’s Ryzen 7 5800X3D in terms of gaming performance. The Core i5 costs $320 versus the Ryzen 7 5800X3D’s $450 price tag, although we’ve seen the Ryzen chip on sale for around $400. Matching the Ryzen 7 5800X3D’s gaming performance with far better multi-core performance for less money makes this an extremely enticing chip to put in your next build.
Compatible with DDR4 and DDR5
While you’ll have to commit to using either DDR4 or DDR5 RAM, since motherboards can’t accommodate both, we like that Intel is still catering to those who want to save money by sticking with the DDR4 RAM they already own.
Depending on how you have your PC configured, and how you have the in-game graphics and resolution set, you most likely won’t see a big framerate increase just by choosing DDR5 memory over DDR4 memory. Intel’s 13th-gen chips are plenty powerful all on their own. The scenario in which you’ll most likely see a small frame rate increase is in CPU-intensive games like Hitman 3, but RAM speed doesn’t make a huge difference on its own.
DDR5 is also more expensive than DDR4, even though prices continue to come down steadily. The RAM we use in our current test bench has been on sale for around $200, while the RAM in our last-gen test bench goes for $155 when on sale.
Also keep in mind that while DDR4’s clock speeds are slower than DDR5’s, DDR4 typically runs at lower latency, so individual memory accesses can complete in less time.
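The usual way to compare this is "first-word" latency in nanoseconds: CAS latency (CL) counted in I/O clock cycles. Since a DDR data rate of N MT/s corresponds to an I/O clock of N/2 MHz, latency in ns works out to CL x 2000 / N. A rough sketch with typical timings (illustrative numbers, not the exact kits from our test bench):

```python
def true_latency_ns(cas_latency: int, data_rate_mts: int) -> float:
    """First-word latency in ns: CL cycles at an I/O clock of data_rate/2 MHz."""
    return cas_latency * 2000 / data_rate_mts

print(round(true_latency_ns(16, 3200), 1))  # typical DDR4-3200 CL16 -> 10.0
print(round(true_latency_ns(40, 5200), 1))  # typical DDR5-5200 CL40 -> 15.4
```

Despite DDR5's much higher data rate, its looser timings mean each individual access takes longer; DDR5 wins on bandwidth, not latency.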
Cooler installation issues are a thing of the past
One of the issues Intel’s 12th-gen chips had when they were released was that they would not work with 11th-gen and older motherboard sockets due to the physical increase in size. As a result, many CPU cooler manufacturers had to provide new standoff screws so their all-in-one coolers could be compatible with the new chips and sockets.
However, many people, including us, had issues getting the cooling block to sit perfectly flush against the processor. When that happens, the processor isn’t getting all the cooling it needs and could overheat as a result. But we re-used the same Corsair AIO cooler and standoffs provided by the company with the 13th-gen chips and motherboard and had zero issues getting the block to mount perfectly.
So, if you have an Intel 11th-gen processor or older, and you skipped the 12th, your current AIO cooler will most likely work with the 13th-gen chips, and you won’t have to deal with the growing pains that came with the 12th-gen. Win-win.
What we don’t like
They’re hot and thirsty
Intel’s 13th-gen chips run hot and they use a lot of power, which are two big concerns for your monthly electricity bill and the longevity of the chips themselves. To test thermals and power consumption, we paired the Core i9-13900K with an RTX 4090 and played Cyberpunk 2077 for about 30 minutes with ray tracing set to ultra, DLSS off, and at 4K resolution. HWInfo, a PC hardware monitoring tool, ran in the background collecting data on power consumption, temperature, and core clock speeds.
We found that the new Core i9 can easily surpass its thermal limits when pushed, reaching a maximum package temperature of 111 degrees Celsius (231.8 Fahrenheit). The chip is rated for a maximum of 100 degrees C, and exceeding that threshold caused some thermal throttling on all of the chip’s performance cores, according to HWInfo. While we didn’t see significant in-game throttling, it’s possible it was happening at the same time we saw some possible bottlenecking with the Core i9 and RTX 3080 Ti (more on that later).
When it comes to power consumption, the Core i9 drew as much as 205W in our tests. Intel rates its maximum power consumption at 253W, so it doesn’t quite reach that even when pushed hard in a high-fidelity game, but that’s still a lot! By comparison, the Core i9-12900K is rated at a maximum of 241W and the Ryzen 9 7950X at 170W. The 13th-gen Core i5 is rated at a maximum of 181W.
RTX 4090 compatibility issues
This is more of an issue with Nvidia’s RTX 4090 than with Intel’s newest processors, but we hit similar RTX 4090 compatibility issues on the new Intel platform as we did on the new Ryzen 7000-series system: there was no video output on the monitor. We did eventually get it to work with some strange workarounds.
Where the RTX 4090 wouldn’t output video at all when paired with a Ryzen 7000-series chip, we could get it to output video when paired with an Intel 13th-gen chip if we installed and reinstalled the Nvidia graphics driver in a specific order. For instance, I tested the Core i5-13600K with an RTX 3080 Ti first (which gave me zero trouble), and then swapped it out for the RTX 4090, which also worked as expected. But when I left the 4090 in the system and swapped the Core i5 for the Core i9-13900K, my monitor became stuck in a perpetual cycle of losing a connection to the GPU, and then the PC would automatically restart itself.
I had to uninstall the RTX 4090 driver, swap the card for the RTX 3080 Ti, reinstall the driver, then uninstall it again only to put the RTX 4090 back in the testbench and reinstall the driver yet again just to get it to work. But unfortunately, this method only worked once; if I turned the PC off and then back on again, I’d have to repeat the entire process.
This is a particularly notable problem for Intel’s 13th-gen chips because the Core i9-13900K needs the power of the RTX 4090 to unclog the slight bottleneck that happens with a less powerful graphics card in some games. This leads me to the most notable issue with these new chips: gaming performance scaling.
Gaming performance doesn’t scale much, but it’s still better than AMD
We didn’t have as much of an issue with performance scaling as we did with AMD’s 7000-series chips, but within Intel’s 13th-gen lineup, gaming performance hits diminishing returns the higher up the silicon chain you go. Unlike AMD’s newest processors, which delivered the same in-game frame rates whether we used the Ryzen 5 7600X or the Ryzen 9 7950X, Intel’s chips don’t suffer as much.
Let’s look at the Core i5-13600K: running Far Cry 6 on ultra graphics at 1080p, it cranked out 151 fps. All of AMD’s new processors managed between 121 and 125 fps in the same test, while the Core i9-13900K pulled a little ahead of the Core i5 (165 fps to 151 fps). The Core i9 saw a meaningful jump in the other games we tested, like Hitman 3 and Shadow of the Tomb Raider; the exception was Cyberpunk 2077.
Cyberpunk 2077 is where there appears to be a bottleneck when you pair the Core i9-13900K with an RTX 3080 Ti instead of an RTX 4090. At 1080p with the Ultra Graphics preset on (no ray tracing), the Core i5 averages 144 fps, but the Core i9 averages 132. When you pair the RTX 4090 with both of those chips, the Core i5 gets 205 fps and the Core i9 gets 210.
This might not be all that surprising at the lowest gaming resolution, but the same thing happened with ray tracing turned on, and at 1440p; the Core i5 leads the Core i9 93 fps to 89 fps at 1440p, and 125 fps to 119 fps at 1080p with ray tracing turned to the Ultra preset.
The takeaway? Unless you also need a processor for creative work, like video editing or 3D rendering, the Core i5-13600K will net you the same or slightly better performance in most games, which is good news for your wallet.
Should you buy the Intel Core i9-13900K or Core i5-13600K?
Yes, but you should buy the Core i5-13600K
We’ve given Intel’s Core i5-13600K our Editor’s Choice award for a reason: it provides the best value for gaming and content creation of any desktop processor on the market. It also doesn’t get as hot as the Core i9, and it uses less power to churn out frame rates that are within spitting distance of its more expensive sibling. The Core i5’s performance also isn’t limited by last-gen graphics cards, even high-end ones like the RTX 3080 Ti.
As we said in our AMD Ryzen 9 7900X and Ryzen 7 7700X review, gaming performance also does not scale much between the Core i5 and Core i9, but there is a slight difference. It’s not enough to justify the price increase between the Core i5 ($320) and Core i9 ($580), though, so only get the Core i9 if you need the multi-core processing power. Compared to the Ryzen 9 7950X, it’s slower in some tasks, but the Core i9 costs $120 less than the Ryzen 9 7950X—and the Core i9 obliterates the Ryzen 9 7900X in all use cases for only $30 more.
With its 12th-gen chip woes behind it, Intel has made an astonishing comeback, clawing back the lead AMD has built over the last several years. If I were building a new PC right now, I’d put the Core i5-13600K in it, hands-down, and I’ve been using AMD in my builds since 2015.
Prices were accurate at the time this article was published but may change over time.
Meet the tester
Senior Editor, Electronics (@JLNwrites)
Joanna specializes in anything and everything gaming-related and loves nerding out over graphics cards, processors, and chip architecture. Previously she was a staff writer for Gizmodo, PC Gamer, and Maximum PC.