Yeah, this is my first thought every time Intel releases a new generation without any big gains and it becomes a big talking point. If you had a 2600K (or even a 2500K), you were probably all set until around 8th or 9th gen.
6th gen (SKL) vs 2nd gen (SB), over a span of 5 years, was a smaller jump in price/performance than Zen 1 to Zen 2.
Also DDR4 vs DDR3 barely mattered on launch.
https://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/7
You are right, Skylake was a big jump over Sandy, that was one exception. But Sandy and Haswell were both powerful enough for the next several years to not bottleneck contemporary GPUs in typical gaming scenarios, so it didn't really matter anyway.
The most popular GPUs were the GTX 960/970/750 Ti or something like the R9 270–390, and all of those barely managed to maintain 30–60 FPS at 1080p in demanding games like Witcher 3, so CPU improvements were not really that impactful outside of select games like WoW, Arma or grand strategies.
The real gap closer with Sandy Bridge was how well it OCed: anybody could get 4.8 GHz, and the later ones very regularly landed at 5.1 GHz.
I had a 3770K I ran at 4.7 GHz, which is about what you would expect a 6th gen chip to reach as well. With how DDR4 performed at launch, the gains even in CPU tests were single digits over what I was getting.
It really took until 8th gen for RAM speeds, more cores and new games to start pushing out Sandy Bridge on the gaming front.
> was how well it OCed: anybody could get 4.8 GHz
That's a bit of an exaggeration. 4.4-4.6 GHz was very common at acceptable voltages. 4.8 was much less common, or only possible if you didn't care about the life of your chip and poured the voltage in; a smaller number of samples were able to clock that high at reasonable voltage. They were also really sensitive about HT, so a 2500K often clocked higher than a 2600K unless you turned off HT.
> If you had a 2600K (or even a 2500K), you were probably all set until around 8th or 9th gen.
Huh, I'm still using a 2500K. I can play Death Stranding at high settings at a low frame rate, like 60 FPS. Smooth enough to be playable.
I went 2600k > 12600k.
Held up alright! Was still overclocked like mad.
I actually appreciated the Intel stagnation, lmao. Got me all the way through school and university, when I couldn't really afford to upgrade.
I think it still aged surprisingly well.
But considering that the 2700K was like $100 extra at launch and would have squeezed out at least 3 extra years, it seems 'bad'.
It's just that the 2700K aged even better. At launch the value proposition was squarely in the 2500K's favour, but looking back, most people would pick the 2700K instead. That doesn't make the 2500K bad by any stretch of the imagination though.
The 8th and 9th gens were a huge leap and my 6700k was really holding me back in CPU-bound games by the time those rolled out. Waited a few more years before upgrading, and switching from that 6700k to a 12900k literally tripled my frame rate in some titles.
My i7 3770K survived until I got an 11400; heck, I had a 3090 paired with it for a while. Hopefully I will upgrade to a 14-something soon. Actually, survived is not the word: it's still kicking in combo with my former 1080 Ti in my kid's computer, running Forza 4/5. Hopefully I will upgrade both boxes by next year.
> You missed Haswell.
No, it's there:
> Ivy Bridge 2.0 overclocker's regression edition
The IPC gain of Haswell was more than offset by its loss of OC headroom, making the average CPU bin worse in final performance.
This is also why I centred everything around Ivy Bridge: factoring in overclocking, it ended up slightly better than Sandy Bridge (and Sandy Bridge also had buggy hyperthreading, which admittedly wasn't fully fixed in Ivy Bridge either).
With Sandy Bridge the average CPU could expect 5 GHz, the top 10% 5.1 GHz, and the top 1% 5.2 GHz or higher.
With Ivy Bridge the average was 4.8 GHz, the top 10% 5 GHz, and the top 1% 5.1-5.2 GHz.
With Haswell the average was 4.5 GHz, the top 10% 4.7 GHz, and the top 1% 5 GHz.
Sandy -> Ivy: ~8% IPC improvement, 4-5% OC headroom loss
Ivy -> Haswell: ~4.5% IPC improvement, 7% OC headroom loss
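To make that trade-off concrete, here's a quick sanity check of those numbers; a minimal sketch assuming performance scales as IPC × clock, using the rough "average bin" figures quoted above rather than any measured data:

```cpp
#include <cstdio>

// Back-of-the-envelope: net perf change ~= IPC multiplier * clock multiplier.
// Clocks are the average-bin OC figures quoted above, not measurements.
int main() {
    double sandy_to_ivy   = 1.08  * (4.8 / 5.0); // +8% IPC, 5.0 -> 4.8 GHz
    double ivy_to_haswell = 1.045 * (4.5 / 4.8); // +4.5% IPC, 4.8 -> 4.5 GHz

    printf("Sandy -> Ivy net:   %+.1f%%\n", (sandy_to_ivy - 1.0) * 100);   // ~+3.7%
    printf("Ivy -> Haswell net: %+.1f%%\n", (ivy_to_haswell - 1.0) * 100); // ~-2.0%
    return 0;
}
```

Which is the point being made: on these numbers the average Haswell bin lands slightly behind the average Ivy bin once overclocking is factored in.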
> especially with doubled AVX(2) throughput.
Also, in many cases Ivy Bridge's AVX throughput is zero, because it doesn't support integer AVX even at 128-bit width.
(OK, technically it does support SSE, but boy is that slow nowadays.)
I think people are coming from a Zen 1/Zen+ mindset, where they assume Ivy Bridge is the same because it's 128b, but it's not: Zen 1/Zen+ still have integer AVX support and Ivy Bridge does not. And that's absolutely brutal in production/encoding tasks - x264, for example, does not use floating-point AVX at all, and I'm pretty sure x265 doesn't either. So if you don't have integer AVX you get nothing in these workloads.
All the "literally no progress since Sandy Bridge!" folks should actually try using a Sandy Bridge processor. It's fine in desktop tasks/etc, but there has been *huge* progress made in the more intensive stuff; it just shows up in accelerator unit (AVX) benchmarks and not standard desktop tasks... that level of task was "solved" a long time ago, word processing runs fast enough even on an old processor, but intensive stuff does not.
> People are new to this re-release of Intel chips, I guess. Before Ryzen, every year's chips were exactly like this, every single time
Eh. Intel normally changed *something* about the hardware in most generations, even if it was just adding more cores.
1. Nehalem
2. Sandy Bridge (New Silicon - Architecture)
3. Ivy Bridge (New Silicon - Node)
4. Haswell (New Silicon - Architecture)
5. Broadwell (New Silicon - Node)
6. Skylake (New Silicon - Architecture)
7. Kaby Lake (New Silicon - Uncore)
8. Coffee Lake (New Silicon - More Cores)
9. Coffee Lake Refresh (New Silicon - Yet More Cores)
10. Comet Lake (New Silicon - More Cores Again)
11. Rocket Lake (New Silicon - Architecture)
12. Alder Lake (New Silicon - Architecture & Node)
13. Raptor Lake (New Silicon - More Cores)
14. Raptor Lake Refresh
Intel has previously offered "refresh" SKUs, such as Haswell Refresh. And the Coffee/Comet family really only improved things at the high end where all the cores were enabled. But a refresh/recycle has never before been a whole new product generation. That much is new.
Kaby Lake is when Intel started to spin its wheels (Raptor Lake Refresh at least introduces extra cores in some SKUs). Intel went Tick-Tock for some time, and while we still had 4c/8t top parts until Zen, gen-on-gen improvements were closer to 10-20%; plus, CPUs were rather lazily clocked, meaning people could get an extra 800 MHz from OC.
From Kaby Lake (past the initial 6000 series with Skylake) till Rocket Lake we got Skylake with different amounts of cores and clocks. Also, Intel's 7th gen was alive for around 8 months or so. Not great, all in all.
It's a real stretch to call Kaby Lake a hardware improvement. Don't forget about HEDT Kaby Lake-X, which was the worst thing to come out of Intel in the past 10 years. The 7740X is up there as one of the worst "flagship" desktop CPUs of all time.
Kaby definitely didn't set the world on fire, especially on the CPU side of things. But it did introduce a new GPU architecture, as well as some fairly important power delivery/DVFS changes. It wasn't just warmed-over Skylake silicon (unlike Raptor Lake Refresh).
Raptor Lake Refresh at least introduces more cores on some SKUs, which is actually valuable for CPU performance, unlike Kaby, which performed the same as the Skylake before it.
Amusingly, Kaby Lake-X was _worse_ than Skylake-X. The main reason people went HEDT was at the time, it was the only way to get more than 4 cores from Intel. The 7740X was a 4-core 7700K on the more expensive HEDT platform where people went for CPUs like the 18-core Skylake-X 7980XE. HUB tested it and it actually [performed worse](https://www.youtube.com/watch?v=47a_iyZ9nmI) than the 7700K.
And it led to Ryzen becoming the force it is today. Intel complacency might have worked while AMD was just not delivering acceptable products, but nowadays, with AMD being competitive, such a release is just really bad. Right now, for gamers, AMD is faster, cheaper and on a platform that is getting at least 2 more major releases. I really don't see a reason to go Intel for anything other than productivity right now.
Depends. The 13600k is "ok" pricewise, although the competing 7600 is 100€ cheaper (and a bit slower). But the 390€ 7800X3D easily competes with the 630€ 13900k.
If you had an AM4 board I'd still always get a 5800X3D over a 13600K or 7600. If you are building a new system, the 7800X3D is great value with a good chance of at least one good upgrade. I don't really see the point in saving 70€ by going 13600K over the 7800X3D for a slower product on an end-of-life platform. So yeah, purely for gaming I do think right now AMD is miles ahead in value for money.
Well, I said 'now' because, while it has gotten better, here in Brazil a few months ago I could buy two AM4 systems (or a 1700) for the price of a 7600 setup. It was not worth it at all. I was more focused on the "cheaper" part, as most people here would only buy the actually cheap ones anyway.
Don't forget said platform costs less to get into when including the cost of an "entry level" motherboard
Beyond the price of the CPUs, the motherboards are just way better value on AMD
I think people just want to see how Intel respond when AMD release the X3Ds.
Well, now we know that Intel still can't compete with AMD's X3D. Still pushing power to the point that it's embarrassing to see that most of the efficient CPUs are AMD's, and still adding those stupid "efficiency" cores.
This isn't Intel's response to X3D. This is just a refresh. Check out Intel's next gen Meteor Lake / Arrow Lake to start understanding what Intel's response might look like.
> Check out Intel's next gen Meteor Lake / Arrow Lake to start understanding what Intel's response might look like.
But that's like a year away, and they were already losing in gaming before 14th gen. Not being disappointed by this release because "you knew it was just a refresh" is a cop out; if I have low expectations I can never be disappointed, but it's still a terrible look to release bad products. Just don't release it then, or at the very least call it something other than 14th gen.
I'm just not sure what folks expected. It was well known this release would be a refresh with a very slight clock boost on the high end. Now it happens and people here are bewildered that it's not some revolutionary new product totally changing market dynamics.
So I stand by my point that you can only be disappointed here if you had absolutely no clue what Intel was going to be releasing despite it being well known for a long time now.
Gaming benchmarks are pretty close btw. Only in efficiency and extremely niche scenarios is there a decent gap.
Intel is to blame if people are shocked the gains are this low. They _chose_ to name it 14XXX. Had they used a different nomenclature (13990k?) people would be less annoyed.
It's already hard to explain to anyone who's not versed in CPU releases how their nomenclature works, we don't need "Oh, the generation is pretty important! oh but not _that_ generation, 11=>12 is important, but 13=>14 is useless." added into it. If Intel didn't want to be judged harshly, they had to name things appropriately.
The fact that they announced ahead of time that it was a refresh is pointless. Nothing in the naming scheme suggests that. If they don't want to be lambasted for bringing no generational improvement, _don't call it next gen_.
Gaming isn't the only thing that matters.
Also for most uses they're effectively tied in gaming. Who runs a new/high end card at 1080p?
There's edge cases like Factorio of course but that's an edge case.
As an FYI I was making the same argument saying "don't discount Zen 1" back when people screamed that KabyLake was better for gaming.
You do realize the confirmation that this is just a refresh is what is so disappointing, right? To add to that, going back several generations, this is their worst incremental release: the 11700K to 12700K and 12700K to 13700K jumps saw bigger performance improvements.
You're disappointed on release day by a refresh that was announced years ago? I'm kinda surprised so many folks here had absolutely no clue what this release was.
The Raptor Lake Refresh was only officially announced yesterday, what are you talking about?
EDIT: To the downvoters, until last month when it was leaked by Intel China, we did not know it was just a refresh. Yesterday, they finally made the official announcement that it was going to be just a refresh, confirming what some folks feared.
[https://videocardz.com/newz/intel-confirms-14th-gen-core-raptor-lake-s-hx-refresh-for-the-first-time](https://videocardz.com/newz/intel-confirms-14th-gen-core-raptor-lake-s-hx-refresh-for-the-first-time)
[https://www.reddit.com/r/intel/comments/16ogyle/where_is_the_official_announcement_of_the_raptor/](https://www.reddit.com/r/intel/comments/16ogyle/where_is_the_official_announcement_of_the_raptor/)
From last year:
>Intel's 2023 roadmap for the desktop processor segment sees the company flesh out its 13th Gen Core "Raptor Lake" desktop family with 65 W (locked) SKUs, and the new i9-13900KS flagship; followed by a new lineup of processors under the "Raptor Lake Refresh" family, due for Q3-2023
https://www.techpowerup.com/302619/intel-raptor-lake-refresh-meant-to-fill-in-for-scrapped-meteor-lake-desktop
There were tons of leaks. Not just that one reliable one.
Here's another.
https://www.notebookcheck.net/Intel-Raptor-Lake-Refresh-release-date-reportedly-set-for-mid-October-for-K-series-and-early-January-for-non-K-14th-gen-CPUs.733121.0.html
This is on top of Intel's roadmap showing we wouldn't get a new arch or node shrink this year. You can stick your head in the sand, but that only backs my point: only folks who had absolutely no clue what Intel's doing could be disappointed by this typical refresh.
Again, these are leakers on social media spreading rumors about what might be releasing. This specific one you linked sources a YouTube channel. They even say themselves,
>As always, take these rumors with a grain of salt. We don’t have any official details regarding Intel's Raptor Lake Refresh, so it shouldn’t surprise you if the information shared by MLID doesn’t pan out.
EDIT: And to respond to /u/Negapirate since they blocked me, a great example is the 4090 Ti, which was not only predicted by reputable leakers, but they even had pictures of it. It ended up being very real, but was cancelled internally. Leakers are nice for getting a vague idea of what a company is planning, but never rely on them as a definitive source.
A few leakers said it was going to be a refresh, that does not mean that was for certain until we get confirmation from Intel themselves. The fact that people are still so disappointed in the 14700k should tell you something about this.
> This isn't Intel's response to X3D. This is just a refresh.
Hence the disappointment? Ryzen 8000 is probably coming somewhat soon next year, and so far info is pointing to it being a decent improvement.
You can only be disappointed if you had absolutely no idea what Intel was doing, and that this is how Intel has done refreshes for the last decade. Hence my suggestion that folks read up a bit.
Check out Intel's next gen Meteor Lake / Arrow Lake to start understanding what Intel's response might look like.
Idk. If it's just a refresh they might as well call it 13x5xk. Giving RLR a new gen somewhat raises expectations. Although even the arch name points out what it really is.
I don't disagree, but this is pretty much convention for hardware refreshes afaik. I guess a modern shift would be Nvidia's Super series refresh, but that seems to be an anomaly.
Number go up is better for marketing I guess lol.
Skylake -> KabyLake -> CoffeeLake -> CoffeeLake+ -> CometLake
There might have been some very modest increases between them, but they're all usually seen as having near identical perf/clock: they're fundamentally the same design with minor tweaks, mostly focused on security, clock speed and energy efficiency.
IPC benchmarks don't show 0%:
>Before concluding our tests, let's glance at some IPC results. By running Baldur's Gate 3 with all processors at 5GHz (with a 3GHz ring bus and E-cores disabled), we observed that the 14900K is roughly 2% swifter than the 13900K, which is 3% faster than the 12900K. From the 12900K to the 14900K, there's a slight 6% increase in IPC.
https://www.techspot.com/review/2749-intel-core-14th-gen-cpus/
Intel has had many mediocre refreshes in the last decade+.
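As a quick arithmetic aside on those quoted figures: the per-generation deltas compound, so 2% on top of 3% lands around 5%, which TechSpot evidently rounds to the 6% they quote (their per-step numbers were presumably rounded down from slightly higher values). A one-liner to check:

```cpp
#include <cstdio>

int main() {
    // 13900K -> 14900K (~2%) compounded on 12900K -> 13900K (~3%):
    double total = 1.02 * 1.03;
    printf("Compounded 12900K -> 14900K uplift: ~%.1f%%\n", (total - 1.0) * 100); // ~5.1%
    return 0;
}
```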
Even the 12900k to 13900k was only a 3% uplift. And other refreshes showed similar results.
Your narrative that this is some anomaly in IPC uplift for a refresh just isn't true.
Intel can't just respond to X3D. What comes out now started development years ago. If AMD hypothetically releases some crazy new CPU that catches everyone off guard, Intel literally can't respond immediately.
That being said, because MTL was delayed, RPL was created: it took ADL and cranked the clock speed way up. Now MTL is coming out, which is essentially just ADL with significantly lower power draw. If RPL had never come out, MTL would launch bringing no perf improvement but cutting power consumption by 50%. But RPL does exist, so MTL on desktop would be a perf regression.
Intel had 3 options: release nothing on desktop until ARL is ready; release RPL again with a tiny clock speed boost; or release MTL and see a perf regression. They chose the RPL refresh because it lets their OEM partners advertise a new CPU for this year's model.
I would like to see Intel re-implement something like Broadwell, where they had the massive L4 cache. That actually gives a nice boost to some applications.
Intel filed a patent for an L4 cache tile (which was rumored to be designed for use with Meteor Lake) which is called 'Adamantine' cache earlier this year. No product announcements yet, so it doesn't seem like a release is imminent, but I wouldn't be surprised if we see something resembling vCache out of Intel sometime in the next year or two.
Broadwell had 6MB of L3 cache; Alder Lake has 30MB.
The benefits of a relatively slow eDRAM overflow cache are much reduced when the L3 cache is 5x the size.
By the time fast DDR4 was a thing, system memory had similar latency to, and higher bandwidth than, the eDRAM cache (albeit half duplex vs full duplex).
At this point any next-level cache would need to be a HUGE boost over what Broadwell had to be worth the overhead, given how good Intel's L3 cache is and how solid DDR5 is. In the server world HBM is a thing, though it's probably not the best choice for desktops quite yet.
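If you want to see where your own chip's cache hierarchy falls off a cliff (and where an L4 tier would have sat), a crude pointer-chase sketch like the one below shows load latency stepping up as the working set spills out of L1/L2/L3 into DRAM. Sizes and iteration counts are arbitrary; this is an illustration, not a rigorous benchmark:

```cpp
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    std::mt19937 rng{42};
    for (size_t kb = 16; kb <= 128 * 1024; kb *= 4) {
        size_t n = kb * 1024 / sizeof(uint32_t);
        std::vector<uint32_t> next(n);
        std::iota(next.begin(), next.end(), 0u);
        // Sattolo's algorithm: a single-cycle permutation, so the chase
        // visits the entire working set instead of a short sub-cycle.
        for (size_t i = n - 1; i > 0; --i) {
            std::uniform_int_distribution<size_t> d(0, i - 1);
            std::swap(next[i], next[d(rng)]);
        }
        const size_t iters = 20'000'000;
        uint32_t idx = 0;
        auto t0 = std::chrono::steady_clock::now();
        for (size_t k = 0; k < iters; ++k)
            idx = next[idx];  // each load depends on the previous one
        auto t1 = std::chrono::steady_clock::now();
        double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / iters;
        // Print idx so the compiler can't delete the dependency chain.
        printf("%8zu KiB: %6.2f ns/load (x=%u)\n", kb, ns, idx);
    }
    return 0;
}
```

On a Broadwell-C part you'd expect an extra plateau between the L3 and DRAM steps where the 128 MB eDRAM catches the working set.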
I wouldn't go THAT far.
Pre-Ryzen you had a bunch of "meh" releases for the prior 5 years: +25% IPC in 5 years with negligible clock speed improvements. To Intel's credit these were new designs, and the only real "critique" is that Intel REALLY should've been giving people 6 cores earlier. Intel cut costs.
During Ryzen you went from SkyLake to SkyLake+ (KabyLake) to SkyLake++ (CoffeeLake) to SkyLake+++ (CoffeeLake v2) to SkyLake++++ (CometLake)
The ++++ era was the same design but with more silicon thrown at it and a dash of process improvement.
This is even worse than those 7th/9th gen refreshes. At least those had some IPC and efficiency gains. This is just Intel shoving even more power into existing silicon lol.
Those generations weren't as bad as the newest ones, since they significantly increased core and thread counts when the standard 4 threads of the i5 were beginning to be noticeably slower.
This just made me appreciate the 13600K even more, that one (or the 14th gen i5 for the same price) remains great value for mixed usage and gaming at high resolutions. Looks like Zen 5 is going to annihilate Intel though.
Arrow Lake is next year. It should feature new cores and Intel multi-die packaging with Foveros. Zen 5 will have its work cut out for it, unless Intel messes up.
> Zen 5 will have its work cut out for it
It's strange that you say Zen 5 will have its work cut out for it, when Intel are the ones with the recent track record of fumbling the architectural ball. They couldn't get Meteor Lake working on desktop for whatever reason so had to scrape the barrel by reliving the good old days of useless +0% IPC, not-even-a-stepping-change rehashes just so they could have a release in 2023. Not to mention it looks like Zen 5 will arrive sooner than Arrow Lake.
If anyone looks like they have their work cut out for them, it's Intel.
A multi-die approach is very important for cost. Newer nodes are drastically increasing in price, and large die yields may be a problem. Being able to manufacture only the compute portion of the chip on the bleeding edge, while the less important parts use more mature nodes, helps with volume and cost.
Intel 20A is not library complete. It can't be used for iGPUs or parts such as the memory controller. A 2024 launch of ARL wouldn't be possible if it was still monolithic.
> Looks like Zen 5 is going to annihilate Intel though.
15th gen was always the one that was going to matter. Zen 5 will be competing with that, not 14th gen.
It's a valid counterpoint to the people arguing that they didn't raise the price of 14th gen. They kinda did, since water cooling is no longer just recommended, it's virtually required for the 14900K, unless you want the same performance as a 13700K. And preferably a 360 or 420 rad. So that needs to be factored into the price/performance, along with the added cost of electricity.
Meanwhile the 7800X3D is happily working with $20-30 coolers. God, we really need Intel to step up their game in some areas; sure, they have really good performance, but the efficiency is terrible.
I think it's mostly to compare with 13th gen, since performance is so close we're just looking at overclocked 13th gen parts.
Cyberpunk +2% fps +3% power
Last of Us +1% fps +6% power
Star wars +4% fps +5% power
Going backwards.
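In other words, perf/W regressed in all three titles. A trivial check of the ratio (1 + fps gain)/(1 + power gain) using the figures above:

```cpp
#include <cstdio>

int main() {
    // Gen-on-gen deltas quoted above (as fractions, 14th gen vs 13th gen).
    struct { const char* game; double fps, power; } d[] = {
        {"Cyberpunk",  0.02, 0.03},
        {"Last of Us", 0.01, 0.06},
        {"Star Wars",  0.04, 0.05},
    };
    for (const auto& g : d)
        printf("%-10s perf/W change: %+.1f%%\n", g.game,
               ((1.0 + g.fps) / (1.0 + g.power) - 1.0) * 100);
    // All three come out negative (roughly -1% to -4.7%): "going backwards".
    return 0;
}
```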
Intel is having their 11900K moment again. With virtually no gains in a generation, the only moderately interesting part is the 14700K. And with the i5s also not getting any real bumps they are kind of forfeiting to AMD. Seems like the only lever they can pull is power and they are ratcheting it up and up every generation. Why is the 14600K consuming 5% more power than the 7950X3D? In games.
Hope that Arrow Lake will be better, but this does make me a bit anxious.
I think it's a huge stretch to call this an '11900K' moment. It's a boring refresh/rebrand of existing parts, sure, but Intel's position in the desktop CPU market is not anywhere near as dire as it was in the 11th gen days, and 14th gen is not an outright bad product like Rocket Lake was. 13th gen competes just fine with Zen 4, and Zen 5 is probably not coming to desktop for at least another 6 months.
> I think it's a huge stretch to call this an '11900K' moment. It's a boring refresh/rebrand of existing parts, sure, but Intel's position in the desktop CPU market is not anywhere near as dire as it was in the 11th gen days
I actually think it's worse, honestly. I know that the 11th gen wasn't great, but I think a lot of its terribleness is overstated. If I remember correctly, the 11900K was at least a new architecture that gave you access to things like PCIe 4.0 (unless you were upgrading/building new with a 10th gen motherboard).
You lost a couple of cores versus the 10900K (8 vs. 10), but the IPC was decently improved, meaning the 11900K has held up a lot better in gaming and was at least close to parity in multi-core productivity apps. The 10900K was basically a 9900K with a couple of extra cores slapped on, and the 9900K was basically an 8700K with a couple of extra cores slapped on.
Maybe I'm misremembering that generation, but I think this is closer to a 7700K situation, which is to say basically complete stagnation. As I recall, that CPU was within spitting distance of the 6700K and there was basically zero reason to upgrade, like the situation we have now.
It also suggests Intel still does not have DLVR enabled on Raptor Lake Refresh, a feature originally planned for the original Raptor Lake to reduce power consumption. It might be one of those technologies only disclosed in patents that never make it to implementation.
Intel's datasheet implies both 13th and 14th gen were expected to have it:
https://edc.intel.com/content/www/us/en/design/products/platforms/details/raptor-lake-s/13th-generation-core-processors-datasheet-volume-1-of-2/006/power-delivery/
Grabbed a great deal on an 11700KF for $175 about 6 months after release at Micro Center, thanks to the "waste of sand" comments from reviewers.
Hoping we see either 14th gen go on sale fast, or 13th gen get deep discounts here.
That's a stretch. 11th gen was a genuine downgrade, not just stagnation: 10 cores down to just 8. The 14700K at least gained 4 E-cores, and all the other parts got a slight clock boost. But I don't know why this "gen" is hated so much; it was advertised as a refresh, and it's exactly that. Also, for the first time in years Intel released more than two series of processors for one socket, so there's a small gain for people still on low-end 12th gen CPUs.
> Intel is having their 11900K moment again.
Intel's odd-numbered Core i gens were always meh. 1st, 3rd, the nonexistent 5th except for the 5775C, 7th, 9th, and 11th. Even-numbered gens were always a good buy until now.
I got a 12900K earlier this year for $280 USD. If 14th gen drops the 12900K further to $250ish it'll be the greatest value from Intel by far.
Looking at Techpowerup's review, my 13600K for 4K gaming is on average within 5 fps average/min framerates of the fastest systems that cost significantly more.
Granted, 4K gaming is very GPU limited but it's nice to see that I didn't make a bad choice going for the 13600K last year when AM5 was excessively expensive.
The point is that the 12700K right now is about $50-100 cheaper than the 13600K and the performance difference [is negligible.](https://www.youtube.com/watch?v=J9tanmFrNgc) People have already forgotten that the 12700K is Intel's cheapest 8 P-core CPU and is a value overclocking monster especially when E-cores are disabled.
At current prices I don't think _any_ of the 13th gen CPUs are a good buy, because 12th gen has been slashed so much. Even when 14th gen is released, 12th gen will still be the best value buy for those who insist on Intel over AMD. With the exception of the 11900K, Intel's successive CPUs are obviously _"better"_ in a vacuum, but the odd-numbered generations never provided the uplift that justifies the cost premium.
> they are kind of forfeiting to AMD
I mean, Intel's current 13th gen offerings already compete well against AMD and win for most users in most segments. These new, slightly better 14th gen CPUs coming in at the same prices are not Intel forfeiting anything. AMD's response won't come for another 3 quarters, but Intel's counter-response comes a quarter after that. Overall Intel is doing pretty damn well.
Wow, so overall performance uplift on the 14900k is "margin of error" level...while power consumption shoots up quite a bit.
So I guess it's true, they've gone from "water cooling recommended" to "water cooling required", at least if you want the performance you're paying for with the 14900k.
I really think they should have called this "13th gen+" instead of "14th gen". They're devaluing their brand doing tricks like this. Maybe that's why they keep rebranding things?
That map size and complexity is honestly absurd as well. In the same way the small / default bench is not representative, neither is this alternative benchmark IMO.
What a pointless release; why did Intel even bother? Anyone still using 12th gen shouldn't bother upgrading to this and should just wait for Arrow Lake / Zen 5 3D at the minimum.
If you are using Intel's 12th Gen, there's no reason to upgrade to anything, unless you have a lower-core part. The IPC and general performance of 12th Gen is still amazing. This is more for people jumping from AM4 and older Intel gens, if the price is right. But I'd still get AM5 parts, lol.
>This is more for people jumping from AM4 and older Intel gens, if the price is right.
I'm not sure who in their right mind is bothering to upgrade off AM4 unless you're on a low-end part. It's pretty cheap to just pay for a second-hand 5800X3D or 5900X instead of the cost of a whole new platform, and unless you're running a 4090, the performance difference is maybe a few % in a specific list of highly CPU-intensive games.
If you have a lower-core part, you most likely need to upgrade the PSU and CPU cooler to go with a higher-core one. At that point, the cost argument goes out the window.
And Zen 5 is early next year. Intel is going to have to live with this until the end of 2024 with Arrow Lake. Seems like a repeat of the whole Rocket Lake and Alder Lake situation.
I'm on a 7700K and planning to upgrade this fall. 14th gen is a welcome slight improvement over the 13th gen. Why not release a small refinement when they easily can?
The only positive I got out of this 14th gen Intel review was the competitive nature of the 5800X3D. So it's still the same upgrade path that gamers should follow. Buy something like a Ryzen 7600 or a 7700, and wait for the last 3D V-Cache chips on the AM5 platform to release in a few years.
Was waiting to see how these shook out and I've now confirmed I'll be picking up a 7800X3D on Black Friday to pair with my 7900 XTX. The little guy will be getting the i5 12400F combo I've been using as a placeholder.
Total system power is a very bad way to compare CPU power efficiency.
When the CPU is the bottleneck, the GPU works less, resulting in less power usage shown for it overall.
A case in point is the 11900K, which shows lower power usage even though we know it uses a whole lot more than the 5800X3D and 7600X, for example.
The 11900K severely bottlenecks the 4090, which results in way less power usage overall.
What I am trying to say is that a fully loaded 100 W chip A can make the 4090 work at 200 W (300 W total) and show 100 fps,
while another 100 W chip B can make the 4090 work at 350 W (450 W total) and show 130 fps.
You would think chip A is efficient: 100 fps at 300 W vs 130 fps at 450 W for chip B.
But in reality, chip B is producing more frames at the same 100 W of CPU power.
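Running those two hypothetical chips through the math makes the distortion obvious (these are the made-up figures from the comment above, not measurements):

```cpp
#include <cstdio>

struct Result { const char* name; double cpu_w, gpu_w, fps; };

int main() {
    // Hypothetical figures from the comment above.
    const Result chips[] = {
        {"Chip A", 100.0, 200.0, 100.0},
        {"Chip B", 100.0, 350.0, 130.0},
    };
    for (const auto& r : chips) {
        double system_w = r.cpu_w + r.gpu_w;
        printf("%s: %.2f fps/W system, %.2f fps/W CPU-only\n",
               r.name, r.fps / system_w, r.fps / r.cpu_w);
    }
    // Chip A: 0.33 system, 1.00 CPU-only. Chip B: 0.29 system, 1.30 CPU-only.
    // System-level fps/W flatters chip A even though chip B extracts 30%
    // more frames from the same 100 W of CPU power.
    return 0;
}
```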
This is true, but I prefer total system power draw. It helps with choosing a power supply when building, and it also accounts for the fact that a higher-performing CPU, while more efficient in isolation, has a knock-on effect on other components.
This is just another real world vs isolated component type situation and for this scenario I prefer real world style testing.
It's not real world. The 4090 is underutilized at 1080p, and once you change the resolution to 4K, its power usage will increase a lot while the CPU power usage goes down.
You want to look at the GPU charts at max usage if you are using them to pick a PSU.
It is a situation where you cannot control all the variables: the more work the CPU can do, the more work the GPU has, resulting in higher power draw for the GPU and an increase in full-system power.
If you isolate the CPU and say CPU A at 100W does 100 fps and CPU B at 100W does 150 fps, while ignoring the impact that has on GPU power draw, then you are not telling the entire story. Also, that 50% uplift in my example is only valid with the test GPU; any other GPU might give different FPS results depending on how high utilisation is.
It seems more like a presentation issue tbh; perhaps a stacked bar for CPU, GPU and rest-of-system would be better, along with an fps/W metric to more easily rank the efficiency of the test system.
Decent enough point, but I see no problem with using real world metrics over artificially isolating components. Both have their merits. No need to make Intel look even worse in this regard.
As always, if you want good analysis check another reviewer like GamersNexus who covers rail level power and power efficiency at completing a task. HUB aren’t very good.
The only positive thing is that this runs cooler than the 13900K (according to TPU), so you can buy it with the best air cooling and keep high frequencies at high temps; this won't throttle.
>Interestingly the 14900K runs considerably cooler than our 13900K. I confirmed that they both sit at the 253 W power limit in the Blender test, which should result in very similar heat output. Not sure what's happening here, maybe the IHS contact quality is different or the accuracy of the CPU's own power sensor varies. It is expected that the 13900KS runs warmer, because it has a 350 W power limit (vs 253 W on the 14900K).
> so you can buy it with the best air cooling and keep high frequencies at high temps; this won't throttle.
That's always been true, *if* you actually enforce power limits.
Very much expected. I would just like to see a different kind of efficiency benchmark: power used at locked FPS. So if you lock the FPS to 144, what is the CPU power use?
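On Linux you can approximate that yourself with the RAPL package-energy counter while the game runs under an FPS cap. A rough sketch, assuming /sys/class/powercap/intel-rapl:0/energy_uj exists and is readable (usually needs root), and ignoring counter wraparound beyond a bail-out:

```cpp
#include <chrono>
#include <cstdio>
#include <fstream>
#include <thread>

// Cumulative CPU package energy in microjoules, as exposed by Linux RAPL.
long long read_energy_uj() {
    std::ifstream f("/sys/class/powercap/intel-rapl:0/energy_uj");
    long long uj = -1;
    f >> uj;
    return uj;
}

int main() {
    // Start your game with a 144 fps cap first, then run this alongside it.
    const int seconds = 30;
    long long e0 = read_energy_uj();
    std::this_thread::sleep_for(std::chrono::seconds(seconds));
    long long e1 = read_energy_uj();
    if (e0 < 0 || e1 <= e0) {  // counter missing, unreadable, or wrapped
        std::fprintf(stderr, "RAPL counter unavailable or wrapped; retry\n");
        return 1;
    }
    std::printf("Average CPU package power: %.1f W\n",
                (e1 - e0) / 1e6 / seconds);
    return 0;
}
```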
Rocket Lake relaunch? No, seriously, what is Intel's strategy behind this? I know a refresh keeps prices high and is maybe good for your margins, but at the same time they're just damaging their brand image more and more. Yes, the lower SKUs are decent in terms of perf/watt, but why didn't they drop prices... ugh, what do I know, I'm just a consumer.
> I know a refresh keeps prices high
There's your only needed explanation. They rebranded their 13th gen so they could raise prices up again, and also put 13th gen on "sale" to hit lower parts of the market as well.
Because every extra CPU tested is ~36 more benchmark passes you have to do. That adds up very quickly. There's not enough difference between 12th and 13th gen to bother. Just subtract ~5-10% from the corresponding 13th gen part.
They don't have "potential", RPL-R was forced because MTL-S was not ready or feasible. RPL itself wasn't supposed to launch either, only existed because MTL was delayed.
The issue I'm having with these total power consumption charts is that I'm basically looking at the power of the 4090, not the CPU itself.
If it consumes 50w more, is it the cpu doing that or is the 4090 pulling more power because it's less bottlenecked?
DIY desktop gaming processors are a tiny part of Intel's business. Not only that, everyone knew exactly what these processors were going to be for months prior. These reviews absolutely should not come as a surprise.
DIY desktop is a tiny, insignificant market lol
Also, ARL?
Or even, MTL? Client notebooks are pretty high margin, that's miles more of an important segment than DIY desktop is...
I wonder when they're gonna start showing power consumption in GAMING instead of synthetic benchmarks. I really don't care about power consumption during Cinebench.
People are new to this re-release of Intel chips, I guess. Before Ryzen, every year's chips were exactly like this, every single time.
6th gen was a pretty big step up (DDR4), but you ain't wrong. I'd say more like 7th or 8th, when DDR4 prices came down.
2500K to a 2600X in my case, which was 8th gen-ish.
Yeah lol, my i7 3770 (non-K) held out until 2020, when I first stumbled into real CPU limits in games.
My 3570K lasted me till I upgraded to a Ryzen 3600X
I was using a 2600k until a couple of months ago, it worked great until I got a high refresh monitor
I went 2500k to 12th Gen. I wonder when I’ll next need to upgrade
Cries with my i5 2500K.
Lmao, yeah my mate was still running his until a few months ago. Not having hyperthreading really hurt how that chip aged.
2600k was the real i7 upgrade. The 2700k was more expensive for a pointless clock bump.
Ivy Bridge beta
Ivy Bridge
Ivy Bridge 2.0 overclocker's regression edition
Ivy Bridge 2.1 cache and Iris Pro edition™ (you can't have it though)
Skylake carried-by-DDR4 edition
Skylake 1.004
Skylake 1.01 prepare to be cooked edition
You missed Haswell. Skylake was a pretty big jump compared to Ivy Bridge, especially with doubled AVX(2) throughput.
Pentium Age, Core Age: (Bridge Era, Well Era, Lake Era (2015-present)).
> cheaper
Let's calm down, AM5 is only now getting cheaper lol
Add the cooling difference: the 7800X3D works on a $50 Peerless Assassin.
It's a launch that's mostly for OEMs. Calling it 14th gen makes it sound more impressive to the kinds of people who buy Dell desktops at Best Buy.
> This, according to OneRaichu, a reliable source with Intel leaks.
Don't blame me for not putting much weight on "OneRaichu" as a source.
> convention for hardware refreshes afaik
Once upon a time they had the decency to remain on 4xxx with Devil's Crayon.
When in the last decade have they released a refresh with zero IPC uplift?
Very recently. They did exactly that with the lower tier i3 series 12th- and 13th-gen parts. :)
I'm old enough to remember when Rocket Lake was a waste of sand.
Still had a decent IPC uplift, the only real dud was the core regression in the 11900K.
Sorry, next to zero. The Cinebench results showed a pathetic 1-2%; this is a real "why did they bother" generation.
I want to see ubm respond, but I lack the courage to look.
Which is what Intel did with the 8th and 9th gen processors. Just look at other reviews.
Not just Intel, or even CPUs. There have been quite a few GPU generations where the next version was barely more than a re-release.
The old tick-tock release cadence.
No, not every time, sometimes the price went up more. ;)
It is in a way surprising that Intel did this now though, with Ryzen around. It just looks so bad.
This just made me appreciate the 13600K even more, that one (or the 14th gen i5 for the same price) remains great value for mixed usage and gaming at high resolutions. Looks like Zen 5 is going to annihilate Intel though.
Arrow Lake is next year. It should feature new cores and Intel's multi-die packaging with Foveros. Zen 5 will have its work cut out for it, unless Intel messes up.
> Zen 5 will have its work cut out for it

It's strange that you say Zen 5 will have its work cut out for it when Intel are the ones with the recent track record of fumbling the architectural ball. They couldn't get Meteor Lake working on desktop for whatever reason, so they had to scrape the barrel by reliving the good old days of useless +0% IPC, not-even-a-stepping-change rehashes just so they could have a release in 2023. Not to mention it looks like Zen 5 will arrive sooner than Arrow Lake. If anyone has their work cut out for them, it's Intel.
A multi-die approach is very important for cost. Newer nodes are drastically increasing in price, and large-die yields may be a problem. Being able to manufacture only the compute portion of the chip on the bleeding edge, while the less critical parts use more mature nodes, helps with volume and cost. Intel 20A is not library complete: it can't be used for iGPUs or parts such as the memory controller. A 2024 launch of ARL wouldn't be possible if it were still monolithic.
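The yield half of that argument is easy to sketch with the standard Poisson yield approximation. The defect density and die areas below are assumptions for illustration, not figures for any real node:

```python
import math

def die_yield(area_cm2, defects_per_cm2=0.2):
    # Simple Poisson yield model: Y = exp(-D0 * A).
    # Yield falls off exponentially with die area, so shrinking the
    # bleeding-edge portion of the chip pays off disproportionately.
    return math.exp(-defects_per_cm2 * area_cm2)

print(f"monolithic 2.5 cm^2 die:    {die_yield(2.5):.0%} yield")  # ~61%
print(f"0.8 cm^2 compute tile only: {die_yield(0.8):.0%} yield")  # ~85%
```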
Arrow Lake is 20A. Is that not enough of a die shrink over Intel 7?
> Looks like Zen 5 is going to annihilate Intel though.

15th gen was always the one that was going to matter. Zen 5 will be competing with that, not 14th gen.
Zen 5 will compete with both, as it is releasing sooner than 15th gen.
But as usual it won't have 3D cache at launch, so people will bitch and moan about not caring until V-Cache Zen 5 launches in 2025.
It's a valid counterpoint to the people arguing that they didn't raise the price of 14th gen. They kinda did, since water cooling is no longer just recommended; it's virtually required for the 14900K unless you want the same performance as a 13700K. And preferably a 360 or 420 rad. So that needs to be factored into the price/performance, along with the added cost of electricity.
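A rough sketch of the effect on value. Every number below is an assumption for illustration (street prices and average fps vary by region and review), not a quote from the article:

```python
# If a big AIO is effectively mandatory, it belongs in the denominator
# of any price/performance figure.
cpu_price = 589.0   # assumed 14900K street price (USD)
aio_price = 130.0   # assumed 360mm AIO, per the "virtually required" point
avg_fps   = 215.0   # assumed average gaming fps

naive = avg_fps / cpu_price
real  = avg_fps / (cpu_price + aio_price)

print(f"fps per dollar, CPU alone:    {naive:.3f}")
print(f"fps per dollar, CPU + cooler: {real:.3f}")
print(f"effective price/perf hit:     {1 - real / naive:.0%}")
```

Electricity cost would push the effective figure down further still, but that depends heavily on local rates and usage.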
Meanwhile the 7800X3D is happily working with $20–30 coolers. God, we really need Intel to step up their game in some areas; sure, they have really good performance, but the efficiency is terrible.
The 14700K and 14600K were also reviewed, and a drop in power usage would have been quite welcome if it happened.
I think it's mostly to compare with 13th gen, since performance is so close we're just looking at overclocked 13th gen parts.

* Cyberpunk: +2% fps, +3% power
* Last of Us: +1% fps, +6% power
* Star Wars: +4% fps, +5% power

Going backwards.
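"Going backwards" here means fps per watt regressed in every title. A minimal check using only the deltas quoted above:

```python
# (fps delta, power delta) per title; efficiency change is the ratio of
# the two growth factors -- below 1.0 means fps/W got worse.
deltas = {
    "Cyberpunk":  (0.02, 0.03),
    "Last of Us": (0.01, 0.06),
    "Star Wars":  (0.04, 0.05),
}
for game, (fps_up, power_up) in deltas.items():
    eff = (1 + fps_up) / (1 + power_up)
    print(f"{game}: fps/W x{eff:.3f} ({eff - 1:+.1%})")
# All three come out negative (-1.0%, -4.7%, -1.0%).
```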
Intel is having their 11900K moment again. With virtually no gains in a generation, the only moderately interesting part is the 14700K. And with the i5s also not getting any real bumps, they are kind of forfeiting to AMD. It seems like the only lever they can pull is power, and they are ratcheting it up every generation. Why is the 14600K consuming 5% more power than the 7950X3D? In games. I hope Arrow Lake will be better, but this does make me a bit anxious.
>Why is the 14600K consuming 5% more power than the 7950X3D? In games. That's easy. Because it is at a node disadvantage.
I think it's a huge stretch to call this an '11900K moment'. It's a boring refresh/rebrand of existing parts, sure, but Intel's position in the desktop CPU market is nowhere near as dire as it was in the 11th gen days, and 14th gen is not an outright bad product like Rocket Lake was. 13th gen competes just fine with Zen 4, and Zen 5 is probably not coming to desktop for at least another 6 months.
>I think it's a huge stretch to call this an '11900K moment'. It's a boring refresh/rebrand of existing parts, sure, but Intel's position in the desktop CPU market is nowhere near as dire as it was in the 11th gen days

I actually think it's worse, honestly. I know 11th gen wasn't great, but I think a lot of its terribleness is overstated. If I remember correctly, the 11900K was at least a new architecture that gave you access to things like PCIe 4.0 (unless you were upgrading/building new with a 10th gen motherboard). You lost a couple of cores versus the 10900K (8 vs. 10), but the IPC was decently improved, meaning the 11900K has held up a lot better in gaming and was at least close to parity in multi-core productivity apps. The 10900K was basically a 9900K with a couple of extra cores slapped on, and the 9900K was basically an 8700K with a couple of extra cores slapped on. Maybe I'm misremembering that generation, but I think this is closer to a 7700K situation, which is to say basically complete stagnation. As I recall, that CPU was within spitting distance of the 6700K and there was basically zero reason to upgrade, like the situation we have now.
It also shows Intel still does not have DLVR enabled on Raptor Lake Refresh, a feature originally planned for the original Raptor Lake to reduce power consumption. It might be one of those technologies only disclosed in patents that never make it to implementation.
DLVR is for Meteor Lake. They tried to backport it to RPL mobile, but I don't recall anything about desktop.
Intel's datasheet implies both 13th and 14th gen were expected to have it: https://edc.intel.com/content/www/us/en/design/products/platforms/details/raptor-lake-s/13th-generation-core-processors-datasheet-volume-1-of-2/006/power-delivery/
Grabbed a great deal on an 11700KF for $175 about 6 months after release at Micro Center, thanks to the "waste of sand" comments from reviewers. Hoping we see either 14th gen go on fast sales or 13th gen go on deep discounts here.
Yeah, but on the other hand I would rather have Zen 4 at a dirt cheap price after the Zen 5 launch and then upgrade at the end of the AM5 platform.
That's a stretch. 11th gen was a genuine downgrade, not just stagnation: 10 cores down to just 8. The 14700K at least gained 4 E-cores, and all the other parts got a slight clock boost. I don't know why this "gen" is hated so much; it was advertised as a refresh, and it's exactly that. It's also the first time in years Intel has released more than two series of processors for one socket, so it's a small gain for people still on low-end 12th gen CPUs.
> Intel is having their 11900K moment again. Intel's odd-numbered Core i gens were always meh. 1st, 3rd, the nonexistent 5th except for the 5775C, 7th, 9th, and 11th. Even-numbered gens were always a good buy until now. I got a 12900K earlier this year for $280 USD. If 14th gen drops the 12900K further to $250ish it'll be the greatest value from Intel by far.
13th gen was good
Looking at Techpowerup's review, my 13600K for 4K gaming is on average within 5 fps average/min framerates of the fastest systems that cost significantly more. Granted, 4K gaming is very GPU limited but it's nice to see that I didn't make a bad choice going for the 13600K last year when AM5 was excessively expensive.
The point is that the 12700K right now is about $50-100 cheaper than the 13600K and the performance difference [is negligible.](https://www.youtube.com/watch?v=J9tanmFrNgc) People have already forgotten that the 12700K is Intel's cheapest 8 P-core CPU and is a value overclocking monster especially when E-cores are disabled. At current prices I don't think _any_ of the 13th gen CPUs are a good buy because 12th gen has been slashed so much. Even when 14th gen is released, 12th gen will still be the best value buy for those who insist on Intel over AMD. With the exception of the 11900K, Intel's successive CPUs are obviously _"better"_ in a vacuum, but the odd-number generations never provided the uplift that justifies the cost premium.
>they are kind of forfeiting to AMD

I mean, Intel's current 13th gen offerings already compete well against AMD and win for most users in most segments. These new, slightly better 14th gen CPUs coming in at the same prices is not Intel forfeiting anything. AMD's response won't come for another 3 quarters, but Intel's counter-response comes a quarter after that. Overall Intel is doing pretty damn well.
Wow, so the overall performance uplift on the 14900K is margin-of-error level... while power consumption shoots up quite a bit. So I guess it's true, they've gone from "water cooling recommended" to "water cooling required", at least if you want the performance you're paying for with the 14900K. I really think they should have called this "13th gen+" instead of "14th gen". They're devaluing their brand with tricks like this. Maybe that's why they keep rebranding things?
The higher-end 13th gen CPUs were "water cooling required" as well, at least if you wanted to use them at full load.
Yup, managed to hit 380w on my 13900ks yesterday. Going for that 400 lol. Granted I can't cool it for long even with beefy WC.
Next year, when desktop switches to "2nd gen Core Ultra" branding, it'll only make the new branding look that much more impressive.
> So I guess it's true, they've gone from "water cooling recommended" to "water cooling required" FX-9590 moment
lol, it's just a re-badge apart from small tweaks to the 14700K… what a waste of time.
Well, would you look at that: the 14900K is faster in Factorio if you use a big map. 200+W for a whole FPS.
It's game updates per second, not FPS.
That map size and complexity is honestly absurd as well. In the same way the small / default bench is not representative, neither is this alternative benchmark IMO.
Yeah, he should geomean the two imo.
But hey, fast loading means less time at 200+W /s
What a pointless release; why did Intel even bother? Anyone still using 12th gen shouldn't bother upgrading to this and should just wait for Arrow Lake / Zen 5 3D at minimum.
12th gen came out like 2 years ago; why would anyone running that CPU even be looking to upgrade? Am I missing something?
Sometimes people like to upgrade, or just build PCs. Doesn't have to be a strict budget thing.
I thought PCs were expensive and then I got a car... now I understand people who get the best GPU every generation a bit more
lol dude 1000%. As hobbies go, PC building is on the tame side of the budget.
You could get a really mean gaming rig for entry-level Warhammer money.
CPUs tend to have minimal gains compared to GPU upgrades though.
If you are using Intel's 12th Gen, there's no reason to upgrade to anything, unless you have a lower-core part. The IPC and general performance of 12th Gen is still amazing. This is more for people jumping from AM4 and older Intel gens, if the price is right. But I'd still get AM5 parts, lol.
>This is more for people jumping from AM4 and older Intel gens, if the price is right.

I'm not sure who in their right mind is bothering to upgrade off AM4 unless they're on a low-end part. It's pretty cheap to just pay for a second-hand 5800X3D or 5900X instead of the cost of a whole new platform, and unless you're running a 4090, the performance difference is maybe a few % in a specific list of highly CPU-intensive games.
From a 12700K you'd basically be moving from an easily air-cooled part to a hotter, less efficient one. Would not upgrade.
Yeah I’d say it’s only worth upgrading from a 12100f or 12400f, which to be fair are two very common parts
It's for people like me who are still on the 8700K, lol, so yeah.
If you have a lower-core part, you most likely need to upgrade the PSU and CPU cooler to go with a higher-core one. At that point, the cost argument goes out the window.
So PC manufacturers can slap a 14-series sticker on their products.
And Zen 5 is early next year. Intel is going to have to live with this until the end of 2024 with Arrow Lake. Seems like a repeat of the whole Rocket Lake and Alder Lake situation.
> What a pointless release; why did Intel even bother?

Because Intel. This is what they do.
So they didn't need to lower their prices, and so all the motherboard manufacturers could sell more new boards at higher prices.
Probably contracts with OEMs.
I'm on a 7700K and planning to upgrade this fall. 14th gen is a welcome slight improvement over the 13th gen. Why not release a small refinement when they easily can?
I just want to appreciate the YouTube thumbnail https://i.imgur.com/f5NoIqu.jpg
It's recycled, just like the 13900K one.
Does anyone know why he had to re-upload this?
Accidentally uploaded the 13th gen review.
Is this why the thumbnail changed?
Ahh. Thanks!
The only positive I got out of this 14th gen Intel review was the competitive nature of the 5800X3D. So it's still the same upgrade path that gamers should follow. Buy something like a Ryzen 7600 or a 7700, and wait for the last 3D V-Cache chips on the AM5 platform to release in a few years.
Was waiting to see how these shook out and I've now confirmed I'll be picking up a 7800X3D on Black Friday to pair with my 7900 XTX. The little guy will be getting the i5 12400F combo I've been using as a placeholder.
Total system power is a very bad way to compare CPU power efficiency. When the CPU is the bottleneck, the GPU works less, which lowers the power usage shown for it overall. A case in point is the 11900K, which shows less power usage even though we know it uses a whole lot more than, say, the 5800X3D or 7600X: the 11900K severely bottlenecks the 4090, which results in way less power usage overall.

What I am trying to say is that a fully loaded 100W chip A can make the 4090 work at 200W (300W total) and show 100 fps, while another 100W chip B can make the 4090 work at 350W (450W total) and show 130 fps. You would think chip A is the efficient one, at 100 fps on 300W versus 130 fps on 450W for chip B, but in reality chip B is producing more frames from the same 100W of CPU power.
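Running the two hypothetical chips from that comment through the numbers shows how the same data supports opposite conclusions depending on the denominator:

```python
# Identical 100W CPU power, but chip B keeps the GPU busier, so system
# power diverges while CPU-only efficiency favors B.
chips = {
    "A": {"cpu_w": 100, "gpu_w": 200, "fps": 100},
    "B": {"cpu_w": 100, "gpu_w": 350, "fps": 130},
}
for name, c in chips.items():
    system_w = c["cpu_w"] + c["gpu_w"]
    print(f"Chip {name}: {c['fps'] / system_w:.3f} fps/W system, "
          f"{c['fps'] / c['cpu_w']:.2f} fps/W CPU-only")
# System fps/W: A wins (0.333 vs 0.289).
# CPU-only fps/W: B wins (1.00 vs 1.30).
```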
This is true, but I prefer total system power draw. It helps with choosing a power supply when building, and it accounts for the fact that a higher-performing CPU, while more efficient in isolation, has a knock-on effect on other components. This is just another real-world vs isolated-component situation, and for this scenario I prefer real-world style testing.
It's not real world. The 4090 is underutilized at 1080p, and once you change the resolution to 4K, its power usage will increase a lot while the CPU power usage goes down. You want to look at the GPU charts at max usage if you are using them to pick a PSU.
That does not make sense, as not all CPUs will be paired with a 4090.
Yeah, what if they were paired with a 7900XTX /s
It is a situation where you cannot control all the variables, because the more work the CPU can do, the more work the GPU has, resulting in higher GPU power draw and higher full-system power. If you isolate the CPU and say CPU A at 100W does 100 fps and CPU B at 100W does 150 fps, while ignoring the impact that has on GPU power draw, you are not telling the entire story. Also, that 50% uplift in my example is only valid with the test GPU; any other GPU might give different FPS results depending on how high utilisation is. It seems more like a presentation issue tbh; perhaps a stacked bar for CPU / GPU / rest of system would show it better, along with an fps/W metric to more easily rank the efficiency of the test system.
Decent enough point, but I see no problem with using real world metrics over artificially isolating components. Both have their merits. No need to make Intel look even worse in this regard.
They are playing at 1080p with a 4090. It's not real world.
As always, if you want good analysis, check another reviewer like GamersNexus, who covers rail-level power and power efficiency for completing a task. HUB aren't very good at this.
The only positive thing is that this runs cooler than the 13900K (according to TPU), so you can buy it with the best air cooling and keep high frequencies at high temperatures; it won't throttle.

>Interestingly the 14900K runs considerably cooler than our 13900K. I confirmed that they both sit at the 253 W power limit in the Blender test, which should result in very similar heat output. Not sure what's happening here, maybe the IHS contact quality is different or the accuracy of the CPU's own power sensor varies. It is expected that the 13900KS runs warmer, because it has a 350 W power limit (vs 253 W on the 14900K).
It does terribly in this video though.
> so you can buy it with the best air cooling and keep high frequencies at high temperatures; it won't throttle

That's always been true, *if* you actually enforce power limits.
Very much expected. I would just like to see a different kind of efficiency benchmark: power used at locked FPS. So if you lock the fps to 144, what is the CPU power use?
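A sketch of what that metric would look like, with made-up wattages (the CPU names and readings are placeholders, not measurements): cap the game at a fixed frame rate and compare package power, which is equivalent to comparing energy per frame.

```python
# At a locked frame rate, watts divided by frames/second gives joules
# per frame -- a clean efficiency number independent of fps differences.
FPS_CAP = 144
package_watts = {"CPU X": 68.0, "CPU Y": 52.0}  # assumed readings at the cap

for cpu, watts in package_watts.items():
    mj_per_frame = watts / FPS_CAP * 1000  # W / (frames/s) = J/frame
    print(f"{cpu}: {watts:.0f} W at {FPS_CAP} fps = {mj_per_frame:.0f} mJ/frame")
```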
Shocking news - the 14k series is, indeed, a refresh 🤯
Rocket Lake relaunch? No, seriously, what is Intel's strategy behind this? I know a refresh keeps prices high and is maybe good for your margins, but at the same time they're just damaging their brand image more and more. Yes, the lower SKUs are decent in terms of perf/watt, but why didn't they drop prices... ugh, what do I know, I'm just a consumer.
OEMs want bigger number for new box
They also want to stop support of 600 and 700 series motherboards and launch their new "Max" 700 series boards at higher prices with the same features.
> They also want to stop support of 600 and 700 series motherboards What? They've all been updated to support 14th gen CPUs
> I know a refresh keeps prices high

There's your only needed explanation. They rebranded their 13th gen so they could keep prices up, and also put 13th gen on "sale" to hit lower parts of the market as well.
I'm confused, why would they not include 12th gen in their gaming benchmarks? Just that "IPC" test where every chip is clocked to 5GHz.
Because every extra CPU tested is ~36 more benchmark passes you have to do, and that adds up very quickly. There's not enough difference between 12th and 13th gen to bother; just subtract ~5-10% from the corresponding 13th gen part.
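One plausible way the ~36-passes figure falls out (an assumption about the methodology, not something stated in the review):

```python
# A dozen games, three runs each, per CPU added to the chart.
games, runs_per_game = 12, 3
print(f"{games} games x {runs_per_game} runs = "
      f"{games * runs_per_game} passes per extra CPU")
```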
Ahh, Intel going back to its roots of releasing the same CPUs over and over again. All that potential to advance, yet they choose to be greedy businessmen.
They don't have "potential" here; RPL-R was forced because MTL-S was not ready or feasible. RPL itself wasn't supposed to launch either, and only existed because MTL was delayed.
AMD saved us from these pieces of shit; they have always done this.
AMD’s plenty guilty of rebrands as well (see the Zen 2 Ryzen 7000 mobile chips, B350 to B450, RX 480 to RX 580, and so on)
The issue I'm having with these total power consumption charts is that I'm basically looking at the power of the 4090, not the CPU itself. If it consumes 50W more, is that the CPU, or is the 4090 pulling more power because it's less bottlenecked?
I don't know how INTC isn't tanking this morning. This coming year is going to be more market share losses at the high end after Zen 5 is released.
DIY desktop gaming processors are a tiny part of Intel's business. Not only that, everyone knew exactly what these processors were going to be for months prior. These reviews absolutely should not come as a surprise.
> tfw only
> 80% market share
> why live
DIY desktop is a tiny, insignificant market lol. Also, what about ARL? Or even MTL? Client notebooks are pretty high margin; that's a far more important segment than DIY desktop is...
I wonder when they're gonna start showing power consumption in GAMING instead of synthetic benchmarks. I really don't care about power consumption during Cinebench.
Moreover, they reused the old 13900K preview image: [https://postimg.cc/HVffXfRb](https://postimg.cc/HVffXfRb)
Please let me know I made a good choice going from a 7700K to a 14700K haha