
StrikerX1360

https://i.redd.it/7kjzvyt0ue8d1.gif


Nuze_YT

Yeah, it's crazy that the company that they bootlick to hell and back banned them from their subreddit


SamsquanchOfficial

LMAO professional clowns


hiddenhero94

why does userbenchmark hate AMD so much anyways? are they paid off by intel or something?!?


buff-equations

Intel publicly denounced them, so not sure


Ozok123

Why? It's always a good laugh.


Ahielia

Because there are always newbies thinking it's useful or good. It's neither useful nor good.


ehhdjdmebshsmajsjssn

I use pcgamebenchmark. It shows what percentage of games you can run with your specs. There are also graphs and stuff.


NatoBoram

Oh, I was looking for an alternative to Can You Run It that also worked on Linux since these clowns seem incapable of making a simple binary that's cross-platform


The_Jyps

Can we not forget that it's genuinely the best way to compare hard data between cards of the same brand? Nvidia vs Nvidia comparisons aren't as easily digestible anywhere else. Never use it to compare AMD to Nvidia though; they seem to hate AMD and nerf their results.


Marty5020

I hate to agree, but agreed. It used to have that one useful aspect for me, which is the percentage of relative performance to similar counterparts. Not that it matters anymore now that it's been paywalled.


buff-equations

Tom's Hardware maintains a good hierarchy with proportion-based scoring, making it incredibly easy to compare any GPUs, even across brands.


The_Jyps

If that's true I will switch.


buff-equations

There’s one for CPUs and one for GPUs. I think they only keep 3 generations, but it’s still a really good resource. I would only say it’s a little less digestible because they’re all scaled to whatever the best card is at the moment. You just need to compare the growth percentage: 100(A-B)/B = how much better A is compared to B.
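The growth-percentage formula above can be sketched in a couple of lines of Python — the scores here are made-up placeholders, not values from any real hierarchy chart:

```python
def relative_uplift(score_a: float, score_b: float) -> float:
    """How much faster A is than B, in percent: 100 * (A - B) / B."""
    return 100 * (score_a - score_b) / score_b

# Hypothetical normalized scores (each as percent of the current top card)
gpu_a, gpu_b = 78.0, 60.0
print(f"A is {relative_uplift(gpu_a, gpu_b):.0f}% faster than B")  # A is 30% faster than B
```

Note the asymmetry: the result depends on which card you treat as the baseline B, so a 30% uplift one way is only about a 23% deficit the other way.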


StrikerX1360

[Passmark for CPUs](https://www.cpubenchmark.net/singleCompare.php) [TechPowerup for GPUs](https://www.techpowerup.com/gpu-specs/)


svenger-hunter-gomez

why :D is it a meme here? :D


StrikerX1360

No, because it is genuinely strewn with misinformation and bias, and is a highly unreliable source of information.


svenger-hunter-gomez

i see. lol never knew that, will keep in mind


StrikerX1360

Would highly recommend watching the [Hardware Unboxed](https://youtu.be/AaWZKPUidUY?si=N7Qh-1AN8s50DUlk) video on it, or just looking up pretty much any review UB has done on a Ryzen processor. They don't even hide their bias and their product descriptions read like a 12 year old fanboy's hatemail


TakeyaSaito

Because it's crap.


NeverEndingWalker64

Bruh he just asked why you gotta downvote him


svenger-hunter-gomez

yes, I'm getting slaughtered. I have some replies here where people don't like it and brutally downvote it, even though I was just saying what my intention was or asking something lol. but it's ok for me


no6969el

Yeah, and then your question gets hidden and no one sees it to answer unless they click it. So someone else will probably ask the same question again, repeating the process.


redmainefuckye

Well DUH. HOW ELSE AM I SUPPOSED TO FEEL IMPORTANT AND LIKE MY OPINION MATTERS?!?


persondude27

The 13900k and 14900k being basically identical [was a major criticism of the 14th gen](https://youtu.be/2MvvCr-thM8?t=1004). All of the reviews said 14th gen was a shameless money grab, since each chip was maybe 2-3% faster than its 13th gen counterpart, and only under the best circumstances.

Thermal throttling affecting performance is the #1 complaint for both 13th and 14th gen (12th gen too, but it wasn't as bad). The conclusion is you basically want to throw the biggest water cooler you can on there - up to and including 420 mm if it's supported - because the 14900k can draw *350+ watts* in artificial benchmarks. The good news is that gaming tends to use about half that (150-180 watts), so your CPU cooler is probably fine.

TL;DR: yep, that's how the 14900k works. It's a 13900k + 1-2% if your cooler can support it. If you want a few more points on your benchmarks, I suppose you could get an Arctic Liquid Freezer III 360.


Z-Frost

350 watts on a cpu is insane


bbpsword

It's high on a GPU, let alone a CPU.


Whats_logout

My i7 14700f uses more power than my 7900 xt lol


postvolta

Meanwhile my 7800x3d sipping


maaaaarcus

I am so glad I picked 7800x3d over 14700k


ArtFart124

It's basically the only logical choice (in that price bracket) for someone who wants to have a PC for gaming/general use.


EastLimp1693

My 10900k did 300 watt at 5.0 allcore


mister2forme

The sad part is he didn't want to upgrade his board, but could have gotten a 7800x3d and motherboard for about the same price. Though he may have needed memory too. At least then he would have seen a gaming increase.


turtleship_2006

So if I have a 10th gen CPU right now and want to upgrade (I need to do it within a few months for uni, I have an i3 but the [software](https://www.sidefx.com/products/houdini/) they use requires at least an i7 apparently) which generation should I go for? I can upgrade the motherboard if it's worth it.


reddit_pengwin

>the software they use requires at least an i7 apparently

That's just nonsensical: i7s used to be the same across generations between the 2000-7000 series, 4 cores with hyperthreading. But since the 8000 series the core and thread counts have been increasing with most generations. Your 10th gen i3 has the same core config as a 2nd-7th gen i7, for example.

IMHO test it with your current i3 - after that you can still try an in-socket upgrade to a used 10th/11th gen i7 on the cheap. Intel 10th and 11th gen are basically the same, 12th gen was new, 13th gen was merely a refinement of 12th, and 14th gen was the same as 13th. 12th-14th gen use the same socket, but it is already deprecated - 15th gen is getting a new socket.

BTW the software's own system requirements page does not list an i7 or any core count, or even Intel as a requirement. I can't find any benchmarks, but Houdini seems to be able to use as many cores/threads as you can throw at it while being less frequency dependent, so you might be better off with an AMD system anyway.


turtleship_2006

Tbh that's just what one of the guys in the uni's Discord server recommended, and I assumed they meant a recent i7.

>after that you can still try an in socket upgrade to a 10k-11k series used i7 on the cheap.

The cheapest ones I can find after a quick search are around 200 quid, but I don't want to buy one just to find out I need to buy another one.


morriscey

You'll be fine. a 10th gen i7 will do everything you need it to, just a little slower than the newest gen.


Real-Human-1985

change to AMD.


StomachosusCaelum

>because the 14900k can draw *350+ watts* in artificial benchmarks

Not with the PL2 limit at Intel's stock settings, it won't.


Hattix

Therein lies much of the problem. What are Intel's stock settings? Where do we find them? The motherboard defaults are not Intel's stock, the UEFI doesn't show what they are and they're not shown on ARK. (For anyone wondering, Intel's spec for a 14900K, PL1 is 253 watts, PL2 is 253 watts, tau is infinite, and current limit is 307 amps)


FatBrookie

Every mobo can enforce the limits with a simple setting, even before the almighty fix they dropped. At least on Asus it's called "Disable - Enforce all limits" and has been there since the start of the 13th gen CPUs.


StomachosusCaelum

This. And the default specs are definitely on ARK. (Right there "maximum turbo power" - 253W)


YCCprayforme

I have an ak620 air cooler on my 13700k, wanted to try it out instead of a 360 AIO. Haven’t had any thermal throttling issue at all, also have not run cinebench but i will test it out


Appropriate-Oddity11

24 vs 16 cores


zcomputerwiz

A larger cooler won't help much as far as performance goes. Generally undervolting is required as well. They can't move the heat out of the silicon fast enough, so the issue will be throttling under load.


svenger-hunter-gomez

I'm so shocked at Intel rn :( and it's also crazy how I could not find this when I did research on the 14900k lol. well, I knew there were some heat issues, but damn son, that's straight up unfair. I could have just bought a 14700k then, paid like 200€ less and had the same performance, no?


peacedetski

>i could have just bought a 14700k

For gaming, absolutely. Or a 14600K, or a 13700K, or even better, a 7800x3d. Really, unless you have a 4090 and/or play competitive shooters at 1080p low details, you'll nearly always be limited by the GPU with any above-midrange CPU.


Noreng

It's more like once you've reached the "necessary" amount of cores for gaming (currently 6 for the most part), the only improvement you can get is having those cores run faster. The clock speed difference between a 14600K and a 14900K is small in the larger picture.


grandmapilot

Cache, to feed the cores. It's somehow often overlooked.


Noreng

Cache is definitely a way to make cores faster


seraph321

Yep, that’s why I went with the 13700k and stuck with my board and DDR4 RAM; no need for more than that for a while imo. I expect I’ll be able to easily harness the power of a 5080 for 4k gaming and still not worry about being CPU limited.


Appropriate-Oddity11

Where did you even find this info? The 7800x3d beats the 14900ks in gaming while being cheaper than the 14700k and cooler than the 13600k.


Wada_tah

I'm super interested to see where you researched that chip without seeing any mention of (at MINIMUM) the no/minimal performance gain over 13th gen. All reputable reviewers have been discussing this. To your second point: for the price of the 14900k you should be able to get a 7800x3d + AM5 mobo, for better performance and considerably less power and heat.


Maethor_derien

This is literally not the first time they have pulled the exact same thing; they've done this like 3 or 4 times now. Pretty much you can expect the in-between generations from them to have almost no large performance boosts. If it uses the same socket, don't expect more than about a 5-10% performance boost out of it at best.


scheurneus

Eh, going from 12th to 13th gen has some benefits: a bit more cache, and a significant increase in E-core count which is useful if you do things that benefit from them. Gaming doesn't, but a lot of other CPU-intensive tasks do benefit quite a bit. For example, the i5-12500 has 6+0 cores compared to 6+8 on the 13500. While the cache benefits only apply to the 13600K and up, the 8 extra E-cores mean that when they can be used you get a large performance uplift.


Maethor_derien

The thing is that the 12500 was really the only exception in that generation; most of the others didn't have nearly that big a jump in performance. The other processors only got 4 more E-cores, not 8. The E-cores are not multithreaded, so the extra 4 cores that something like a 13700 or 13600 got were not nearly as impactful. Anyone who was regularly using the type of programs that would actually take advantage of the E-cores was also buying something much higher end than the 12500. Really the big benefit of the 13500 was that the E-cores massively improved power consumption on easy tasks like web browsing.


Effective_Secretary6

Yes, and I really don’t know why you're getting downvoted on this for no reason :(


riba2233

You didn't watch basic benchmarks before spending 600 dollars, that is 100% on you


peacedetski

Bro, 14900K is **literally the same CPU** as 13900K, just with 3% higher boost clock. And they're both heavily power-constrained (especially after the latest BIOS updates limiting the power), so they will only run as fast as your motherboard, CPU cooler and the power envelope in BIOS will allow. Try undervolting if you want better performance.


elliotborst

Don’t use userbenchmark it’s garbage


KernelPanic-42

The 14900k isn’t next gen, it’s the same gen. They’re literally the exact same CPU. Also, gtfo of here with userbenchmark.


NoStructure5034

Lowkey thought it was bait at first, especially with the title being "14900K performs like 13900K." I was like, "yeah, we've known that for months."


romulof

You should have gotten a Ryzen 7800x3d for that price.


smithsp86

But why would anyone want a more efficient and powerful chip for less money?


romulof

Fancy blue box?


DynamicHunter

Productivity Intel still wins. But for gaming it’s a no brainer for 7800X3D


MetalGearHawk

Do some research next time before spending


ademayor

He did, he went to userbenchmarks to hear “AMD bad”


ADeadlyFerret

Hey he didn't listen to youtubers though. You know because they just shill for companies lol.


FatBrookie

Because it's the same CPU. What did you expect?


ssuper2k

Looks like your searching skills are lacking. 13900k/14900k overheating and extreme power use is about everywhere. The 13900k was already pushed hard; the 14900k is the same chip, just pushed even harder.

For that CPU, your CPU cooler lacks, your mobo lacks. For just gaming, the best CPU is the 7800x3D.


sharknice

Did you seriously spend $600 to upgrade to a 2% faster CPU? CPU lottery and cooler seating make more of a difference. And even then, you are most likely already bottlenecked by your GPU in nearly every game. I hope you're trolling.


Beautiful-Musk-Ox

> I lately upgraded from 12400f to 14900k

They upgraded from a 12400f, not a 13900k.


svenger-hunter-gomez

yes, and sadly I'm not :S


DynamicHunter

Let this be a lesson to do minimal research on big purchases


vesElectricEyes

Maybe it's not a big purchase for OP?🤔


Eazy12345678

The i9 12900, i9 13900 and i9 14900 will all perform similarly. They all use the same socket, same constraints. There is this thing called the silicon lottery: you could get a really good i9 12900 that outperforms an i9 14900. An i9 needs water cooling.


I-LOVE-TURTLES666

Yep I hit the lottery on a 13900ks. Binned 4 14900ks when it came out and they couldn’t match the 13.


svenger-hunter-gomez

damn, if I'd only known this I would have bought a 13900k instead lol. I hate water cooling tho, but maybe I will set one up one day, one big AIO; but if you say it can be normal I will just leave it like this.

so far my PC is overscaled anyway. I have a 14900k and I game like 1hr per week, mostly just desktop and browsing, but the shock is big. my whole life I was an Intel fanboy but rn I feel betrayed like never before. it was my first time buying an i9 and I guess also the last time for me lol. Or is this same shit for AMD CPUs too?


TressymDude

I mean, a 7800X3D can easily be air-cooled and matches a 14900K in gaming; it just falls behind in a lot of productivity workloads. Silicon lottery is everywhere, sometimes you get happy, sometimes you get sad.


LJBrooker

I'd say "quite handily beats the 14900k in gaming".


TressymDude

Handily? I’m not an Intel fanboy, but even I know they match performance in modern triple-A games. In some games the sheer multicore performance favors the 14900k; in some games the L3 cache is insane for the 7800X3D. Now, in terms of *efficiency*, the best CPU right now is the 7800X3D hands down. The 14900k lights on fire while the 7800X3D draws nothing in comparison.


LJBrooker

Likewise not a fanboy; I have jumped ship more times than I care to count. Wherever the performance or value is (which of the two is more important has varied throughout my life too) is where I've always gone, and typically that's been Intel for the most part, but it just makes no sense to me the last few years.

It's fine margins I guess, but the 78x3d is about 10% ahead with the i9 running on its new Intel performance profile, and 6% on the 253w extreme one (and I can't stress enough, that's 253w for lord's sake).

Then there's extreme cases. Jedi Survivor: 18% faster on the 78x3d. BG3: 18%. Hogwarts: 12%. You'll see the odd game that's 5% the other way, but seldom the same sort of gap, and not nearly so often.

I'd say 6-10% at the top-end halo tier constitutes "handily beating". Yes, you won't get that in every game, and yes there are some that favour Intel, but do an average across a lot of titles and that's the ballpark result most outlets come up with.


TressymDude

Wow, haven’t seen the performance on recent games and yeah, 7800x3D is 5% faster on average… Why the hell do people keep buying 14900k? It’s more expensive, draws a ton more power, and is slower…


LJBrooker

I have no idea. The 78x3d has gotten better with time and BIOS updates. The 14900k has literally gotten worse now that its power constraints have been reined in a touch. But people just think more number = better.

If you're working on the thing, there might be a case for it - Adobe everything springs to mind. For everything else there's the 7950x, and for gaming the 78x3d. And yet every day this sub is full of people buying the 14900k for more money, on a more expensive, dead-end platform, and paying 200 dollars on AIOs for it. When questioned, they say they want to play League or something. 🤷 Makes not a jot of sense, but fools and their money are quickly parted.


NoStructure5034

Why do people keep buying the 14900K?

1. Many people see "i9 vs R7" and go "ah, yes, bigger number better"
2. People like OP are tricked by UserBenchmark's unhinged ranting about "AmD sHiLlZ" and biased benchmark representations
3. They just do no research at all and buy the newest Intel chip


cowoftheuniverse

> Or is this same shit for AMD cpus too?

It was kind of the same for a long time, but AMD now has 2 gaming-focused special products, the 5800x3d and 7800x3d. They have just enough cores for gaming, but not too many and not enough to be premium workload CPUs. What happened previously is that if you wanted the fastest chip, you needed to buy the chip with the most cores, which was also workload-focused. Mind you, if you stepped down from say a 5950x to a 5800x (half the cores) you got almost the same perf in gaming, but the absolute fastest AMD had before 3D was the 5950 by a hair. If you had a 14600k you would lose only a little gaming performance but a lot of workload performance.

Also, the other poster suggesting the silicon lottery could have a 12900 beating a 13900/14900 is just wrong. The 12900 is the slowest, assuming all are working chips.


Slazagna

An i7 won't bottleneck a 4090. A poorly performing i9 won't bottleneck anything. I don't know why people buy i9s for gaming and I certainly don't know why people care about benchmarks when nothing will push it that hard in reality.


[deleted]

[deleted]


Slazagna

Ok, a recent gen i7 then smart ass. ;p


DynamicHunter

This is why you can’t just say i7 smart ass :p


iliketurtles50000

I have a friend who wants to do that pairing. For the shits and giggles of course, he wants to cram it in a dell optiplex using a dremel and hopes and dreams


Local_Trade5404

I have some parts for your friend then, at a good price :P I just switched from an i7 4790k to a 7800x3d this year


DynamicHunter

lol are you me? Did the exact same because that CPU was struggling in Helldivers


Local_Trade5404

nah, my problem was with WoW. funny enough, it was mostly bad optimization of skills on specific bosses, and I could wait for the next gen of Intel/AMD before making a decision :P nothing lost tho, I'm mostly ok with AMD, only the temps are killing me


ThatWasNotWise

You don't watch GN do ya?


livelivinglived

> I did not listen youtube influencers. For me its all corporated shilling and fudding of amd and nvidia products i mean each say different also they talk too much mostly :D

> i mostly just looked at raw benchmarks and stuff, and it was a child dream to have once the best intel cpu there is :D (my inner child just got rekt after understanding its a paperbox tho lol) but selling that cpu will not be an option tho, i will keep this build now until it dies lol

Meanwhile he went off of info from UserBenchmark to justify this purchase.


LightBluepono

Sweet irony.


Ok_Cut_5180

Ever Lost the Silicon lottery?


svenger-hunter-gomez

![gif](giphy|7d9ny05QfucjcHDnsM|downsized)


Appropriate-Oddity11

The silicon lottery means either nothing or everything for that CPU, depending on how you see it. The 14900k is a 13900k overclocked 200MHz, so you could feasibly get 14900k perf anyway.


xenocea

The first mistake you made was thinking the 14900k was much faster than the 13900k. The second mistake was using UserBenchmark.


nhc150

That Noctua air cooler is doing you no favors. Ignore benchmark FOMO if you're happy with gaming performance.


Link_0610

What are you mainly doing with the PC? If gaming, try to return the CPU. For 600€ you can upgrade to a 7800x3d, an AM5 mobo and 32GB of DDR5 RAM.


jaegren

That's what you get for supporting Intel and UBMs bullshit.


FallNice3836

I’d water cool it and let it ramp up the power bill. I know there are tweaks, but I wouldn’t run an i9 on anything but water personally.


Appropriate-Oddity11

360 or 420


ghoxen

Good thing I'm still sitting on 12th gen


NoStructure5034

12700K gang!


Beautiful-Musk-Ox

i'm curious hows the arc for you? any problems in the games you play?


NoStructure5034

Works well for what I play, but I don't really push it by playing demanding games. I play Valorant and Minecraft at 1440p, and pretty much no other games.


AconexOfficial

same. 12700 is still a beast with its 8 performance cores


georgewesker97

Bro even my 12600k eats EVERYTHING i throw at it, and besides gaming I do software development.


joodontknowme

LoL all this crap and still not faster than AMD...


EMB_pilot

I heard the 14900K pretty much needs to be delidded and liquid-metal-cooled to get normal temps.


StomachosusCaelum

Or just enable the default power settings so it won't try to draw an extra 200W for 4% more performance.


Appropriate-Oddity11

4% less perf is less than the 13900k lmfao


ssuper2k

14900k = 13900k, pre-OC'ed. If temps allow, maybe you get an extra 100-200 MHz (more watts too); if not, you get the same or worse.


MyPokemonRedName

The title of this post might as well be 13900K performs like 13900k. Anyone who had any passing interest in tech YouTube in the last year knows that 14th gen is basically re-badged 13th gen.


PrenupCleanup

Dude was focusing on a gazillion of brand labels and tech names and a bunch of pc related mumbo jumbo (as evidenced by the post itself - ROG SHMOG DH CBCPU bullshit), and forgot to actually do proper research before buying.


gfy_expert

Hate AMD for 15-20% IPC uplifts every 2 years on same motherboards


Real-Human-1985

AMD so baaaaaaaaad so eeeeeeevil.


clotteryputtonous

When I had the same issue with the 10700k and 11700k, that’s when I switched to AMD. Intel lost its touch with their consumer based CPUs in favor of their workstation and server clients. I hope one day a new CEO is able to turn the tables at intel.


bergenus

This is pretty much my story. I used Intel all my life. The 14 series was so poor (haven't tried a 13 series) that I went AMD for the first time. I've never had remotely as many hardware issues as when trying to get the 14600 and 14700 to work... I honestly think they shouldn't have been released. I can't believe how much better my 7900 is in every single way.


Mekemu

Oh no! I got results that I could have easily researched in advance! 600 USD poorer and still no smarter.


iamnotyourspiderman

Even my 13700k heats up like hell with an NH-D15. These gens of processors get crazy hot. I’ve resorted to undervolting and underclocking mine a bit to have it working without any throttling issues. For 1440p gaming it has been just fine with those tweaks. Not really keen on water cooling it, but maybe one day; at least there is room to play then. Still, I am amazed and kind of disappointed at this development, coming from a 6600k which I could push and overclock like crazy for almost a decade until I upgraded last winter.


Civil_Excitement_747

As someone who also has a 14900k, I know it performs pretty much identically to the 13th gen, but it was actually cheaper than the 13900k when I was buying, so I went with it instead. Also, UserBenchmark is trash. P.S. You definitely need a better cooler; these CPUs get insanely hot because of how much power they draw.


ArtFart124

It's up to you, but in my personal opinion you should return the CPU and instead buy a new motherboard and CPU for the same price, like a 7800X3D, or stick with Intel but with a new board. I don't really understand how you didn't read ANY of the reviews, which all stated the biggest flaw of that CPU is its lacklustre gain in perf over the previous gen, but hey, we all make mistakes occasionally. Maybe in the future do some more rigorous research before buying!


LightBluepono

User bench mark lol


lndig0__

14900k is just the price-cut version of the 13900KS.


Psych_out06

Did you read up on the chip before buying it? It's barely different than the 13900k in real-world usage. However, that's not a bad thing, because the 13900 IS A BEAST. Stop whining and go enjoy your PC


svenger-hunter-gomez

thank you for this comment, at its core this is the ultimate truth for sure. so far I love my 14900k, every game runs good, everything stable and chill. I was just interested in this bench topic and this "get what I paid for" topic >D


Psych_out06

Everything is worth exactly what you're willing to pay for it. No more, no less. 😁


InterstellarReddit

My bro, one generation apart on the same CPU is max a 5% performance improvement.


Appropriate-Oddity11

realistically 1-2%.


InterstellarReddit

Unless it’s a new socket then I’ve seen some good numbers.


apachelives

AMD knows how to give good gains between generations - not all the time, and that could just mean previous gens were a little behind, etc. Intel, otherwise, yeah - we are talking about a company that gave us 10+ generations of quad-core CPUs, including 7-8 quad-core i7 models.


Gippy_

> we are talking about a company that gave us 10+ generations of quad core CPU's including 7-8 quad core i7 models Intel did have 6-core and 8-core CPUs. They were just paywalled under the more expensive "HEDT" platforms. But why buy a 6-core i7-3930K (yes, this is Sandy Bridge) when a 4-core i7-2600K could overclock a bit higher and perform as well in games overall for half the price? All Intel did was realize the HEDT marketing didn't work anymore in 2018 and created the i9. They slowly transitioned out of HEDT by first giving the mainstream platform more cores, then discontinued HEDT altogether: X299 was the last HEDT platform. And it worked because now many gamers have been conditioned to buy i9s for bragging rights.


I9Qnl

This is mostly just down to naming: Intel prefers to call their refreshes a new generation, while AMD keeps trickling new variants into the market within the same generation. 14th gen is a refresh of 13th gen; it's more or less the same as AMD's XT and disabled-iGPU refreshes, as well as all the other weird releases like the GT and F, and that one-off 4000 series refresh which was worse than the 3000 series.

AMD just announced the 5800XT and 5900XT for AM4. They're literally the exact same chips as the 5800X and 5950X but will be sold at a higher price; they're serving the same purpose 14th gen did, selling the same parts for a higher price.

I like neither of them. AMD builds a pattern just to break it later in anti-consumer fashion: the Ryzen 7 5700 having less cache than the 5700X and no PCIe 4.0 support is a scummy attempt to prey on people thinking the non-X version is just a slightly underclocked version of the same processor. The 5700 performs worse than a 5600, and of course the entire 4000 series is a joke.


Real-Human-1985

on intel maybe.


Ziggy_Zigg

Intel is like Apple. You gotta upgrade every 2-3 Generations to see improvements


Han_5olo

Did you install the 5/31 bios update already?


svenger-hunter-gomez

I don't like BIOS updates, but I think I might already have this one installed, dunno. what will it change? how can it help lol


Han_5olo

With the latest bios update (3212) i mentioned, you get the option to apply the recommended intel default settings for your 14900k. Bios>Extreme Tweaker>Performance preference. Good luck


svenger-hunter-gomez

I just checked, my BIOS is 3010, so it's like half a year old. I will write that down and try it, thank you very much sir, will update with what it changed


Appropriate-Oddity11

What the fuck did you think would happen? 13th and 14th gen differ by 200MHz in clock speed.


drumpad322

Upgrading just 2 generations is crazy for me


Sterrenstoof

You know what's insane? Selling a CPU so unstable that it causes crashes in DX12 games, specifically Unreal, that they've had to force motherboard vendors to push out an Intel Baseline Profile... while they are probably the ones that told those vendors to push their CPUs as hard as they can, just to try and beat the X3D series from AMD in gaming performance.

The power draw is stupid too. I am selling my rig next year just to swap back to AMD's 9000 series. not hating on Intel, but things there really haven't advanced much in the past few years lol.


Darth_Murcielago

oh, you got an Intel CPU and trusted UserBenchmark? a certified smh my head moment. here, have a gif of a pallas cat to lessen the pain of wasting 600€ ![gif](giphy|13i0KOcbP7af16|downsized) (btw if you use your PC mainly for gaming, the AMD Ryzen 7800x3d would've been the best choice... besides being better and cheaper, it would even save you some money in the long run because it doesn't really suck that much energy... unlike those hungry Intels)


Simpy-Cuck

i have a 12400F - What should i do now if i want to upgrade to max? Mobo: B660M DS3H DDR4


svenger-hunter-gomez

yea, seems like "don't buy a 14900k" is the right answer to this :D


Feeling_Designer_112

Welcome to the Intel‘s world my friend.


Ok_Blueberry_3139

It's posts like this that make me glad I'm too stupid to understand all of this. All I know is - my pc can run the games I wanna play - I'm happy


svenger-hunter-gomez

this was me before i did the first benchmarks too :D dont do benchmarks sir :D


Ok_Blueberry_3139

I shall not, kind squire


T11nkr

Try and see if your board supports the following settings:

- Hyperthreading disabled (benchmarks will score even lower, but most games don't use hyperthreading, so it really doesn't matter if it is turned off. Hyperthreading is useful for a creator when rendering videos or compiling shaders in an engine)
- CPU Lite Load (set by default at mostly 9; you can lower it to 2)
- Disable all boost and power saving options (like Intel Turbo Boost, CPU C-State)
- Set a manual core multiplier. The 13700k has a clock of 5200mhz, so the multiplier is 52. Set the multiplier to the highest boost - for my 13700 it's 54 - and whatever the E-cores would boost to, 42 for me.
- Additionally, you could play with a voltage offset if your board supports it. A negative core voltage offset reduces the overall power for the CPU; you will be undervolting. -0.12v is what worked for me.

So now you've disabled all the fancy features that look good on paper but in reality are only a heat generator. That doesn't mean the CPU will run at a constant clock speed all the time, but those enormous power spikes are gone. What happened on my system is that the CPU now runs without boosting, at a fixed speed, drawing less power. Gaming benefits greatly and heat issues are gone! 75 degrees max with a tuned-down 240 AIO (fan and pump speed).

It's fucked up, but Gamers Nexus and others pointed out that mobo manufacturers have increased their default settings for Intel CPUs in the BIOS compared to Intel's stock recommendations, thus generating heat issues.


svenger-hunter-gomez

so, I already set an undervolt offset of -0.12 and it changed just the temperature; the score was about the same.

yes, my HT was off already, even before my original post, because of old Ubisoft games.

I have a b660 chip so I don't think I can modify CPU Lite Load or set a manual core multiplier; my board is not made for a 14900k :D but I can try turning off Intel Turbo Boost and CPU C-State (I don't know about CPU C-State tho, as Intel officially states it's needed for the 14900k lol)


itsamepants

.... Why did you buy a K chip with a B series mobo? It's like buying a Porsche but fuelling it with 91 octane.


StomachosusCaelum

> .... Why did you buy a K chip with a B series mobo? It's like buying a Porsche but fuelling it with 91 octane.

Because the base and boost clocks on the K-SKU are still FAR higher than the non-K, that's why.


itsamepants

That's fair, but would it be able to achieve those high boost clocks with the piss-poor VRMs on a B series?


StomachosusCaelum

... not all B-series boards have bad VRMs. Quite a few are just the Z-series board with the B-series chipset (the B760 ITX board from ASUS is basically the Z790 board cut down slightly due to the chipset difference). And you don't need to leave it uncapped/out of Intel spec to get full boost clocks. It's a lot more cooling dependent (for an i9 at least, an i7 less so, and the i5 can be cooled just fine by a Peerless Assassin) than power delivery dependent.

I run a 13600K on a B760-I just fine. Gets the full boost clock, no problems. I went that way because the ITX Z-series boards in particular were stupid expensive (almost twice as much), and a potential extra 300 MHz or so of all-core boost simply wasn't worth the tinkering that would have been required, or the very little performance it would have given me. Especially since I game at 1440p+ and almost nothing I play would be remotely CPU limited at that resolution paired with my 4080, it just wasn't worth bothering with OCing. The CPU already boosts well past 5 GHz, which is never going to be the difference between "game performs good" and "game is unplayable" in the reasonable lifetime of the machine.


svenger-hunter-gomez

I had the mobo already and I had the money for a 14900k back then; my budget was too high :D


T11nkr

They recommend C-States and other power saving options because saving energy is top trending. In practice you won't save that much energy: the less the CPU draws at idle, the more it draws to boost back up, and boost options combined with power saving lead to huge spikes. To really make use of this otherwise great CPU, buy a Z790 mobo at some point.


MrPopCorner

14900k vs 13900k is like a 2% difference, which can be nullified by margin of error. The big difference between 13th and 14th gen lies in the memory stability improvements with DDR5. So yes, a 14900k will technically outperform a 13900k, just not in the way you'd expect it to.


svenger-hunter-gomez

and i even running this thing on ddr4 haha damn i screwed up hard on my build :D


MrPopCorner

Oh snap! But, on the bright side: you can upgrade mobo & ram later :D


svenger-hunter-gomez

hahah yes the masterrace never ends, mobo, ram, fan, :D


smithsp86

Sounds to me like you are getting the normal intel experience.


SupplyChainNext

Moving my 14th gen from a SHITrix Z690 with a 360 AIO to an Aorus Master with the same AIO, stock settings on both, it performed on average 10 degrees lower for me. Many things can factor in here.


thejaysonwithay

Generational upgrades are not the move in 2024


Whydontname

The only way to get max performance from a 14900k is to delid and direct-die cool. These things are monsters.


megaschnitzel

You guys out there updating a CPU for 2% more performance while i'm still using my Ryzen 5 2600 lol


No_One_Special_023

I upgraded from a i7-9700K to a i7-14700K. The performance for me was massive but during my research on what chip to upgrade to, I discovered the 13000/14000 series were the same chip. I only went with the 14 series cause it was on sale and with that sale only $50 more than the 13 series. It made sense. I probably won’t upgrade for another four to five years though so I wanted the best on the market.


Gippy_

> I discovered the 13000/14000 series were the same chip.

Well, the 14700K has 12 e-cores while the 12900K/13700K have 8 e-cores. In practice, this doesn't really matter much. OP's MT CPU-Z score is 13.6K with 16 higher-clocked e-cores, while my 12900K gets 12K with 8 e-cores. So double the e-cores for an extra ~14% in MT performance, at the cost of significantly more power and heat.
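The ~14% figure follows directly from the two CPU-Z scores quoted in the comment:

```python
def pct_gain(new: float, old: float) -> float:
    """Percent improvement of new over old."""
    return 100 * (new - old) / old

# Scores quoted above: OP's MT score vs the commenter's 12900K.
mt_16_ecores = 13600  # 16 higher-clocked e-cores
mt_8_ecores = 12000   # 8 e-cores
print(round(pct_gain(mt_16_ecores, mt_8_ecores), 1))  # → 13.3, i.e. ~14%
```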


Fad-Gadget916

My last two builds were 10980XE and 10600K.....my next jump will be 1nm. Architectural jumps are much better than ticks.


King_o_Time

On another note: The NH-D15 is G.O.A.T.ed! I will probably still use that cooler in ten years.


Educational_Duck3393

That's because they practically are the same.


Ziggy_Zigg

I chuckled once you mentioned "UserBenchmark". Never trust those lying shills. Go on YouTube and look up Gamers Nexus or Toms Hardware. Those two are legit, community-certified tech experts.


Flamestroyer

It downclocks because of how fucking unstable 14900k's tend to be.


cowoftheuniverse

> i have this setting on asus that its 90 degrees limited but i wont uncheck that

When you compare to others, almost everyone is using the default limit of 100 for Intel. That will be part of the reason why they score better. A big Cinebench score with a current i9 needs unlimited power, insane cooling, and no power limits. If you are just gaming, then all-core rendering performance like a Cinebench score is mostly irrelevant.


rohitandley

14th gen is basically Intel buying time to push out the Ultra variants. They weren't ready on time, so Intel just added an NPU feature and marketed it as something better.


HelloItsKaz

Why upgrade from the 12900k? Seriously I’ve used mine since it came out and it’s honestly still insane for what I’m playing.


StomachosusCaelum

You cant undervolt on B660. You might think you can, and the BIOS might make you think you can, but it wont do anything.


svenger-hunter-gomez

What do you mean? Surely it changes something, or else how did my PC freeze and refuse to start when I played around with negative offsets from -0.15 until I got to -0.12, which was the first one that didn't crash or freeze? :D


StomachosusCaelum

Intel disabled undervolting on B660 boards almost immediately. It doesn't work on B760 either. You can change all the settings you like, but it won't work right. There are dozens of articles about it. It'll also potentially completely destroy performance; even a 0.01 offset would cut performance in half in some cases. The only manufacturer that worked around it was MSI, with their Lite Load settings.


L1191

The ASUS B660-I and B760-I support voltage offsetting via microcode; they removed it and re-added it, presumably due to backlash. I have a B760-I and run a 0.07 offset on a 13900K.


StomachosusCaelum

The re-add would have to be fairly recent. It wasn't working when I built my 13600K/B760 rig last fall. But glad they re-added it; disabling it was stupid.


quaint-addle-pate

Another sign to avoid the lure of capitalism


svenger-hunter-gomez

Yes, purely another symptom of the modern world and its consequences :D


GothicRuler

If I’m getting a new PC, is the 14th gen i9 still good?


NoStructure5034

The 7800X3D games the same or better for much less, draws significantly less power, and produces less heat as well. But in most cases the CPU isn't too important, as you'll run into GPU bottlenecks first.


Gamma89

Depends on whether you need a lot of cores/threads for professional applications; if not, just buy another CPU that performs the same in video games.


KanedaSyndrome

You can't expect much from generation to generation, I think. Have you made sure your RAM keeps up with the new CPU? CPU and RAM should be matched together to prevent bandwidth throttling: a faster CPU needs faster RAM if they were matched in the previous generation. In the future, only get an "upgrade" if it's an actual upgrade in terms of performance/Watt. If the performance/Watt ratio is the same, then it's basically the same processor, just with more power, and not worthy of a "new generation" stamp.
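The performance/Watt test above is simple to apply. A sketch with hypothetical numbers: a "new gen" that scores 10% higher but also draws 10% more power has identical efficiency, i.e. it's the same silicon pushed harder:

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark points per watt of package power."""
    return score / watts

# Hypothetical scores/power draws (not real measurements).
old_gen = perf_per_watt(12000, 200)  # 60 points/W
new_gen = perf_per_watt(13200, 220)  # 10% more score, 10% more power
print(new_gen == old_gen)            # same efficiency: basically the same CPU
```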


[deleted]

[deleted]


NoStructure5034

That's because of the mobo, not the CPU.