AMD vs. Intel GeForce RTX 3080 Benchmark

Bro, a few frames behind means jack. Anything over 100 fps is overkill to begin with; at a point you can't even tell how fast the refresh is. Many will lie and say they can, but they're full of it.

The results for the Ryzen chips are good enough to be happy with. I own many Intel rigs as well as AMD rigs. There comes a point where people want the most frames, but you only need 30-60 fps to really enjoy a game; anything after that is just scratch.

Real players know 60 fps and a good internet connection is just as good as someone on a 144 Hz refresh. Online it's all the same: your connection matters more than refresh rate and frames if you have a mid to top-tier card. Just saying.

You have people playing on less, and last time I checked most people are still using 1080p, with 1440p slowly becoming the standard.

I have an overclocked 3950X and used my 1080 Ti and 2080 Super with zero issues or lag. It's people who think a few stupid frames matter when you can't tell anything after 60 fps.

This is a FANBOY argument, because at the end of the day they all do the same stuff, and 8 to 10 frames less can be made up in other ways.

We don't know if they overclocked, what RAM they used, etc. Most of these test benches are on stock configs as well.
I am willing to take a blind test between 60 and 144 fps. If I lose I'll pay you 10k. If I win you pay me 10k plus the costs for tickets to visit your place and take the test. I don't even need to play any games, I can tell the difference just by scrolling on YouTube or any other website. Since YOU seem to be full of it, I'm sure you'll take me up on the challenge.
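For what it's worth, the gap this wager is about is easy to put in numbers. A quick frame-time sketch (my own arithmetic, not from the article):

```python
# Per-frame render time in milliseconds at a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# At 60 fps each frame is on screen ~16.67 ms; at 144 fps it's ~6.94 ms.
# The wager is essentially about noticing that ~9.7 ms per-frame difference.
print(round(frame_time_ms(60), 2))                       # 16.67
print(round(frame_time_ms(144), 2))                      # 6.94
print(round(frame_time_ms(60) - frame_time_ms(144), 2))  # 9.72
```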
 
So were you comparing stock clocks? Most motherboards overclock all of an Intel 10900K's cores out of the box, to e.g. 4800MHz. If so, it's up to Techspot to find out what happens if you OC the Ryzen CPU to 4.0GHz or even higher... I bet the performance difference will become marginal at the lower resolutions...

Happily upgraded from a 9900K @ 5GHz to a 3700X @ 4.2, not even looking back.
Nice side grade / downgrade there.
 
Nice side grade / downgrade there.
Downgrade? Not at all. AM4 is a far more future-proof platform than Z390 would be. I don't even notice that I have a lower-clocked processor. The AMD platform feels smoother in general than my Intel setup, and it even runs cooler. At stock I noticed the AMD was slower, so a small OC, tweaked the memory (32GB C16 3800MHz at 1900 fclk), and now it's butter smooth... 3900X or Ryzen 3 next :)
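For anyone puzzled by the "3800MHz at 1900 fclk" pairing above: DDR4 transfers twice per clock, so a DDR4-3800 kit runs a 1900 MHz memory clock, and Zen 2 tends to get its best latency when the Infinity Fabric clock (fclk) matches that 1:1. A tiny sketch of the arithmetic:

```python
# DDR = double data rate: the transfer rate (MT/s) is twice the memory clock.
def memory_clock_mhz(ddr_transfer_rate: float) -> float:
    return ddr_transfer_rate / 2

# Zen 2 rule of thumb: run the Infinity Fabric (fclk) 1:1 with the memory clock.
print(memory_clock_mhz(3800))  # 1900.0 -> matches the 1900 fclk quoted above
```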
 
Thank you, TECHSPOT, for the comparison.

Based on your findings, I will build my next desktop using an Intel CPU + my 3080.

Every little bit of advantage is appreciated.

Wouldn't you take a large hit on content editing by going Intel over AMD, though?
 
Techspot deserves all the stick it gets for using a 3950X. It's slower and more expensive than a 10900K (and even a 10600K), and as has been demonstrated, PCIe 4.0 doesn't give it an advantage. The only reason I can think Techspot would use a 3950X in a top-end graphics card benchmark is pure bias toward AMD and pandering to the militant AMD fans that exist like a cancer in the community at the moment. There is no other reason I can see. Looking at the results, it would have shown Nvidia as worse than it is at 1440p, so Techspot chose to hurt their Nvidia results in order to give Ryzen a mention it doesn't deserve. There are probably a number of Intel CPUs that perform better than Ryzen in games that could have been used in this test.

There is a strong case for Ryzen; in fact, in most use cases it's better. But for gaming it's firmly in second place. Intel is faster and smoother for playing games, this is well proven at this point, and the AMD fans need to accept it. This is a games test; encoding performance is 100% irrelevant here.

This may change soon with Ryzen 3xxx; I'm hoping it does, because I need a new CPU. But if it doesn't, the fans can rest assured that AMD will still hold the crown for spreadsheets and encoding, etc. lol.
 
Downgrade? Not at all. AM4 is a far more future-proof platform than Z390 would be. I don't even notice that I have a lower-clocked processor. The AMD platform feels smoother in general than my Intel setup, and it even runs cooler. At stock I noticed the AMD was slower, so a small OC, tweaked the memory (32GB C16 3800MHz at 1900 fclk), and now it's butter smooth... 3900X or Ryzen 3 next :)
A 9900K to a 3700X is definitely a downgrade.
 
Techspot deserves all the stick it gets for using a 3950X. It's slower and more expensive than a 10900K... The only reason I can think Techspot would use a 3950X in a top-end graphics card benchmark is pure bias toward AMD
I believe you missed the point of the article. See the "AMD" in the "AMD vs. Intel" headline? If they simply tested two Intel chips against each other, it might defeat the purpose, don't you think?
 
I am willing to take a blind test between 60 and 144 fps. If I lose I'll pay you 10k. If I win you pay me 10k plus the costs for tickets to visit your place and take the test. I don't even need to play any games, I can tell the difference just by scrolling on YouTube or any other website. Since YOU seem to be full of it, I'm sure you'll take me up on the challenge.
I guess you missed Steve's video: when you put the 3080 in the second PCIe slot, it actually puts the 10900K in its place. But you're too busy being an Intel fanboy to care lol.

Anyway, keep doing that Intel thing, it's literally all the same stuff. And really, if you judge frames by YouTube, then you need to change the Kool-Aid you're drinking lol.

Keep telling people that AMD is no good. There are legit reasons why people use AMD over Intel; you're just one of those people hanging on to stereotypes. AMD lives matter, bro.
 
A 9900K to a 3700X is definitely a downgrade.

There's more to life than some synthetic benchmarks, but fanboys believe whatever they want. The Ryzen is better suited to my line of work, and during gaming sessions I don't even notice your so-called downgrade... I wish Techspot would rerun some 1080p and 1440p benchmarks with an overclock on the AMD CPU.
 
A 9900K to a 3700X is definitely a downgrade.
Lol, I just watched a video where the 3700X beat the 9900K by over 100 fps lol.
Seriously, people need to do more research; those synthetics have got you fooled.

I am in no way a FANBOY of either brand (I own both), but synthetic benchmarks are always coded for certain CPUs, and old-school people like me have to keep hammering it into your brains: they need to do more for AMD code and source work.

Intel will always have an edge when everything is programmed to run on it and not on another platform. AMD has always been there, but most programs are designed around Intel microcode, not AMD; that is why a lot of software is one-sided.

Clearly that has been changing, but again, real-world performance proves that synthetic benches shouldn't be taken seriously due to varying conditions.
 
I guess you missed Steve's video: when you put the 3080 in the second PCIe slot, it actually puts the 10900K in its place. But you're too busy being an Intel fanboy to care lol.

Anyway, keep doing that Intel thing, it's literally all the same stuff. And really, if you judge frames by YouTube, then you need to change the Kool-Aid you're drinking lol.

Keep telling people that AMD is no good. There are legit reasons why people use AMD over Intel; you're just one of those people hanging on to stereotypes. AMD lives matter, bro.
What stereotypes? I said there is a noticeable difference between 60 and 144. Didn't mention AMD or Intel in the post you quoted, so I have no idea what you are talking about. I'm actually an AMD fanboy, but Intel is better for gaming so I won't pretend otherwise.
 
Lol, I just watched a video where the 3700X beat the 9900K by over 100 fps lol.
Seriously, people need to do more research; those synthetics have got you fooled.

I am in no way a FANBOY of either brand (I own both), but synthetic benchmarks are always coded for certain CPUs, and old-school people like me have to keep hammering it into your brains: they need to do more for AMD code and source work.

Intel will always have an edge when everything is programmed to run on it and not on another platform. AMD has always been there, but most programs are designed around Intel microcode, not AMD; that is why a lot of software is one-sided.

Clearly that has been changing, but again, real-world performance proves that synthetic benches shouldn't be taken seriously due to varying conditions.
In what real-world program is the 3700X better than a 9900K? AFAIK, 99% of actual software runs better on a 9900K.
 
Lol, this quickly turned from 'I am no fanboy, and here is this fact [i.e. insert some logical fallacy] from my completely unbiased mind' to 'INTEL, for the love of God!'.

@Strawman
The point of this review was to show, on a common scale (the same platform), the performance differences between different GPUs. Regardless of platform differences, within a single platform the relative differences among GPUs stay the same.

Meaning: the 10900K can be the uber bestest gaming CPU, yet the differences among GPUs on a 10900K vs. the differences among GPUs on a 3950X are not different in a statistically significant way. So the article/review fulfills its actual goal.
What your suggestion implies is a need to answer the question 'what is the HIGHEST attainable FPS?' Well, that is not what this review was about. So again, '10900K should be used because it is the fastest gaming CPU' holds no ground in a review like this.
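To make that point concrete, here is a toy comparison with made-up FPS numbers (purely hypothetical, not taken from the review): even if one CPU posts higher absolute FPS, the relative gaps between GPUs, which is what a GPU review measures, can come out the same on both platforms.

```python
# Hypothetical average FPS per (CPU, GPU) pairing; the numbers are invented
# to illustrate the argument, not taken from the review.
fps = {
    "10900K": {"RTX 3080": 150.0, "RTX 2080 Ti": 120.0},
    "3950X":  {"RTX 3080": 140.0, "RTX 2080 Ti": 112.0},
}

# On both CPUs the 3080 is exactly 25% faster than the 2080 Ti, so the
# GPU-vs-GPU conclusion is identical despite different absolute numbers.
for cpu, results in fps.items():
    ratio = results["RTX 3080"] / results["RTX 2080 Ti"]
    print(cpu, round(ratio, 2))  # both lines end with 1.25
```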
 
Techspot, thanks for using the AMD CPU. Personally, the fact that there are issues with some games gives more meat to your article and makes it a better read than Intel-only reviews. I would love to see why those games have issues on Ryzen (is it just a lack of optimization, or another issue?) and to learn whether any studios have a response or solution.

I personally think the games are unoptimized, since other games have no issues at all.
 
Techspot deserves all the stick it gets for using a 3950X. It's slower and more expensive than a 10900K (and even a 10600K), and as has been demonstrated, PCIe 4.0 doesn't give it an advantage. The only reason I can think Techspot would use a 3950X in a top-end graphics card benchmark is pure bias toward AMD and pandering to the militant AMD fans that exist like a cancer in the community at the moment. There is no other reason I can see. Looking at the results, it would have shown Nvidia as worse than it is at 1440p, so Techspot chose to hurt their Nvidia results in order to give Ryzen a mention it doesn't deserve. There are probably a number of Intel CPUs that perform better than Ryzen in games that could have been used in this test.

There is a strong case for Ryzen; in fact, in most use cases it's better. But for gaming it's firmly in second place. Intel is faster and smoother for playing games, this is well proven at this point, and the AMD fans need to accept it. This is a games test; encoding performance is 100% irrelevant here.

This may change soon with Ryzen 3xxx; I'm hoping it does, because I need a new CPU. But if it doesn't, the fans can rest assured that AMD will still hold the crown for spreadsheets and encoding, etc. lol.

Please do some more homework before you make wild accusations like this. This page, https://www.techspot.com/review/2084-amd-or-intel-for-gaming-benchmarking/, clearly explains why Techspot used AMD, and the answer is not bias.

How does using AMD hurt their results? They get a more thorough review because of it. Would you know that AMD sometimes has 17% lower FPS in another review, like at Tom's Hardware? No, because they only used Intel.
 
Lol, I just watched a video where the 3700X beat the 9900K by over 100 fps lol.
Seriously, people need to do more research; those synthetics have got you fooled.

I am in no way a FANBOY of either brand (I own both), but synthetic benchmarks are always coded for certain CPUs, and old-school people like me have to keep hammering it into your brains: they need to do more for AMD code and source work.

Intel will always have an edge when everything is programmed to run on it and not on another platform. AMD has always been there, but most programs are designed around Intel microcode, not AMD; that is why a lot of software is one-sided.

Clearly that has been changing, but again, real-world performance proves that synthetic benches shouldn't be taken seriously due to varying conditions.
That's funny, because all the reviews clearly demonstrate that the 9900K beats a 3700X in almost all games tested: not benchmarks, but real games that people play. But I bet you probably think they are all wrong and that whoever made that video got it right.

It's not an opinion, it's a fact: a 3700X is a gaming downgrade from a 9900K. You've been fooled if you think otherwise.
 
What is the worth of bragging and rushing to buy the most expensive graphics card if it can't even put up a satisfactory performance for CPU skeptics? If it's good, you expect it to run well across all platforms.

I don't see the rationale for buying a CPU based on a few frames' lead in benchmarks, when the real-world "feel" and perceived difference are negligible.

I would rather buy an all-rounder at a reasonable price than jump on the mass-appeal, teenage-fad bandwagon.

For those who think I'm an AMD fan: I'm still using an i7 8700K.
 
What stereotypes? I said there is a noticeable difference between 60 and 144. Didn't mention AMD or Intel in the post you quoted, so I have no idea what you are talking about. I'm actually an AMD fanboy, but Intel is better for gaming so I won't pretend otherwise.
Certain games are only playable at 60 fps, some at 30, and that's mostly non-shooter games, so really you can't tell unless you play Call of Duty or CS:GO all day and nothing else.

That's the difference: if all you play is FPS games, that's all you're going to notice.
Playing most triple-A games, a lot of them are still locked at a smooth 60; not all titles will pump out massive FPS. That's my point: I can't tell the difference between Shadow of the Tomb Raider and, say, Need for Speed: The Run locked at 60 fps.

Also, man, I've been playing at 3440x1440 for 4 years. When I first got this monitor I had 2x 1080s, then got a 1080 Ti, and that card has been driving my monitor forever. As you can see, it takes so much power to run that I didn't get the full fps, but with an RTX 3090 I do now, yay.......
 
That's funny, because all the reviews clearly demonstrate that the 9900K beats a 3700X in almost all games tested: not benchmarks, but real games that people play. But I bet you probably think they are all wrong and that whoever made that video got it right.

It's not an opinion, it's a fact: a 3700X is a gaming downgrade from a 9900K. You've been fooled if you think otherwise.
It's okay, the 3700X still pulled off more frames in certain games, but you can't recognize that win, right?
 