AMD introduces 16-core Ryzen 9 3950X CPU, two Navi Radeon RX 5700 GPUs at E3 2019

It was behind on Overwatch and GTA V by a considerable margin, while keeping up on other games. So it didn't wreck Intel like I was expecting (I should say, how I wished, not truly expected).
No it wasn't. There weren't even any units on the frame-rate chart, so you have no idea how far behind it was. The only thing we knew is that it got 266 and 283 fps for the 3800X and 3900X respectively in Overwatch. Getting mid-to-upper 200s fps should already be more than enough for gamers, even for the more nitpicky gamers who want ultra-high fps for their fancy high-refresh monitors.

Simply saying "AMD had no problems keeping up with the 9900K in high-FPS gaming" is not entirely true, because it still lagged on some of the titles shown, and we still need to see what happens once independent reviewers test other games.
Did you miss the part where the 3900X beat the 9900K in games such as COD 3, CS:GO, and PUBG? It beat the 9900K in some games, was behind in some, and was tied in others. So yeah, it was basically keeping up with the 9900K.

If you had asked me 3 months ago, I was expecting AMD's 7nm to completely make Intel irrelevant. Thus the disappointment.
Then you clearly had unrealistic expectations, and now you're unfairly cherry picking ways to denigrate Zen 2 because you were disappointed.

Did you read what I said? I wanted a single-PC setup, but at purely gaming the 3900X still doesn't beat Intel. It beats Intel at gaming + streaming, but that's with less performance than using Intel only for games.
We don't even know what the units in the charts are. But if the charts are even remotely accurate to scale, then the two should be close enough in fps that it shouldn't even matter much.

Battlefield V with 9700k = 220fps...
Battlefield V with 3900x = 160fps...
Your speculation is nonsense because your numbers are based on the extremely faulty premise that the Zen 2 3900X only has ~72% of the fps of the 9700K (160/220 = 0.727). So you're basically claiming that Zen 2 has even worse performance than Zen 1.

Take a look at Techpowerup's 9700K review. Zen 1 Ryzen 2000 CPUs have 92% of the fps performance of the 9700K in gaming at 1080p: https://www.techpowerup.com/reviews/Intel/Core_i7_9700K/13.html

So even Zen 1 Ryzen 2000 CPUs should be getting over 200 fps if we take 92% of 220 fps, and that doesn't even factor in the extra fps that Zen 2 Ryzen 3000 would get. The 3800X was going toe to toe with the 9700K and the 3900X was going toe to toe with the 9900K, if the chart sizes are even remotely accurate.
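For what it's worth, here is the arithmetic from the last few posts spelled out as a quick sketch (Python used purely as a calculator; the 220 fps, 160 fps, and 92% figures are the ones quoted above, not new measurements):

```python
# Rough sanity check of the fps figures discussed above (illustrative only).
fps_9700k_bfv = 220          # claimed Battlefield V fps for the 9700K
zen_plus_relative = 0.92     # Ryzen 2000 relative 1080p gaming perf (TechPowerUp)

# What even a previous-gen Ryzen 2000 part should manage at that scaling:
est_ryzen_2000_fps = fps_9700k_bfv * zen_plus_relative
print(f"Estimated Ryzen 2000 fps: {est_ryzen_2000_fps:.0f}")          # ~202

# The disputed claim was 160 fps for the 3900X, i.e. only ~73% of the 9700K:
claimed_3900x_fps = 160
print(f"Implied 3900X / 9700K ratio: {claimed_3900x_fps / fps_9700k_bfv:.0%}")
```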
 
Wow, so Pascal finally has some competition, welcome to 2016. The Radeon 5700 performs about the same as a GTX 1080 at the same 180W TDP. It would make more sense if the 5700 XT sold at $400 and the 5700 at $300, but I guess AMD is pretty tight on profit margins already.

Pascal and RTX are now obsolete. RDNA is a 100% gaming GPU, not a server-farm/AI knock-off.
 
Interesting to see all the fan arguments as usual here. The truth of the matter is that AMD is only aiming for the consumer market at the moment, because the only criterion on which AMD wins over Intel is cost (and that is initial outlay, not TCO). Intel still beats AMD by a long way in terms of performance per watt; remember that you should compare like with like, which means comparing AMD's multi-die offering with Intel's multi-die (i.e. multi-Xeon) offering. The only potential threat to Intel in the data centre is ARM, not AMD.
If you want a professional kick-*** workstation you will go with a multi-Xeon Intel rig rather than AMD.
This also holds true for GPUs: professional offerings from Nvidia outperform AMD's in everything other than price.
It's the same for high-end gaming rigs: if you want the best performance you'll go Intel + Nvidia; if you have a limited budget you'll probably go for AMD.
Here's a high-end PC with full AMD components: https://pcpartpicker.com/list/fgMBMZ

Here's a mid-range PC with Intel + Nvidia components: https://pcpartpicker.com/list/3Kbngw
 
Yes, I am sure people who make money from editing videos and rendering 3D do not look for hardware that can save them time and bring a higher income every month. Absolutely right. Just like miners didn't look out for the best GPUs for the best income. /s
Constipated thinking, with such a shallow mindset. People use their computers to run CAD software, do scientific calculations, run simulators, and all sorts of other interesting things that need power and multitasking. You must be very young, as your scope seems extremely limited.
stevae, you hit the nail on the head my friend.
 
They can't win in everything now :D But competition is always good.

They just need to stay competitive on the GPU side while winning in the CPU side. That will allow them to slowly increase their R&D budget.

People forget that Nvidia and Intel have R&D budgets several times that of AMD. Intel's budget alone is nearly 10x what AMD spent in 2018 ($1.4 billion for AMD vs $13.1 billion for Intel; Nvidia spent $1.8 billion on R&D). The fact that they can even make something worth buying is a miracle (even if it's with perf/$ and not the ultra-high-end stuff).
IMO, this is the best post of the thread.

Perhaps one of the reasons that sIntel and nWidia have gone sideways in performance is that designing ever-increasing performance is a genuine challenge for the industry at this point.

As I see it, no matter how deep the pockets of the target market are for both CPU and GPU, in the absence of AMD, sIntel and nWidia would both be free to price their products astronomically. nWidia can get away with it at the moment, and sIntel was charging higher and higher prices for piddling performance increases between generations. At some point, all the P&Mers would be P&Ming about astronomical prices from the "big two."

For me, I see a certain amount of stress in what sIntel is doing. Prior to AMD's latest offerings, they were rolling in their own glory and laughing on the way to the bank. Not so anymore.

If you don't like AMD's offerings and still want to buy sIntel, that's your right. However, AMD is giving them a run for their money right now. That AMD may be behind in one or two games simply does not mean they are an abject failure, as some seem to think.

AMD suffered from a dolt as chairman for several years; they are now recovering from that fiasco. They have done nothing but offer some significant strides in performance since their leadership changed. There is no guarantee they will leapfrog nWidia or sIntel at this point, but there is also no guarantee that nWidia or sIntel will come out with an AMD killer.

Physics is physics, and the semiconductor industry is literally reaching the limits of current technology.
 
I literally came here to repeat, again, that AMD is literally half the size of nVidia and a tenth the size of Intel. While this isn't an excuse for weak products, you have to take AMD's much more limited R&D budget and market cap into account when comparing products.

The fact that AMD is even within 5% of Intel and nVidia in terms of performance is staggering in a good way for AMD, embarrassing for nVidia and Intel, and great for us as consumers.
 
With this new processor, at least the base clock is believable for the TDP. With the 12-core 3900X, the base clock and the TDP seemed too good to be true for that many cores. But that's assuming the boost clock doesn't have anything to do with the TDP. And boost clocks are consistently higher on the parts with more cores.

That says two things: the boost clocks are higher on the more expensive chips for marketing reasons, and since higher boost clocks don't increase the TDP, the duty cycle at the boost clock is probably low, so it won't do as much good as one might expect.

Yes, it's just figures on paper. But one can read some interesting stuff out of those figures.
 
I literally came here to repeat, again, that AMD is literally half the size of nVidia and a tenth the size of Intel. While this isn't an excuse for weak products, you have to take AMD's much more limited R&D budget and market cap into account when comparing products.

The fact that AMD is even within 5% of Intel and nVidia in terms of performance is staggering in a good way for AMD, embarrassing for nVidia and Intel, and great for us as consumers.

While true, nobody cares. OK, *almost* nobody.

What they want is the right tool for the job. How does the thing perform in their computer right now? Some people even think about how the thing will perform in the future.

But everyone has different use cases, which is the source of all the fun bickering here. So, some use cases:

• Nvidia + Intel CPU for top-end gaming, but the CPU may change after 7/7.
• Nvidia + AMD CPU for more affordable high end gaming, but the GPU may change after 7/7.
• AMD/Nvidia + AMD CPU for midrange gaming but the CPU could change if Intel gets production volumes up.
• AMD + AMD for most affordable gaming as both Nvidia and Intel simply do not compete on low end prices.

other uses for AMD CPUs (obviously not exhaustive):

• actually viable CPU upgrade path
• multi-core optimized workstation
• inexpensive streaming
• inexpensive general use

other uses for Intel CPUs:

• h.265 conversion (also their worst relative showing in the announced Zen 2 workstation graphs)
(seriously, outside of gaming and h.265, there's...?)

I built one of each CPU type in the past 2 years: Intel for gaming and h.265 work, and AMD for inexpensive general use, affordable gaming, and an actually viable CPU upgrade path. Two different uses, the right tool for each job. FYI, both have Nvidia cards for gaming because they were bought during the mining boom. At least one would have been an RX 570 if they hadn't been $400 at the time.
 
Well you can go ahead and be disappointed because you are in a fraction of a fraction of a percent of gamers. For the rest of us, this means it's likely Intel will have to respond with price cuts that benefit us all. Further, if you had been following this at all you'd already have an idea of performance numbers and know that the expected performance is actually HIGHER than previously predicted.

Also, they do beat Intel significantly in workloads other than gaming, and this now makes it look like they are going to be competitive with Intel in the server segment.

AMD is going to have a large impact on the industry with these products, and the fact that you are disappointed is completely irrelevant.

Fraction of a fraction? Not sure about that considering the most played games on PC are multiplayer shooters. Twitch is driven by FPS gamers like Shroud, Ninja, Summit etc. More and more kids are interested in buying PC rigs to play those games online and play competitively. I think this is not a fraction, it is our current reality.

Fraction of a fraction is people buying PCs to play Tomb Raider or Assassin's Creed or any other single-player game with eye candy and low fps. That's a fraction, and Steam stats show us exactly that. Multiplayer/online gaming is king right now; e-sports moves more money than Hollywood.

You can damage-control it how you want, it's up to you.

Lol, damage control? From what I saw, the 3900X had no problems keeping up with the 9900K in high-FPS gaming and in 1440p gaming. They match it in single-core performance, destroy it in multi-core performance, and destroy it in price/performance. The Ryzen CPUs are excellent performers.

As for the video cards, yes, they are not blowing away the Nvidia cards, but they are going to push prices down and, worse for Nvidia, they are going into both new consoles and Google Stadia. That is a very, very big deal. Nvidia has owned the gaming world for so long that they get a baseline performance boost simply from developers tweaking to get the best performance for most of their users (who just happen to be Nvidia users). But with the next-gen consoles all going AMD, this will be changing, and the baseline developers work on will all be AMD hardware. That will be paying dividends in 2021 and 2022.

The Polaris push didn't last long. Nvidia is back to pre-Polaris market share now @ 85%.

Dude is right about core count not being a big selling point for the majority. Add to that the fact that AMD CPUs and GPUs aren't exactly being fought over by OEMs; the opposite, actually. More than 4 cores is overkill for a lot of moms and dads out here, aka the majority. The performance crown and consistency are what AMD needs. They are on their way, and support for more cores is coming, but just like RTX, not everyone needs more cores, especially when the content consumers use doesn't benefit from them.

If Ryzen 3 and Navi are great for my needs I might even buy them, but not before the reviews.
 
What?... No comments, because I really don't know what to say. Can I just ask: why do we even game? :D It's a similar question.


I just think it's a total waste of cash; there is no way you can feel the difference past 165Hz. And on top of that, you most likely play at low settings, need to OC your hardware to the max, and still play at 1080p in 2019; what joy do you get out of it? I get it, everyone is different and we all have our preferences, but personally I think 2560x1440 at 144Hz to 165Hz is where I would stop; the rest is just marketing and placebo effect :)

No, it's not placebo. You notice it a lot. Stop being elitist. Plus, most 1440p 165Hz panels are IPS, which sucks for shooters. 240Hz offers less latency and better motion clarity, and it is immediately noticeable if you play games like Quake, Battalion, Apex Legends, CoD, etc.

Almost half the input lag of 144Hz, and double the motion clarity. Go on the Blur Busters forum and tell them 240Hz is placebo. They will basically eat you alive.

If you are a casual player, then yeah, stick to a 4K 60Hz IPS if you want.
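For anyone wondering where the "almost half" figure comes from, here is a minimal frame-interval calculation; note it only covers the display's refresh interval, not the rest of the input chain:

```python
# Frame interval at common refresh rates - the display's share of the latency
# chain; engine, GPU, and mouse latency all add on top of this.
for hz in (60, 144, 165, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per refresh")

# 144 Hz ~ 6.94 ms and 240 Hz ~ 4.17 ms: roughly a 40% cut in the display-side
# interval, which is where the "almost half the input lag" claim comes from.
```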
 
No, it's not placebo. You notice it a lot. Stop being elitist. Plus, most 1440p 165Hz panels are IPS, which sucks for shooters. 240Hz offers less latency and better motion clarity, and it is immediately noticeable if you play games like Quake, Battalion, Apex Legends, CoD, etc.

Almost half the input lag of 144Hz, and double the motion clarity. Go on the Blur Busters forum and tell them 240Hz is placebo. They will basically eat you alive.

If you are a casual player, then yeah, stick to a 4K 60Hz IPS if you want.

Less input latency... and better motion fidelity.

Navi gives you less input latency, which will give you better motion fidelity.
 
Pascal and RTX are now obsolete. RDNA is a 100% gaming GPU, not a server-farm/AI knock-off.

Define obsolete, when the 1080 Ti is still superior to the 5700 XT in every aspect: performance, efficiency. Right now AMD can overtake Intel as gamers' first choice, but it is still only a cheaper alternative to Nvidia in the gaming sense. Just because you can't afford RTX doesn't make it obsolete lol.
 
No, it's not placebo. You notice it a lot. Stop being elitist. Plus, most 1440p 165Hz panels are IPS, which sucks for shooters. 240Hz offers less latency and better motion clarity, and it is immediately noticeable if you play games like Quake, Battalion, Apex Legends, CoD, etc.

Almost half the input lag of 144Hz, and double the motion clarity. Go on the Blur Busters forum and tell them 240Hz is placebo. They will basically eat you alive.

If you are a casual player, then yeah, stick to a 4K 60Hz IPS if you want.

Currently I am using an IPS 4K 60Hz monitor, as I mostly play single-player shooters or RTS games, but I'm thinking of going back to 1440p 144Hz to give it a try :)
 
I just think it's a total waste of cash; there is no way you can feel the difference past 165Hz. And on top of that, you most likely play at low settings, need to OC your hardware to the max, and still play at 1080p in 2019; what joy do you get out of it? I get it, everyone is different and we all have our preferences, but personally I think 2560x1440 at 144Hz to 165Hz is where I would stop; the rest is just marketing and placebo effect :)

I would suggest watching this video:
or simply search for 144Hz vs 240Hz on YouTube. You will be surprised how much difference it makes, not to mention that blur and ghosting will mostly disappear.
And sometimes games look ugly when you turn off blur or reach 144Hz or above, which means the devs of that game were lazy, left it unoptimised, and tried to hide it with blur.
Also, just read all the FAQ stuff on https://www.blurbusters.com/ (parent site of https://www.testufo.com/) and check out what the difference between 120/144/165/240Hz is in real life.
 
Define obsolete, when the 1080 Ti is still superior to the 5700 XT in every aspect: performance, efficiency. Right now AMD can overtake Intel as gamers' first choice, but it is still only a cheaper alternative to Nvidia in the gaming sense. Just because you can't afford RTX doesn't make it obsolete lol.

Because for him Nvidia and Intel are obsolete, just because AMD finally has something relevant for the first time in 10 years.
 
Input lag and input latency are the same thing...

I think @m3tavision probably mixed up two things: how long it takes for the GPU to render one frame, and how long it takes for an already-rendered frame to be displayed on the monitor (both are measured in milliseconds, hence the confusion).
Also, the same term, input lag/latency, gets used when measuring the delay between mouse motion and the pixels moving on screen, and the right term for that is "polling rate".
 
Define obsolete, when the 1080 Ti is still superior to the 5700 XT in every aspect: performance, efficiency. Right now AMD can overtake Intel as gamers' first choice, but it is still only a cheaper alternative to Nvidia in the gaming sense. Just because you can't afford RTX doesn't make it obsolete lol.

Hey Krizby, what TV/monitor do you have? Can you tell me offhand... how many watts they use..? lulz..

No matter what Chevrolet produces, the Ferrari will always have more performance. And just because the 1080 Ti is fast in DX11 games, it doesn't do so well with modern games at modern resolutions. Hence the RTX 2080.


With the advent of the 5700 XT, the older RTX 2070 has to lower its price to compete with it, because the 5700 XT has about 20% more performance. Several in the press said that the 5700 XT is showing better game numbers than the Vega 64 LC, which has the $449 Navi card flirting with the 2080 in some games.

Nvidia doesn't have RDNA, and all games in the future... will be built using RDNA as their platform. Absurd-sounding, I know, but Nvidia is the one who started this type of "exclusivity" crap in games. Now Dr. Su has signed up Samsung, Microsoft, Google, Sony and others to AMD's portfolio... and now the whole gaming world is RDNA.

Those are the facts as of today.
 
Define obsolete, when the 1080 Ti is still superior to the 5700 XT in every aspect: performance, efficiency. Right now AMD can overtake Intel as gamers' first choice, but it is still only a cheaper alternative to Nvidia in the gaming sense. Just because you can't afford RTX doesn't make it obsolete lol.

Hey Krizby, what TV/monitor do you have? Can you tell me offhand... how many watts they use..? lulz..

No matter what Chevrolet produces, the Ferrari will always have more performance. And just because the 1080 Ti is fast in DX11 games, it doesn't do so well with modern games at modern resolutions. Hence the RTX 2080.


With the advent of the 5700 XT, the older RTX 2070 has to lower its price to compete with it, because the 5700 XT has about 20% more performance. Several in the press said that the 5700 XT is showing better game numbers than the Vega 64 LC, which has the $449 Navi card flirting with the 2080 in some games.

Nvidia doesn't have RDNA, and all games in the future... will be built using RDNA as their platform. Absurd-sounding, I know, but Nvidia is the one who started this type of "exclusivity" crap in games. Now Dr. Su has signed up Samsung, Microsoft, Google, Sony and others to AMD's portfolio... and now the whole gaming world is RDNA.

Those are the facts as of today.

Nice try with Chevy vs Ferrari lol. The Corvette is $80,000 while the Ferrari is $250,000!!!! They are neck and neck, but if you pour just a few grand into that Vette's LS engine you can have 2,000-2,500 HP, smoke the fuc* out of the Ferrari, and still have money left to buy a house!
 
Hey Krizby, what TV/monitor do you have? Can you tell me offhand... how many watts they use..? lulz..

No matter what Chevrolet produces, the Ferrari will always have more performance. And just because the 1080 Ti is fast in DX11 games, it doesn't do so well with modern games at modern resolutions. Hence the RTX 2080.


With the advent of the 5700 XT, the older RTX 2070 has to lower its price to compete with it, because the 5700 XT has about 20% more performance. Several in the press said that the 5700 XT is showing better game numbers than the Vega 64 LC, which has the $449 Navi card flirting with the 2080 in some games.

Nvidia doesn't have RDNA, and all games in the future... will be built using RDNA as their platform. Absurd-sounding, I know, but Nvidia is the one who started this type of "exclusivity" crap in games. Now Dr. Su has signed up Samsung, Microsoft, Google, Sony and others to AMD's portfolio... and now the whole gaming world is RDNA.

Those are the facts as of today.

I use an Acer Predator X34P that uses around 45W, any other questions?
Lol, RDNA is a newly revamped GCN architecture, not a new feature that AMD has and Nvidia doesn't. Let's just compare RDNA and Pascal for now, since Turing has DXR + DLSS hardware features that Pascal and RDNA don't.
Now let's see: the 1080 Ti, the second-best Pascal silicon, is 20% faster than the RTX 2070 at 250W, while the 5700 XT is less than 10% faster than the RTX 2070 at 225W (according to AMD; I guess averaging across a multitude of games the 5700 XT and RTX 2070 will be even, remember the Radeon VII vs RTX 2080 slide?). So architecturally Navi and Pascal have about the same perf/W, despite Navi being on a superior node, 7nm vs 16nm.
I don't know where you get the idea that all future games will be based on RDNA, unless somehow AMD has the money to pay every game developer to use their hardware lol. Just so you know, Unreal Engine 4, which is heavily based on Nvidia hardware, will power a lot of AAA games this year and probably for the next few years.
News flash: Cyberpunk 2077 will feature RTX (reflections), so yeah, good luck RDNA.
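A quick back-of-the-envelope check of that perf/W point, using only the relative-performance and TDP figures claimed above (assumed numbers, not measured results):

```python
# Back-of-the-envelope perf/W using the numbers asserted in this thread
# (relative performance vs an RTX 2070 = 1.00; these are claims, not benchmarks).
cards = {
    "GTX 1080 Ti": {"rel_perf": 1.20, "tdp_w": 250},  # "20% faster than 2070"
    "RX 5700 XT":  {"rel_perf": 1.10, "tdp_w": 225},  # "<10% faster than 2070"
}

for name, c in cards.items():
    print(f"{name}: {c['rel_perf'] / c['tdp_w']:.4f} relative perf per watt")
# Both land around 0.0048-0.0049, i.e. roughly equal perf/W despite the node gap.
```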
 
Very, very disappointed.

1- It's 2019, AMD is on 7nm, and all they can do is a GPU with a 225W TDP competing with an RTX 2070, launching for €500 in Europe, the same price as the 2070.

2- The 3800X can't even beat the i7 9700K on a light engine like Overwatch, and they showed comparisons on Black Ops 3 (really? a game from 2015 that no one plays) and Rocket League. I wasn't expecting AMD to be slow on those light engines at all. I wanted to see Frostbite Battlefield benchmarks, Black Ops 4 Blackout, Quake Champions, or Escape from Tarkov. Games that actually stress the CPU a lot.

3- I can see that it still won't beat an Intel CPU at high-refresh gaming.

Guess I will keep my 9700K at 5GHz, because I won't be "upgrading" to something with more threads but less performance on what matters to me.

I was expecting way more from 7nm. I will wait for independent reviews now, but as Steve Burke from GN said, AMD and Intel always pick the benchmarks that favour them, so don't expect miracles when we see the whole picture across many different games.

Also curious to see whether dual-channel memory can bottleneck the 16-core CPU or not. Steve said it will for sure in certain scenarios. Let's wait and see.

It's still too early to tell what the final results will be. Given that Zen 1 required microcode updates, this iteration will likely need them as well. AMD is on a good path; if this year's drops aren't it, the next two years should be. 5nm is just around the corner for AMD, and Intel is just now reaching 10nm.
 
Nice post, and I agree completely. But the thing that is catching my eye is that, bit by bit, AMD is catching Nvidia. True, they are not there yet, but they are clearly gaining on them. And when manufacturers and consumers alike can get close to the same performance at a much lower price point, they are going to take the better price point. When there's a significant performance difference, this is a bit murkier, to be sure; but when it's almost even, things will get interesting. And from what I have read, Navi 10 isn't expected to be the Nvidia killer, only the calling card to put a shiver into them. Navi 20 is supposed to be where Nvidia needs to scramble. They are still in hybrid mode right now, but when Navi 20 comes out, they will be full-on RDNA.
Well, Nvidia is also progressing - I suspect the reason their advances seem so slow compared to AMD is simply because they aren't being pushed... Hopefully, Ryzen makes AMD enough money so that they can better compete on the GPU front.

It's ridiculous that Nvidia can get away with selling the 2080Ti for $1300.... and even more insane that they're selling the Titan for $2500... I'm hoping that this is "pride before the fall", and that Nvidia ends up like Intel, caught with their pants down... not totally convinced of it yet though...
There's a lot of truth in that, to be sure. But nobody thought AMD had any possibility of doing to Intel what they have done over the last two years. So I am leaning towards believing that next year, when Navi 20 rolls out, they will have Nvidia in a fix. Time will tell, though.
 
Interesting to see all the fan arguments as usual here. The truth of the matter is that AMD is only aiming for the consumer market at the moment, because the only criterion on which AMD wins over Intel is cost (and that is initial outlay, not TCO). Intel still beats AMD by a long way in terms of performance per watt; remember that you should compare like with like, which means comparing AMD's multi-die offering with Intel's multi-die (i.e. multi-Xeon) offering. The only potential threat to Intel in the data centre is ARM, not AMD.
If you want a professional kick-*** workstation you will go with a multi-Xeon Intel rig rather than AMD.
This also holds true for GPUs: professional offerings from Nvidia outperform AMD's in everything other than price.
It's the same for high-end gaming rigs: if you want the best performance you'll go Intel + Nvidia; if you have a limited budget you'll probably go for AMD.
You are as wrong as wrong can get on performance per watt. AMD is actually killing Intel in performance per watt. And their EPYC chips are beating Xeon in the server space. So while some of what you are saying is true, several things there aren't. AMD is walking Intel down, year by year. Five years ago, AMD wasn't even in the race. Now all of a sudden they are putting out better processors, running on less power, on a better architecture, and for a lower price. If you can't see the progression of what's going on, then you are just looking the other way...
 
Pascal and RTX are now obsolete. RDNA is a 100% gaming GPU, not a server-farm/AI knock-off.

Define obsolete, when the 1080 Ti is still superior to the 5700 XT in every aspect: performance, efficiency. Right now AMD can overtake Intel as gamers' first choice, but it is still only a cheaper alternative to Nvidia in the gaming sense. Just because you can't afford RTX doesn't make it obsolete lol.
Yeah, it kinda does... especially when the percentage who can't afford it, or are too smart to be taken advantage of, is such a huge segment of the market. The percentage of people willing to pay premium prices for Intel/nVidia crap is TINY!!! AMD is extremely smart to push the midrange market this year with Navi 10 and go after the higher-end market next year with Navi 20. It lines up very well with their roadmap.
 
I think @m3tavision probably mixed up two things: how long it takes for the GPU to render one frame, and how long it takes for an already-rendered frame to be displayed on the monitor (both are measured in milliseconds, hence the confusion).
Also, the same term, input lag/latency, gets used when measuring the delay between mouse motion and the pixels moving on screen, and the right term for that is "polling rate".


No, fidelity... (i.e., accuracy)

The higher the refresh rate, the higher the fidelity and the more exact your mouse movements become. At 60Hz, it is not a 1:1 ratio of mouse to screen. The closer your monitor gets to your actual mouse input rate (500Hz, or 1,000Hz), the better the fidelity between what you do with your mouse and exactly what happens on the screen.

A 250Hz polling rate and a 250Hz monitor, for example, is a 1:1 ratio.
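To make the 1:1 idea concrete, here is a toy calculation (my own illustration, not from the post) comparing how many mouse samples arrive per displayed frame at various polling and refresh rates:

```python
# Toy illustration of mouse polling rate vs display refresh rate.
def samples_per_frame(polling_hz: float, refresh_hz: float) -> float:
    """Average number of mouse samples that arrive per displayed frame."""
    return polling_hz / refresh_hz

for polling, refresh in [(1000, 60), (1000, 144), (1000, 240), (250, 250)]:
    print(f"{polling} Hz mouse @ {refresh} Hz display: "
          f"{samples_per_frame(polling, refresh):.1f} samples per frame")
# (250, 250) is the 1:1 case mentioned above; a 1000 Hz mouse on a 60 Hz panel
# accumulates ~16 samples per frame, so on-screen motion lags the hand.
```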
 