AMD slashes prices of Radeon RX 5700 series ahead of release

Why would anyone use RT? The 20-year-old Doom 2 barely gets 60 fps with it. For the same 50%+ performance hit you could raise any other setting and get far more out of it: 1080p to 2K, 60 fps to 100 fps, and so on.

A 2080 Ti can barely hit 60 fps at 2K with RT turned on. In case you want to see reality, click here.
https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/

Oh I know all about RTX performance. Don't mistake my comment for someone who doesn't know.

The only question that matters is whether anyone actually wants to use it with this generation of graphics cards. It probably doesn't matter.

People will pay more for the Nvidia cards because it is something they think they might be able to use.

I'm not a current advocate of RTX, but I know consumers are inclined to buy something other than Navi if they perceive it to have extras, or perceive a product as superior because of those extras.
 
It's true, it's faster. I have an 860 EVO SATA drive on a PCIe 4.0 motherboard and it benchmarks faster than my NVMe 970 drive on my PCIe 3.0 ROG board, which should normally be about four times faster, but instead it's a bit slower. What a shock to see it do that over and over, so there is something to PCIe 4.0 support.
You know the 860 EVO is SATA and already capped in speed? You could have PCIe 69.0 and it would change nothing...
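For anyone curious, a rough back-of-the-envelope of why the interface is the ceiling here (theoretical line rates taken from the SATA III and PCIe specs; real drives land somewhat lower):

```python
# Theoretical interface ceilings in MB/s (decimal), ignoring protocol overhead.

# SATA III: 6 Gb/s line rate with 8b/10b encoding -> the payload cap is the
# same no matter which PCIe generation the chipset hangs off.
sata3_mbs = 6e9 * (8 / 10) / 8 / 1e6

# NVMe over PCIe 3.0 x4: 8 GT/s per lane with 128b/130b encoding.
pcie3_x4_mbs = 4 * 8e9 * (128 / 130) / 8 / 1e6

print(f"SATA III cap:    ~{sata3_mbs:.0f} MB/s")
print(f"PCIe 3.0 x4 cap: ~{pcie3_x4_mbs:.0f} MB/s")
```

That ~600 MB/s SATA ceiling is why a PCIe 4.0 board can't make an 860 EVO any faster; something else (caching like Rapid Mode, or the benchmark itself) has to explain the numbers above.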
 
The price cuts are larger than I expected, and much sooner too. I guess competition in the GPU market is doing its job.
Although... I suspect the higher announced prices may have been a trap for Nvidia, whose Super lineup may have been priced against them accordingly (don't quote me on this). It's an interesting way of manipulating the reviews and the competitor.
 
Well, each is in a different system, and my 860 is outperforming my 970. I thought PCIe 4.0 and Rapid Mode were mostly the reason, and that the NVMe drive couldn't benefit because it's already maxed out.
 
I'm no expert, I just noticed it was doing better, and the 970 is performing fine at normal speeds.
 
Navi is in a heap of trouble:

I think they were in a heap of trouble, but now that they've cut prices they could be back in it. That video was made before the price cut was announced.


For me, Navi needs to be cheaper than Super for three reasons. First, there are no RTX features. Second, and more importantly, the Navi cards use blower-style coolers, which historically are awful: loud and not very effective. I generally go out of my way to avoid buying cards like that, unless the card is such a good deal that it's worth buying and then switching out for an aftermarket cooler. Aftermarket cards tend to fix this, but on a stock-to-stock comparison I'm avoiding a blower. Finally, Nvidia cards run G-Sync and FreeSync but AMD cards can only do FreeSync. Having more options is better.

If the 5700 XT performs the same as a 2060 Super, then to me that's a win for the 2060 Super based on its features and cooling.

Still, great stuff on the GPU wars side of things, just wish AMD could target the 2080 and 2080 Ti!
 
It is because those cards are terrible: they have the worst possible cooling solution (a blower fan) and consume more power.

There is no reason to buy them.

If you buy Nvidia, you get FreeSync, G-Sync, and ray tracing.

If you decide to buy AMD, you risk getting lynched by the people attending Fridays for Future.
 
1) Please go back and read. Don't pretend to know; actually read and comprehend. PCIe 4.0 operates at twice the speed.

You do not need to saturate the PCIe bus to gain the advantage of information traveling at twice the speed. And understand, it isn't twice the speed for only your GPU, but for everything on the PCIe 4.0 bus, even your M.2 drives (I'm sure you have seen the throughput numbers on those by now...).

2) GCN is old news now and RDNA is the future. RDNA can still run GCN code, which means that if a developer is currently building a game for the brand new Xbox and/or PlayStation, those consoles (which feature only RDNA) will still be able to run it. RDNA is backwards compatible, so it doesn't hinder developers transitioning from GCN to RDNA.

As one developer has already told us, the transition is quite effortless once you get working with RDNA.

So, since every game on the Xbox and PlayStation will be using RDNA hardware, nearly all developers will be familiar with it, given time. And seeing that RDNA will be in a billion people's hands within a few years, there is no harm in using it.

1) You seriously don't understand how PCIe buses work. AMD's OWN PEOPLE on PC World's Full Nerd podcast stated that the PCIe 4.0 bus will not lead to much, if any, boost in GPU performance, so gamers should NOT count on it making Navi faster. Think I'm full of it? Knock yourself out:

2) *facepalm* RDNA consoles don't "use" GCN... THEY ARE GCN. "RDNA" is basically GCN 2.0, with specific tweaks made for IPC and gaming-specific tasks. Of course the transition is effortless; it's effectively the same base architecture! And none of this matters because, AGAIN, every console developer has been using GCN since 2013, and that didn't lead to some meteoric rise of AMD dominance in the market. It certainly helped AMD, but nothing like you're describing.

Basically, you're drunk on buzzwords and marketing Kool-Aid and should just stop.
 
Still, great stuff on the GPU wars side of things, just wish AMD could target the 2080 and 2080 Ti!
Give it some time. Navi 10 is a small-die GPU (almost the same size as Polaris), so if it's hitting 2070 performance on an RX 580-sized chip, we will almost certainly see a Navi 20 that goes above 2080 performance (assuming AMD did its homework and designed the architecture to scale properly).
 
Basically, you're drunk on buzzwords and marketing Kool-Aid and should just stop.


You approached this with common sense, information and knowledge. How dare you, Sir!
 
Basically, you're drunk on buzzwords and marketing Kool-Aid and should just stop.
Basically we have a classic case of a kid coming out of a store with new candy, still wrapped, only to find out later that it wasn't as special as the wrapper made it look. Thanks for sparing many of us a pointless discussion going nowhere. He was even comparing saturating a PCIe 4.0 SSD's bandwidth to a GPU's, without realizing the former only has to saturate 4 lanes and the latter 16. Also: storage is designed to saturate buses in sequential reads/writes, as long as the memory controller and chips allow it; a GPU only does this in memcpy operations, while control traffic (configuration and function calls) and syncs use a much smaller fraction of that bandwidth.

What we know about GPUs on PCIe 3.0: 4 lanes bottleneck them, 8 lanes are enough, and the difference between 8 and 16 lanes is negligible. So in PCIe 4.0 terms: 4 lanes will now be enough, 8 will be more than adequate, and 16 will go essentially unused in the gaming space. For GPGPU work in compute-intensive HPC applications that don't use local memory much, I'm pretty sure they'll take advantage of anything you throw at them.
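A quick sketch of that lane math, using theoretical payload rates derived from the PCIe spec (real-world throughput is lower due to protocol overhead):

```python
# PCIe 3.0 runs at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with
# 128b/130b encoding; payload bandwidth scales linearly with lane count.

def pcie_gbs(gt_per_s: float, lanes: int) -> float:
    """Theoretical payload bandwidth in GB/s for a PCIe link."""
    return gt_per_s * lanes * (128 / 130) / 8  # payload bits -> bytes

for lanes in (4, 8, 16):
    print(f"x{lanes:<2}: PCIe 3.0 ~{pcie_gbs(8, lanes):.1f} GB/s, "
          f"PCIe 4.0 ~{pcie_gbs(16, lanes):.1f} GB/s")
```

Note that a PCIe 4.0 x4 link matches a 3.0 x8 link (~7.9 GB/s), which is exactly why 4.0 x4 should now be "enough" for a GPU.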
 
Some are still on an R9 290/390; there's no use going higher than an RX 480 or RX 560-590. If you need more memory, like the 11 GB GTX or 12 GB Titan cards, you must wait for PCIe 4.0 to arrive, and then for the PCIe 6.0 from Intel/AMD/Nvidia (Matrox?) that can handle that much data. Some are still on plain PCI at 133 MB/s. It takes months or years for game patches and CAD software to run at full PCIe speeds, just as it did with PCI and AGP 8x. Why not make an AGP 16x or 32x, then? So even with 16-48 GB we can't run faster than PCIe 3.0. We'll just have to wait for the tests on 7/7/2019 (dd/mm, EU style).
 
Disregarding the obvious troll is the best course of action here; he has been spewing RDNA nonstop in every AMD-related thread. Try normal reasoning and he will start calling names.

Btw, AMD can't just price their card equal to Nvidia's and call it a day; the 5700 needs to be at $300 to be viable. Even without the RTX and DLSS features, Nvidia has the performance advantage in almost all the games that people actually play. Check out the Steam game stats:
https://store.steampowered.com/stats/Steam-Game-and-Player-Statistics?l=english

A little digging and you will see that Nvidia performs better than AMD's counterpart in almost every game on this list (Polaris fares better in Rainbow Six, but Turing made sure to cover that).

I have been calling out Steve from TechSpot for his AMD-biased list of benchmark games. He has no problem removing Nvidia-favored games that literally have hundreds of thousands of players while keeping AMD-favored games that have zero players, or close to it; you might as well run synthetic benchmarks if you benchmark games that no one plays. Better to check out the TechPowerUp review, where you can see the 2070 Super easily surpasses the Radeon VII.
 
That video was made before the price cut was announced.


For me Navi needs to be cheaper than Super for three reasons:

1) First of all, there are no RTX features.

2) Secondly, and more importantly, the Navi cards use blower-style coolers, which historically are awful: loud and not very effective. I generally go out of my way to avoid buying cards like that, unless the card is such a good deal that it's worth buying and then switching out for an aftermarket cooler. Aftermarket cards tend to fix this, but on a stock-to-stock comparison I'm avoiding a blower.

3) Finally, Nvidia cards run G-Sync and FreeSync but AMD cards can only do FreeSync. Having more options is better.

If the 5700 XT performs the same as a 2060 Super then to me that's a win for the 2060 Super based on its features and cooling. Still, great stuff on the GPU wars side of things, just wish AMD could target the 2080 and 2080 Ti!


Greetings.
Your post is well written, but it is 100% subjective, with absolutely no objectivity. You are worried about secondary issues and placing importance on things that gamers do not care about.

1) Nobody cares about RTX features, because RTX's features are a flop. If you owned an RTX 2070 or 2080 you would know exactly what people are talking about. Turing just is not good at ray tracing and won't get any better.

2) Blower-style coolers CAN be noisy and are usually a concern. But AMD has already released an exploded diagram of their new blower, along with dB metrics. So you have zero reason for concern, for yourself or others, because this is not a traditional blower.

You are right that aftermarket cards usually have a better cooling solution than a stock card; that is why the 5700 series priced at $349 is a great deal, because aftermarket boards are going to use this card's potential. But stock for stock (AMD vs Nvidia), AMD has the 2060 Super beat in power consumption and performance.

3) Nvidia G-Sync is dead. That is why they were forced to start supporting an open standard. But just so you know, Nvidia's Turing cards can only do FreeSync, whereas all current AMD cards can do FreeSync 2.

Matter of fact, Samsung TVs come with FreeSync 2, the new Xbox is FreeSync 2, and the new PlayStation is FreeSync 2. (RDNA + FreeSync is essentially the new de facto gaming standard.) Everybody I know who is stuck on G-Sync is looking to get out, because all the new gaming monitors will be FreeSync 2. (My Acer X34 is for sale cheap, along with my RTX 2080.)


Lastly, you had a typo: the 5700 competes with the 2060 Super, and the 5700 XT (in leaks) is competing nicely with the 2070 Super. So as nicely written as your post was, it was full of fallacies.
 
Which games are the Nvidia-favored ones high on that list that Steve is no longer using, which show the 1060 beating the 590 and the 1050Ti beating the 570?
 
1) You seriously don't understand how PCIe buses work. AMD's OWN PEOPLE on PC World's Full Nerd podcast stated that the PCIe 4.0 bus will not lead to much if any boost in GPU performance, so gamers' should NOT count on that making Navi faster. Think I'm full of it? Knock yourself out:

2) *facepalm* RDNA consoles don't "use" GCN... THEY ARE GCN. "RDNA" is basically GCN 2.0, with specific tweaks made for IPC and gaming specific tasks. Of course the transition is effortless; it's effectively the same base architecture! And none of this matters because, AGAIN, every console developer has been using GCN since 2013, and that didn't lead to some meteoric rise of AMD dominance in the market. It certainly helped AMD, but nothing like you're describing.

Basically, you're drunk on buzz words and marketing kool-aid and should just stop.

1) LOL... did they mention that PCIe 4.0 will get in the way? Or be slower? Top gamers are never concerned with their top frames per second; they are only ever concerned with their lowest frames.

When you have a communication bus that is unfettered, it is better. Or can you find an argument against information traveling to and from your GPU twice as fast? Or explain how that isn't a selling point, and not a concern to you? (lol)

2) RDNA is a brand new architecture (uArch), not GCN. Your OPINION of what you think RDNA is doesn't matter, because we have whitepapers and facts. Navi is superSIMD, has dual wavefronts, etc. It can run GCN code on the fly alongside other code, because RDNA has advanced features. When code is written strictly for Navi under native RDNA, Navi will get faster as more future games come out.

What you are seeing is first-gen RDNA, with RDNA 2 coming this winter. RDNA is a 100% gaming chip, with every transistor meant for gamers. These are not hand-me-down CAD chips that Nvidia is fencing off as gamer chips.
 
Which games are the Nvidia-favored ones high on that list that Steve is no longer using, which show the 1060 beating the 590 and the 1050Ti beating the 570?

Let's see: GTA V (130,000 peak concurrent players), PUBG (700,000), War Thunder (15,000), No Man's Sky (6,000), Frostpunk (that game was dead, so it was better that it was removed, but I played it for over 200 hours; still one of the best strategy games of 2018).

Games that Steve still keeps in his mega benchmarks that no one plays: Sniper Elite 4, Strange Brigade, Wolfenstein 2, Deus Ex: Mankind Divided.

Obviously these are comparisons between the 1060 6GB vs RX 580, 1060 3GB vs RX 570, GTX 1070 vs Vega 56, and GTX 1080 vs Vega 64. The 1050 Ti and 1650 are meant for the gaming cafés popular in China and other Asian countries, where 20 computers are packed tightly into one room, so the RX 570 doesn't even compete there.
 
RDNA is a brand new architecture (uArch), not GCN. When code is written strictly for Navi under native RDNA, Navi will get faster as more future games come out. What you are seeing is first-gen RDNA, with RDNA 2 coming this winter.

The problem is these are all ideas, not FPS. In other words, they're all guesses with little relevance to actual performance. It *may* pan out, but it just as easily may not. And AMD's recent GPU history argues against a big performance increase, which is why most people are not anticipating anything too impressive. I really hope it is impressive, since we need competition for Nvidia (like Zen vs. Intel), but my hopes are not high.

Only one day left to wait!
 
Frostpunk (that game was dead, so it was better that it was removed, but I played it for over 200 hours; still one of the best strategy games of 2018).

BTW, thanks about Frostpunk; I've been hemming and hawing about getting it from GOG, and I'll add it to my list.
 
Games that Steve still keeps in his mega benchmarks that no one plays: Sniper Elite 4, Strange Brigade, Wolfenstein 2, Deus Ex: Mankind Divided.

The criteria for selecting games for a benchmark suite are not popularity alone. I just love it when people claim Steve is biased; meanwhile, he has given recent AMD cards lower scores than Nvidia's. No, I'd say he's pretty fair.
 
The criteria for selecting games for a benchmark suite are not popularity alone. I just love it when people claim Steve is biased; meanwhile, he has given recent AMD cards lower scores than Nvidia's. No, I'd say he's pretty fair.

Well, let's see TechSpot vs TechPowerUp on the 2070 Super review:
TechPowerUp: the 2070 Super is 3% faster than the Radeon VII at 1440p.
TechSpot: the 2070 Super is 4% slower than the Radeon VII at 1440p.

That is a swing of 7% from game selection alone. How do you think 7% affects the conclusion on the 2070 Super? Steve gave it a "meh" while TechPowerUp gave it an Editor's Choice Award. Also, what is the point of benchmarking real games when they are not indicative of real-world usage (i.e., games no one plays)? Might as well just use 3DMark.
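For what it's worth, the 7% figure checks out if you treat each review's number as a performance ratio against the Radeon VII:

```python
# Each review reports 2070 Super performance relative to the Radeon VII at 1440p.
tpu = 1.03  # TechPowerUp: 3% faster than the Radeon VII
ts = 0.96   # TechSpot: 4% slower than the Radeon VII

# Relative swing between the two game selections:
swing = tpu / ts - 1
print(f"swing: {swing:.1%}")  # prints "swing: 7.3%"
```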
 