AMD slashes prices of Radeon RX 5700 series ahead of release

Well, let's see TechSpot vs TechPowerUp in their 2070 Super reviews:
TechPowerUp: the 2070 Super is 3% faster than the Radeon VII at 1440p
TechSpot: the 2070 Super is 4% slower than the Radeon VII at 1440p

That is a swing of 7% due to game selection alone. How do you think 7% affects the conclusion on the 2070 Super? Steve gave it a meh while TechPowerUp gave it an Editor's Choice Award. Also, what is the point of benchmarking real games when they are not indicative of real-world usage (as in, games no one plays)? Might as well just use 3DMark.
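That swing is easy to reproduce: averaging the same per-game results over two different game lists can flip which card "wins". A minimal sketch, with every number invented purely for illustration (these are not the outlets' actual figures):

```python
# Hypothetical 2070 Super vs Radeon VII per-game performance ratios at 1440p
# (1.00 = parity; all numbers are made up for illustration only).
ratios = {
    "Game A": 1.10,  # NVIDIA-leaning title
    "Game B": 1.06,
    "Game C": 0.97,
    "Game D": 0.94,  # AMD-leaning title
    "Game E": 0.92,
}

def geomean(values):
    """Geometric mean, the usual way reviewers average relative FPS."""
    product = 1.0
    for v in values:
        product *= v
    return product ** (1.0 / len(values))

# Two outlets test different subsets of games and reach opposite conclusions.
outlet_one = geomean([ratios[g] for g in ("Game A", "Game B", "Game C")])
outlet_two = geomean([ratios[g] for g in ("Game C", "Game D", "Game E")])

print(f"Outlet 1: {outlet_one - 1:+.1%}")  # 2070 Super ahead
print(f"Outlet 2: {outlet_two - 1:+.1%}")  # Radeon VII ahead
```

Same hardware, same per-game numbers; only the list changed, and the headline result flipped sign.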

Games should be selected based upon how aggressively they stress the hardware, not what's popular. League of Legends is popular, but using it to test GPUs would be *****ic.

So, you're mad at Steve because unlike almost every other publication which is just happy to take whatever nVidia gives us at whatever price and beg for more, Steve's like, "Meh, it performs better, but it's still the same inflated RTX 20xx series pricing we've had for over a year, so nothing new here."

One of the reasons I read TechSpot is that Steve DOES take both MSRP and street price into account in his work. Not all of us appreciate nVidia suddenly making the xx60 series a $350 card instead of a $250 card, nor do we have the finances to just shrug and take it.
 
1) LOL... did they mention that PCIe 4.0 will get in the way..? Or be slower..? Top gamers are never concerned with their top frames per second; they are only ever concerned with their lowest frames.

When you have something (a communication bus) that is unfettered, it is better. Or can you find an argument against information moving to and from your GPU twice as fast..? Or explain how that isn't a selling point.. and not a concern to you..? (lol)

...what?

AMD: "PCIe 4.0 will not provide an appreciable increase in GPU performance this generation, so gamers shouldn't go for X570 or Navi with this in mind."

I'll be sure to let AMD know that m3tavision disagrees and thinks a GPU on a PCIe 4.0 lane will be twice as fast. After all, he knows WAY better than the people who actually designed the hardware.
 
Well, let's see TechSpot vs TechPowerUp in their 2070 Super reviews:
TechPowerUp: the 2070 Super is 3% faster than the Radeon VII at 1440p
TechSpot: the 2070 Super is 4% slower than the Radeon VII at 1440p

That is a swing of 7% due to game selection alone. How do you think 7% affects the conclusion on the 2070 Super? Steve gave it a meh while TechPowerUp gave it an Editor's Choice Award. Also, what is the point of benchmarking real games when they are not indicative of real-world usage (as in, games no one plays)? Might as well just use 3DMark.

Go look at any other product review; that level of variation is normal based on the games reviewed, differences in silicon quality, margin of error, etc.

Looking at TechSpot's review, there are quite a few popular games on the docket.

https://www.techspot.com/review/1865-geforce-rtx-super/

Fortnite, Rainbow Six Siege, Metro Exodus, Resident Evil 2, etc. Fortnite alone has more concurrent players than every other game you've listed. And no, games that don't have high concurrent player counts are not worthless. Just because a game doesn't fit what you like doesn't mean it isn't applicable for benchmarking. This isn't a popularity contest, it's benchmarking.

The Super series does deserve a meh TBH. It's finally giving us the minimum of what we should have gotten a year ago.
 
1) Please go back and read. Don't pretend to know; actually read and comprehend. PCIe 4.0 operates at twice the speed.

You do not need to saturate the PCIe bus to gain the advantage of information traveling at twice the speed. And do understand, it isn't twice the speed for only your GPU, but for everything on the PCIe 4.0 bus. Even your M.2 drives (which I am sure you have seen the throughputs on by now...).
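For what it's worth, the raw link bandwidth really does double between generations. A quick back-of-the-envelope sketch, using the per-lane transfer rates and 128b/130b encoding from the PCIe 3.0/4.0 specifications (per-direction figures for an x16 link):

```python
# Per-direction bandwidth of a PCIe link:
# lanes * transfer rate (GT/s) * encoding efficiency / 8 bits-per-byte.
def pcie_bandwidth_gbs(gen, lanes=16):
    # (GT/s per lane, encoding efficiency) per generation, per the PCIe specs.
    specs = {3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}
    rate, encoding = specs[gen]
    return lanes * rate * encoding / 8  # GB/s, one direction

gen3 = pcie_bandwidth_gbs(3)  # ~15.75 GB/s for x16
gen4 = pcie_bandwidth_gbs(4)  # ~31.51 GB/s for x16
print(f"PCIe 3.0 x16: {gen3:.2f} GB/s, PCIe 4.0 x16: {gen4:.2f} GB/s")
```

Whether a given GPU actually uses that extra headroom is a separate question, which is the point of contention in this thread.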

2) GCN is old news now and RDNA is the future. RDNA can still run GCN code, so if a developer is currently developing a game that will release on the brand-new Xbox and/or PlayStation, those consoles (which feature only RDNA) will still be able to use GCN. Which means RDNA is backwards compatible and doesn't hinder developers who will be transitioning from GCN to RDNA.

As one developer has already told us, the transition is quite effortless once you get working with RDNA.

So, since every game on the Xbox and PlayStation will be using RDNA hardware, nearly all developers will be familiar with it, given time. And seeing that RDNA will be in a billion people's hands within a few years' time.. there is no harm in using it.

1) You seriously don't understand how PCIe buses work. AMD's OWN PEOPLE on PC World's Full Nerd podcast stated that the PCIe 4.0 bus will not lead to much, if any, boost in GPU performance, so gamers should NOT count on it making Navi faster. Think I'm full of it? Knock yourself out:

2) *facepalm* RDNA consoles don't "use" GCN... THEY ARE GCN. "RDNA" is basically GCN 2.0, with specific tweaks made for IPC and gaming-specific tasks. Of course the transition is effortless; it's effectively the same base architecture! And none of this matters because, AGAIN, every console developer has been using GCN since 2013, and that didn't lead to some meteoric rise of AMD dominance in the market. It certainly helped AMD, but nothing like you're describing.

Basically, you're drunk on buzz words and marketing kool-aid and should just stop.
I agree that PCIe 4.0 won't bring any performance benefits for midrange GPUs in games (although there will be a few workloads that do benefit from it), but I have to strongly disagree with you about Navi just being GCN 2.0 --> this is completely wrong. RDNA is a new architecture that made huge changes compared to GCN.
This video from Gamers Nexus, which I've posted before, should give you a few of the technical details about what AMD did with this new architecture:
https://www.youtube.com/watch?v=V7wwDnp8p6Y
 
RDNA is a new architecture that made huge changes compared to GCN.

It's not an entirely new architecture, but rather an evolution of GCN specifically tailored for gaming. Vega was AMD effectively trying to make a compute card work for gaming, whereas RDNA is GCN's core foundation evolved for higher IPC and gaming-specific tasks. I'm not saying it's exactly the same as GCN, or just a faster version of it, or that they haven't made significant improvements, but it's still built with the same compute engine as GCN and the same building blocks taken in a different direction.

Either way, it's not a dig at AMD (no sense in throwing out things that work for the sake of newness).
 
Easiest is to just get both and run benchmarks when PCIe 4.0-6.0 comes out. No contest then on who gets the fastest FPS. Just LIKE both: AMD and Nvidia GPUs, motherboards, DDR5 memory.
No, there is one day left until release, 7/7-2019 (dd/mm/yyyy). Go and get a life. Now it will be revealed. Now it's out in stores. So go 8K it with a PCIe 4.0 GPU and lots of RAM. https://prisguiden.no/sok?f[c][]=10&q=x570&s=price asc
 
Well, let's see TechSpot vs TechPowerUp in their 2070 Super reviews:
TechPowerUp: the 2070 Super is 3% faster than the Radeon VII at 1440p
TechSpot: the 2070 Super is 4% slower than the Radeon VII at 1440p

That is a swing of 7% due to game selection alone. How do you think 7% affects the conclusion on the 2070 Super? Steve gave it a meh while TechPowerUp gave it an Editor's Choice Award.

From the Techspot review:

"Thankfully the 2070 Super is better value than the 2070. An 8% discount per frame is not exactly a cause for celebration but it does manage to one-punch knock out the Radeon VII and RTX 2080 graphics cards, so there’s that."

Techspot places value for your money as the final arbiter of every product, whether it's a $120 RX 570 or a $1200 RTX 2080Ti. As Steve has said many times, there are no bad products, just bad prices. In this case, 2 years later the 2070 was barely better than the 1080 price point it replaced. That's the definition of 'meh'. The 2070 Super is an 8% discount per frame over that product, which makes it all of 8% better than meh.
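That "discount per frame" metric is just price divided by average FPS. A toy sketch, with placeholder FPS numbers chosen only so the ratio mirrors the review's ~8% figure (these are not the review's actual data):

```python
# Cost per frame = price / average FPS.
# The FPS figures below are invented for illustration, not the review's data.
def cost_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

rtx_2070 = cost_per_frame(500, 100.0)        # assumed baseline: $5.00 per frame
rtx_2070_super = cost_per_frame(500, 109.0)  # ~9% faster at the same price

discount = 1 - rtx_2070_super / rtx_2070
print(f"Discount per frame: {discount:.0%}")  # ~8%
```

Same price, slightly more frames: each frame gets a little cheaper, but the absolute price tier hasn't moved at all.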

Whoopty-doo.
 
Games should be selected based upon how aggressively they stress the hardware, not what's popular. League of Legends is popular, but using it to test GPUs would be *****ic.

So, you're mad at Steve because unlike almost every other publication which is just happy to take whatever nVidia gives us at whatever price and beg for more, Steve's like, "Meh, it performs better, but it's still the same inflated RTX 20xx series pricing we've had for over a year, so nothing new here."

One of the reasons I read TechSpot is that Steve DOES take both MSRP and street price into account in his work. Not all of us appreciate nVidia suddenly making the xx60 series a $350 card instead of a $250 card, nor do we have the finances to just shrug and take it.

Yeah, the hugely popular Nvidia-favored games that Steve removed heavily stress the hardware too; how do you explain that? Last I checked, GTA V, PUBG, and No Man's Sky are quite demanding. I own a 2080 Ti and only use medium settings in PUBG since every FPS counts.

Now, if you read a review that is full of synthetic benchmarks, would you bother reading it? The same goes for benchmarking games no one cares about; might as well get rid of them.

I disagree with choosing games based on how aggressively they stress high-end hardware. Just because CSGO, LoL, DOTA, Overwatch, World of Tanks, War Thunder, RUST, ARK: Survival Evolved, Rocket League, FF XIV Online, etc... don't stress your high-end GPU enough doesn't mean they don't stress lower-tier cards like the 1650 or RX 570, and these games also require high FPS. I didn't see Steve benchmarking these games in his 1650 review.

It's always the games that drive hardware sales, not the other way around. That easily explains why Nvidia dominates the Steam GPU chart: because Steam is dominated by pro-Nvidia games lol.

I couldn't care less how Steve's 2070 Super conclusion differs from TechPowerUp's; that was just an example of how different lists of games affect the overall score. As a gamer and a hardware enthusiast I find Steve's choice of games to slightly misrepresent real-world performance, and I had been a fan of his reviews until now.


From the Techspot review:

"Thankfully the 2070 Super is better value than the 2070. An 8% discount per frame is not exactly a cause for celebration but it does manage to one-punch knock out the Radeon VII and RTX 2080 graphics cards, so there’s that."

Techspot places value for your money as the final arbiter of every product, whether it's a $120 RX 570 or a $1200 RTX 2080Ti. As Steve has said many times, there are no bad products, just bad prices. In this case, 2 years later the 2070 was barely better than the 1080 price point it replaced. That's the definition of 'meh'. The 2070 Super is an 8% discount per frame over that product, which makes it all of 8% better than meh.

Whoopty-doo.

Now remember, when the GTX 1080/1070 FE came out they were priced at $700/$450, and Steve gave them 100/100 at those prices: https://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/ https://www.techspot.com/review/1182-nvidia-geforce-gtx-1070/ (I assure you there were no AIBs that sold below $700/$450 for the 1080/1070.)

If you compare the RTX 2070 to the GTX 1080/1070 at their original prices, the $500 RTX 2070 is 7% faster than a perfect-score, two-year-old $700 card and 27% faster than the perfect-score, two-year-old $450 card, yet it received a meh score. Now the $500 2070 Super is around 28% faster than the $500 GTX 1080 after its price drop in March 2017. That is the standard performance gain per generation, sans the additional features that RTX cards came with.

A 30% improvement per generation has always been the de facto standard in the GPU world as far as I remember. Pascal was an exception, since GPU manufacturing had been stuck at 28nm for 4 years before transitioning to 16nm, and AMD has only now caught up with Pascal lol.
 
It's not an entirely new architecture, but rather an evolution of GCN specifically tailored for gaming. Vega was AMD effectively trying to make a compute card work for gaming, whereas RDNA is GCN's core foundation evolved for higher IPC and gaming-specific tasks. I'm not saying it's exactly the same as GCN, or just a faster version of it, or that they haven't made significant improvements, but it's still built with the same compute engine as GCN and the same building blocks taken in a different direction.

Either way, it's not a dig at AMD (no sense in throwing out things that work for the sake of newness).
Keeping some building blocks from the older architecture is nothing new; everybody does it. The compute engine of GCN is really good. AMD also built RDNA with consoles in mind; more specifically, they had to ensure that the software stack is backwards compatible.
Even so, the compute units were changed compared to GCN. Vega's compute units have their stream processors in four batches of eight CUs, across two shader engines. Navi squeezes pairs of CUs together to create the Dual Compute Unit (DCU), with many changes under the hood.
For more details about these changes, you can read here:
https://www.pcgamesn.com/amd/navi-rdna-architecture-release-date-specs-performance

On that note, it really is a new architecture. They reworked many of the subsystems and I'm actually surprised that they managed to do so much and still keep full backwards compatibility.
 
just because CSGO, LoL, DOTA, Overwatch, World of Tanks, War Thunder, RUST, ARK: Survival Evolved, Rocket League, FF XIV Online, etc... don't stress your high-end GPU enough, they sure stress lower-tier cards like the 1650 or RX 570, and these games also require high FPS. I didn't see Steve benchmarking these games in his 1650 review.

That would be because, with games like these, the CPU becomes the limiting factor for FPS, not the GPU. Because, again, they're not stressing the GPU, it sits partly idle waiting on the CPU to deliver draw calls, and the whole process becomes CPU-bound. So, yeah, Steve doesn't test them because he understands this and you clearly don't.
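That bottleneck logic can be sketched as a toy model, with per-frame timings invented purely for illustration: the frame rate is set by whichever stage, CPU or GPU, takes longer per frame.

```python
# Simplified pipeline model: each frame needs CPU work (draw-call submission)
# and GPU work (rendering); the slower of the two stages sets the frame rate.
# All millisecond figures are hypothetical.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

# Esports-style title: light GPU load, so a faster GPU changes nothing.
print(fps(cpu_ms=4.0, gpu_ms=2.0))  # 250 FPS, CPU-bound
print(fps(cpu_ms=4.0, gpu_ms=1.0))  # still 250 FPS despite a 2x faster GPU

# Demanding title: the GPU is the bottleneck, so GPU speed matters.
print(fps(cpu_ms=4.0, gpu_ms=10.0))  # 100 FPS, GPU-bound
```

In the CPU-bound case, every GPU from an RX 570 to a 2080 Ti produces the same number, which is why such games make poor GPU benchmarks.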

And you're right, the Super cards are what Turing should have launched on day one. Instead, we got mediocre cut down versions at the same price. Again, it took AMD turning up the heat on nVidia to make nVidia actually step up and do right by consumers. So, again, not sure how you're painting Steve as the bad one here and nVidia as the victim.

Look, you bought a 2080 Ti. You spent $1,300 on a graphics card, so clearly we're not going to see eye to eye on value, and you're just looking for Steve to validate your purchase.
 
Of course, Nvidia are just waiting on AMD's next move. Then they'll release cards with 64 GB of GDDR7 RAM, or 128-512 GB up to 1 TB, and so on. If PCIe 5.0-6.0 doubles bandwidth again, they can beat AMD in most cases; they'd just need an Intel PCIe motherboard and better DDR5/6/7 RAM.
UPD
Now that PCIe 4.0 motherboards are out, we just have to get one of each (motherboard, GPU, RAM, SSD and so on) and prepare for 5K, 6K, 8K. We were told 128K would be the new IN. Going forward we're going to need more bandwidth on LAN fiber too; a new motherboard does about 2.5 Gb up/down.
The prices for the new components from AMD are not bad at all. I'm still using an i7-7740X CPU.
 
I wish medicine were as competitive as the GPU market. Here we have only 2 corporations, and they are fighting to the death on prices and quality. In medicine we have thousands of corporations, but medicine is getting more and more expensive every year, without any major breakthroughs in decades.

Maybe AMD, Nvidia and Intel have a recipe for making medicine as good and competitive as the CPU/GPU market...
 
That GPU in the pic has a dent in it.

It got gut punched by a Super series GPU.

Good news, but not unexpected. Nobody is buying a Navi card for the same price as an RTX one, which has ray tracing acceleration, unless it's significantly faster.

Even then, the majority will still go out and buy the Nvidia cards anyway. AMD have to win by a solid margin to sell big volumes. The 5700 XT was supposed (AMD claimed) to be a little bit faster than an RTX 2070, and an RTX 2060 Super is a tad slower than the standard 2070, as TechSpot tested this week.

By this fuzziest of comparisons the XT could be a decent buy at $399, but I have the feeling most will get FOMO about RTX and still fall for the 2060 Super.....
Really? People are buying Nvidia because of "ray tracing acceleration"?
For me, I'm not activating ray tracing, which can still kill more than 50% of your FPS (latest results in https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/), so I'm not going to pay a single buck for that, period.
 
Greetings.
Your post is well written, but 100% subjective, with absolutely no objectivity. You are worried about secondary issues and placing importance on things that gamers do not care about.

1) Nobody cares about RTX features, because RTX's features are a flop. If you owned an RTX 2070 or 2080, you would know exactly what people are talking about. Turing just is not good at ray tracing and won't get any better.

2) Blower-style coolers CAN be noisy and are usually a concern. But AMD has already released an exploded diagram of their new blower, along with dB metrics. So you have zero cause for concern, for yourself or others, because this is not a traditional blower.

You are right that aftermarket cards usually have better cooling solutions than stock cards; that is why the 5700 series priced @ $349 is a great deal, because the aftermarket is going to use this card's potential. But stock for stock (AMD vs Nvidia)… AMD has the 2060 Super beat in power consumption and performance.

3) Nvidia G-Sync is dead. That is why they were forced to start supporting an open standard. But... just so you know, Nvidia's Turing cards can only do FreeSync, whereas all current AMD cards can do FreeSync 2.0.

Matter of fact, Samsung TVs come with FreeSync 2, the new Xbox is FreeSync 2, the new PlayStation is FreeSync 2. (RDNA + FreeSync is essentially the new de facto gaming standard.) Everybody I know who is stuck on G-Sync is looking to get out, because all the new gaming monitors will be FreeSync 2.0. (My Acer X34 is for sale cheap, along with my RTX 2080.)


Lastly, you had a typo: the 5700 competes with the 2060 SUPER, while the 5700 XT (in leaks) is competing nicely with the 2070 Super. So, as nicely written as your post was, it was full of fallacies.
Wow. I’m not even going to bother with you. I’ve changed my preferences to “ignore” you from now on. I’ve read previous comments of yours, you aren’t rational, or objective and I dare say you are quite misinformed.

Find someone else to hassle.
 
Wow. I’m not even going to bother with you. I’ve changed my preferences to “ignore” you from now on. I’ve read previous comments of yours, you aren’t rational, or objective and I dare say you are quite misinformed.

Find someone else to hassle.

You find facts a hassle? Perhaps your reality distortion bubble is what is giving you the hassle.

RTX On is a joke. Your argument isn't with me, but the facts from every reviewer out there. "RTX On" = flop.
 