AMD's Radeon RX 5600 XT takes the fight to Nvidia at $279

mongeese

What just happened? AMD has officially unveiled their “ultimate 1080p gaming” graphics card, the Radeon RX 5600 XT. The GPU packs 2304 cores and operates with a game clock of 1375 MHz and a boost clock of 1560 MHz. It’s priced competitively at $279 and will be released on January 21.

The Radeon RX 5600 XT is a direct challenger to Nvidia’s GTX 1660 Ti, which has 1536 cores shifting between a 1500 MHz base and 1770 MHz boost clock at the same MSRP. The two cards have an identical memory configuration. While the 1660 Ti has a definite edge in clock speeds, AMD reckons their core count advantage more than makes up for it.

On stage, AMD showed benchmarks from six games where the 5600 XT beat the 1660 Ti. Margins ranged from about 5% to 30%, with an average of 16%.

Performance in Frames Per Second
(AMD benchmark data)

Game                                     Nvidia GTX 1660 Ti   AMD RX 5600 XT
Call of Duty: Modern Warfare                             88               92
The Division 2                                           74               88
Gears of War 5                                           69               87
Apex Legends                                            118              124
Fortnite                                                111              126
World of Warcraft: Battle for Azeroth                   113              147
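If you want to sanity-check those margins, here's a minimal sketch in Python, using only the figures from AMD's slide above, that derives the per-game uplift and the average; the roughly 5-30% spread and ~16% mean fall straight out of the table.

```python
# Frame rates from AMD's benchmark slide: (GTX 1660 Ti, RX 5600 XT)
results = {
    "Call of Duty: Modern Warfare": (88, 92),
    "The Division 2": (74, 88),
    "Gears of War 5": (69, 87),
    "Apex Legends": (118, 124),
    "Fortnite": (111, 126),
    "World of Warcraft: Battle for Azeroth": (113, 147),
}

# Percentage uplift of the RX 5600 XT over the GTX 1660 Ti in each game
margins = {game: (amd / nv - 1) * 100 for game, (nv, amd) in results.items()}
for game, margin in margins.items():
    print(f"{game}: +{margin:.1f}%")

print(f"Average uplift: +{sum(margins.values()) / len(margins):.1f}%")  # ~16%
```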

Not much of that should surprise you if you've been following the leaks. But here's something that might: while AMD didn't mention it during their presentation, they've published a product page for the non-XT RX 5600 as well. It has the same clock speeds and memory as the 5600 XT, but its core count is cut down to 2048 shaders.

Like the non-XT 5500, the 5600 is for OEM systems only, which is reasonable given there’s not much room between the 5600 XT and 5500 XT. The specs place it roughly equivalent to the Nvidia GTX 1660 Super. We've asked AMD when it will become available, but we expect it to start appearing in pre-built systems when the 5600 XT launches later this month.

In the table below you can check out the specs for the entire first generation of desktop Navi. We don't have base clocks for the RX 5600 series, as AMD has yet to confirm them, but we've heard they're around 1130 MHz.

Model        Cores   Base Clock   Game Clock   Boost Clock   TFLOPs   Memory       Bandwidth   MSRP
RX 5700 XT   2560    1605 MHz     1755 MHz     1905 MHz      8.99     8GB          448 GB/s    $399
RX 5700      2304    1465 MHz     1625 MHz     1725 MHz      7.49     8GB          448 GB/s    $349
RX 5600 XT   2304    ?            1375 MHz     1560 MHz      6.34     6GB          288 GB/s    $279
RX 5600      2048    ?            1375 MHz     1560 MHz      5.63     6GB          288 GB/s    ?
RX 5500 XT   1408    1607 MHz     1717 MHz     1845 MHz      4.84     4GB or 8GB   224 GB/s    $169
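As a rough cross-check on that table, the TFLOPs column lines up with the usual shader count x 2 FLOPs per clock x game clock formula, and the bandwidth column with memory data rate x bus width. A quick sketch; note the 14 Gbps / 256-bit, 12 Gbps / 192-bit and 14 Gbps / 128-bit memory configurations used below are inferred from the bandwidth figures rather than stated in AMD's announcement.

```python
# (shader cores, game clock in MHz, memory data rate in Gbps, bus width in bits)
# Memory figures are inferred from the bandwidth column above, not official specs.
cards = {
    "RX 5700 XT": (2560, 1755, 14, 256),
    "RX 5700":    (2304, 1625, 14, 256),
    "RX 5600 XT": (2304, 1375, 12, 192),
    "RX 5600":    (2048, 1375, 12, 192),
    "RX 5500 XT": (1408, 1717, 14, 128),
}

for name, (cores, game_clock_mhz, gbps, bus_bits) in cards.items():
    tflops = cores * 2 * game_clock_mhz / 1e6   # 2 FLOPs per core per clock (FMA)
    bandwidth = gbps * bus_bits / 8             # GB/s
    print(f"{name}: {tflops:.2f} TFLOPs, {bandwidth:.0f} GB/s")
```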

AMD hasn't shown us a reference edition for the 5600 XT, so for now we're looking to AMD's board partners. There haven't been any announcements, but models from ASRock, Asus, Gigabyte, MSI, PowerColor, Sapphire and XFX did briefly appear in AMD's presentation, and there have been a few leaks. There'll be dual and triple fan configurations to tackle the 5600 XT's 150W TDP.

Stay tuned for announcements of specific models, and of course, our own benchmark tests come January 21.


 
6 months is not soon
I would recommend waiting either way, because new cards mean a price drop. Unless you are in desperate need of a graphics card, it's usually better to wait when things get this close. Frankly, since tech comes out so quickly, if you're always waiting for the next new thing you'll never upgrade.
 
I would recommend waiting either way, because new cards mean a price drop. Unless you are in desperate need of a graphics card, it's usually better to wait when things get this close. Frankly, since tech comes out so quickly, if you're always waiting for the next new thing you'll never upgrade.

Depends, do you play a lot of games? 6 months is a long time. In fact, it's past the Cyberpunk 2077 release.
 
Impressive. It's unfortunate it took this long to come out, but then again, maybe AMD will be quicker with their next gen lineup and release it in full in a timely manner. In all honesty, I just want to see positive consistency from them.
 
Depends, do you play a lot of games? 6 months is a long time. In fact, it's past the Cyberpunk 2077 release.
The only game I play reliably is Eve and that plays perfectly fine max settings @4k on my 1070ti. I'm not exactly a typical case study, but I've dropped settings before to make it through the next few months until a new release.

To be fair, I'm waiting for a card that can play Eve @8k60. I plan on buying an 8k projector, blowing it up to 100+ inches and creating my own "bridge". Don't get me wrong, my 65" QLED is perfectly fine, but the only reasons I will upgrade are either equipment failure or 8k60 Eve support.

Impressive. It's unfortunate it took this long to come out, but then again, maybe AMD will be quicker with their next gen lineup and release it in full in a timely manner. In all honesty, I just want to see positive consistency from them.
I agree, but I'm happy to see them trying. Perhaps we will see them do with their graphics division what they did with their CPUs. I would actually like to go AMD because their Linux drivers are better, but even with AMD's better Linux drivers, the performance is still in Nvidia's court.
 
The only game I play reliably is Eve and that plays perfectly fine max settings @4k on my 1070ti. I'm not exactly a typical case study, but I've dropped settings before to make it through the next few months until a new release.

To be fair, I'm waiting for a card that can play Eve @8k60. I plan on buying an 8k projector, blowing it up to 100+ inches and creating my own "bridge". Don't get me wrong, my 65" QLED is perfectly fine, but the only reasons I will upgrade are either equipment failure or 8k60 Eve support.


I agree, but I'm happy to see them trying. Perhaps we will see them do with their graphics division what they did with their CPUs. I would actually like to go AMD because their Linux drivers are better, but even with AMD's better Linux drivers, the performance is still in Nvidia's court.

Eve at 8K ought to be pretty sharp. That's a nice setup (or goal).
 
Buy when the game you want to play doesn't work - of course, this is based on your perception, so 60 FPS might not count as 'working' for you.
 
Eve at 8K ought to be pretty sharp. That's a nice setup (or goal).
Not so much sharp, but very large and clear. I remember watching Star trek:TNG in the 90s as a kid and wanting my own "starship" and "bridge".

I legitimately have been designing my own starship bridge since I built my first computer in '95. I finally did the 65" 4k thing in 2017 after waiting 23 years to build my own starship bridge, lol.
 
Do you think buying a GPU now is worth it? Because Nvidia's next series is coming soon, right? Considering that, should I wait?

Yes and no. It depends on the expected outcome and the level of the other major components in the computer. This is a mid-level card, so if you have a high-end system this is probably not an upgrade.
 
Not so much sharp, but very large and clear. I remember watching Star trek:TNG in the 90s as a kid and wanting my own "starship" and "bridge".
I legitimately have been designing my own starship bridge since I built my first computer in '95. I finally did the 65" 4k thing in 2017 after waiting 23 years to build my own starship bridge, lol.

Heh, cool beans @yRaz - I knew someone who built a 'Dalek' I think. I didn't know the show, might have been Dr. Who, idk. After checking into it, I was surprised by the level of detail and how much work it took to get it right. The finished replica looked great! Your bridge should be a ton of fun.
 
Heh, cool beans @yRaz - I knew someone who built a 'Dalek' I think. I didn't know the show, might have been Dr. Who, idk. After checking into it, I was surprised by the level of detail and how much work it took to get it right. The finished replica looked great! Your bridge should be a ton of fun.
Lol, you flatter me. It's more like a standing desk in front of a big TV. The 8k projector will only mean that my standing desk has to be positioned further away.
 
Do you think buying a GPU now is worth it? Because Nvidia's next series is coming soon, right? Considering that, should I wait?
Meh, the 3070/3080 come out in June maybe, "for data centers"; the rest of the cards, who knows. You can get a budget 5700 for $280 from Dell at the moment, and a lot of the time. I looked it up once, but if you really want to know, look at, say, the 2060 launch date relative to the 2080, 2070, etc. The time between Nvidia releases is two years-ish. I think it's about a year until we see a 3060.

GeForce RTX 2060 - January 15, 2019

Sooo I think I'm pretty damn close.
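That back-of-the-envelope estimate is easy to reproduce; here's a tiny sketch using the RTX 2060 launch date quoted above and the assumed "two years-ish" cadence (the announcement date below is just a rough stand-in for when the 5600 XT was revealed, not an official figure):

```python
from datetime import date, timedelta

rtx_2060_launch = date(2019, 1, 15)           # launch date quoted above
assumed_cadence = timedelta(days=2 * 365)     # the "two years-ish" assumption

estimated_3060 = rtx_2060_launch + assumed_cadence
announcement = date(2020, 1, 7)               # approx. 5600 XT reveal at CES 2020

print(estimated_3060)                          # mid-January 2021
print((estimated_3060 - announcement).days)    # ~370 days, i.e. about a year out
```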
 
Do you think buying a GPU now is worth it? Because Nvidia's next series is coming soon, right? Considering that, should I wait?
If what you have is good enough for about 6 months then you could wait and see. If not, you could just sell the GPU a year later and buy something new from AMD or NVIDIA. Both companies should have next gen cards by the end of the year.
 
I would recommend waiting either way, because new cards mean a price drop. Unless you are in desperate need of a graphics card, it's usually better to wait when things get this close. Frankly, since tech comes out so quickly, if you're always waiting for the next new thing you'll never upgrade.

If what you have is good enough for about 6 months then you could wait and see. If not, you could just sell the GPU a year later and buy something new from AMD or NVIDIA. Both companies should have next gen cards by the end of the year.
I currently have a 1050 Ti. I'm asking if I should wait, because if the next series comes out I can afford a good GPU at a lower price. I'm not in a rush to buy, because the only game I mostly play is Fortnite.
 
I currently have a 1050 Ti. I'm asking if I should wait, because if the next series comes out I can afford a good GPU at a lower price. I'm not in a rush to buy, because the only game I mostly play is Fortnite.
In that case, you can wait and see if anything changes by the end of the year. There's no need to rush.
 
Do you think buying a GPU now is worth it? Because Nvidia's next series is coming soon, right? Considering that, should I wait?

The problem is that you wait 6 months and then Nvidia drops a new card. Specs look good. But wait, then a new AMD card is "coming soon", so in another 6 months a newer, faster card hits the market. Wanna jump on that? Well, maybe not - in 6 months the next Nvidia card is coming...

No matter what you do, you will always be buying a card that is "outdated" within half a year. But you'll be happily playing games for years with whatever you choose, whenever you choose it. You'll always have "new GPU" envy as new units emerge, it's a fact of life in the PC gaming lifestyle we have chosen. The key is to buy a GPU when the time is right for you - when your current GPU is struggling and you need to modernize, or you want to move up a tier in performance, etc. If that window happens to be veeeery close to a release of new hardware, it might be worth holding on. But half a year away is a lifetime in computer hardware.
 
The RX 5600 XT is awfully "nVidia-like" with the cut down memory bus intentionally crippling performance. And 6GB in 2020? Seriously? While this is obviously faster than the RX 590 or RX 580 and more efficient, it just doesn't make much sense when those cards are available for around $200 with a full 8GB of RAM (albeit slower GDDR5) and a 256-bit bus.
 
I would recommend waiting either way, because new cards mean a price drop. Unless you are in desperate need of a graphics card, it's usually better to wait when things get this close. Frankly, since tech comes out so quickly, if you're always waiting for the next new thing you'll never upgrade.

I would wait too unless your games are unplayable. I have a GTX 1070 (I play at 1440p) and want to upgrade so I can run AA in DCS World, but I'm waiting until all the new stuff is out to see what happens.
 
The RX 5600 XT is awfully "nVidia-like" with the cut down memory bus intentionally crippling performance. And 6GB in 2020?
The amount of memory available is linked to the number of memory controllers - the full Navi 10 chip has four 64-bit controllers (or eight 32-bit ones, depending on how you want to look at it).

The RX 5600 XT only has 3 functional controllers; the other one has been disabled either due to defects during the manufacturing process or due to a design choice, to make the product fit its market sector better.

Since each controller is 64 bits wide, the attached memory modules cannot exceed 64 bits in total. All manufacturers of GDDR6 produce 32-bit modules, and the ones used on the current Navi range are 8 Gbit in size (organized as 256M x 32). So each controller manages two modules, which equates to 2 GB in total per controller.

The likes of Samsung do produce 16 Gbit (512M x 32) modules, so in theory the RX 5700 XT could have up to 16 GB of GDDR6, but for cost reasons in the desktop market AMD uses 8 Gbit ones:

RX 5700 XT = 4 controllers x 2 modules x 8 Gbit = 8192 MB or 8 GB
RX 5600 XT = 3 controllers x 2 modules x 8 Gbit = 6144 MB or 6 GB
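If it helps, that capacity arithmetic can be written out as a tiny sketch, assuming two 32-bit, 8 Gbit (256M x 32) GDDR6 modules hanging off each 64-bit controller:

```python
MODULES_PER_CONTROLLER = 2   # two 32-bit modules per 64-bit controller

def vram_gb(controllers, module_gbit=8):
    """Total VRAM in GB for a given number of active 64-bit controllers."""
    return controllers * MODULES_PER_CONTROLLER * module_gbit / 8  # Gbit -> GB

print(vram_gb(4))                   # RX 5700 XT, 4 controllers -> 8.0 GB
print(vram_gb(3))                   # RX 5600 XT, 3 controllers -> 6.0 GB
print(vram_gb(4, module_gbit=16))   # with 16 Gbit modules -> 16.0 GB
```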
 
So Vega 56-like performance for Vega 56-like prices, minus 2GB of VRAM, the overclockability and the driver stability, in exchange for a little efficiency. AMD might as well use all their 7nm capacity to make cheaper CPUs...
 