New AMD Radeon RX 9060 XT info reveals high boost clocks, possible price tiers

Daniel Sims

Editor's take: AMD's Radeon RX 9060 XT is expected to launch in about a week, but pricing remains unclear. Early Amazon listings suggest it won't undercut Nvidia's competing RTX 5060 Ti, though the leak likely doesn't reflect AMD's intended MSRP. Regardless, the upcoming mainstream GPU is expected to face the same price inflation that has affected every new graphics card released this year (and last year, and so on).

A VideoCardz reader recently spotted two XFX Swift graphics cards based on the upcoming Radeon RX 9060 XT listed on Amazon. Priced at $449 for the 8GB model and $519 for the 16GB version, the now-removed listings point to pricing comparable to Nvidia's RTX 5060 Ti.

For comparison, Nvidia gave the 8GB RTX 5060 Ti a $380 MSRP and lists the 16GB version for $430, but the lowest prices currently available on Newegg are around $420 and $480, respectively.

AMD has yet to confirm official pricing for either RX 9060 XT variant, and the XFX Swift listings are likely inflated, though it's unclear by how much.

Meanwhile, leaker @momomo_us spotted a pair of Swiss listings for the RX 9060 XT. The 8GB and 16GB versions were listed at 508 CHF and 556 CHF, respectively, placing the higher-end model roughly 25 percent below the RX 9070.

While these are also unlikely to reflect the final MSRP, they suggest a wider pricing gap between AMD's tiers than between the RTX 5060 Ti and 5070.

Aside from pricing, the RX 9060 XT listings mention fan configurations and clock speeds. The XFX Swift models feature a 3,320 MHz boost clock, aligning with earlier leaks that pointed to speeds above 3 GHz. The 8GB version appears to include a dual-fan design.

The RX 9060 XT is expected to use AMD's Navi 44 GPU, featuring 32 compute units and 20 Gbps GDDR6 VRAM on a 128-bit bus. System builders will likely need a power supply rated for at least 500 W.

Reports indicate that Team Red plans to launch the graphics card at the start of Computex on May 18. Meanwhile, Nvidia is expected to unveil the standard RTX 5060 the following day and has already confirmed a $300 MSRP. Little is known about the standard RX 9060, which would compete against it.

Intel may also debut the Arc B770 at Computex. While it's long been unclear whether Intel planned to release Battlemage GPUs beyond the entry-level (but surprisingly capable) B580, rumors now suggest the B770 could resemble mid-range offerings like the RTX 4070 Ti Super or 4080.

If AMD launch an 8GB card at $450, there’s absolutely no WAY TechSpot gives it a higher score than the 8GB card nVidia launched at $380.


…right?
No. But also, there are always places on the internet that put up placeholder prices, and everybody tends to forget this at every launch. We also live in turbulent times; maybe sellers are trying to price in possible US tariffs, and the rest of the world will get a lower MSRP.

The RX 480 8GB was $229 nine years ago. $450 for an 8GB card today is a complete ripoff and an insult.
That was a long time ago, and AMD was selling them at the bare minimum margin, if I remember. But if their costs skyrocketed, like some claim, I would just skip launching a product that's going to be uncompetitive with used cards, call it a day, and move on to trading Chinese lanterns or whatever. Just admit that personal computer entertainment is ending, and don't drag that dead body around.
 
If AMD launch an 8GB card at $450, there’s absolutely no WAY TechSpot gives it a higher score than the 8GB card nVidia launched at $380.


…right?
Hmm, my perception of Techspot is that they let AMD get away with a lot of things but not Nvidia, and I think that's because they want to help the underdog. Remember, AMD only sells one GPU for every nine Nvidia sells. I'm not sure if that's professional, but I understand why they are like that. So I wouldn't be surprised if they don't criticise AMD for selling an 8GB variant.

Personally, I don't get that hung up about the 8GB thing. If it makes the card cheaper, then it's probably a good thing, as it will give more gamers access. If it's for 1080p, then it really doesn't need more than 8GB. If you buy an 8GB card to play at 4K, then you're dumb for not researching a product you pay hundreds of dollars for, and you deserve the bad performance you get.

Besides, Techspot opposing 8GB cards is a new thing. They were recommending 8GB cards just last year, according to some of their articles...
 
Hmm, my perception of Techspot is that they let AMD get away with a lot of things but not Nvidia, and I think that's because they want to help the underdog. Remember, AMD only sells one GPU for every nine Nvidia sells. I'm not sure if that's professional, but I understand why they are like that. So I wouldn't be surprised if they don't criticise AMD for selling an 8GB variant.

Personally, I don't get that hung up about the 8GB thing. If it makes the card cheaper, then it's probably a good thing, as it will give more gamers access. If it's for 1080p, then it really doesn't need more than 8GB. If you buy an 8GB card to play at 4K, then you're dumb for not researching a product you pay hundreds of dollars for, and you deserve the bad performance you get.

Besides, Techspot opposing 8GB cards is a new thing. They were recommending 8GB cards just last year, according to some of their articles...


Given that the top 100 most active games will run perfectly fine on an x60/50 card, I wonder if the vast majority of gamers actually need more than 8GB of VRAM.
 
Given that the top 100 most active games will run perfectly fine on an x60/50 card, I wonder if the vast majority of gamers actually need more than 8GB of VRAM.
Of course not, but it's an addiction. I'm still happy with Skyrim's graphics. Most of the games I play are from the 2000s era, but even then, I'm not a graphics fiend.
 
Given that the top 100 most active games will run perfectly fine on an x60/50 card, I wonder if the vast majority of gamers actually need more than 8GB of VRAM.
From what I gather, it's a small but growing number of games that need more than 8GB of VRAM, and only at max settings at 1440p or 4K. I definitely noticed when I couldn't run the HD texture pack on Far Cry 6 because my card only has 8GB, but I then went on to finish and enjoy that game anyway.

Techspot seems to be on a bit of a vendetta against 8GB cards, and they are doing so to make the case for consumers, which I'm all for. But if you are on a tight budget or simply don't want to spend too much, a second-hand modern 8GB card will do just fine for almost any game.


lol ok.

The vast majority of games are old games. If you want to play new stuff at high res, yes, you need more than 8GB for many games.
You don't "need" more than 8GB to play new games. For a small handful of them, you need more than 8GB to run them at high or max settings; that's it. There's no game you can't play on an 8GB card. I have an 8GB card (3070 Ti), tell me what game I can't play!
 
From what I gather, it's a small but growing number of games that need more than 8GB of VRAM, and only at max settings at 1440p or 4K. I definitely noticed when I couldn't run the HD texture pack on Far Cry 6 because my card only has 8GB, but I then went on to finish and enjoy that game anyway.

Techspot seems to be on a bit of a vendetta against 8GB cards, and they are doing so to make the case for consumers, which I'm all for. But if you are on a tight budget or simply don't want to spend too much, a second-hand modern 8GB card will do just fine for almost any game.



You don't "need" more than 8GB to play new games. For a small handful of them, you need more than 8GB to run them at high or max settings; that's it. There's no game you can't play on an 8GB card. I have an 8GB card (3070 Ti), tell me what game I can't play!

Sure you can play games on an 8GB card.

People who like to play new and cutting-edge games aren't doing it to play on low settings at 1080p.

And then the debate would shift to how much 8GB cards are holding back devs, since they aren't going to release games that can't run on the hardware gamers actually own. 8GB became the standard five generations ago with the Nvidia 10 series, but now Nvidia is stagnating on VRAM because they don't want people to be able to use DLSS to avoid upgrades.
 
You don't "need" more than 8GB to play new games. For a small handful of them, you need more than 8GB to run them at high or max settings; that's it. There's no game you can't play on an 8GB card. I have an 8GB card (3070 Ti), tell me what game I can't play!
GTA6
 
When the vast majority of the top 100 most active games will run perfectly fine on any x60/50 card, I start to wonder how much gamers actually need more than 8GB of VRAM.

Of course not, but it's an addiction. I'm still happy with Skyrim's graphics. Most of the games I play are from the 2000s era, but even then, I'm not a graphics fiend.

Yeah, I can second that. I just picked up a new Minecraft modpack today that runs at 60 fps on an i5-1260P iGPU at 1080p.

Don’t let publishers/manufacturers bully you into spending money for the sake of spending money.
 
Having enough VRAM to play modern games at high res and decent FPS is an addiction... wut?

I think you are addicted to bad takes.


I think he’s addicted to not lining the pockets of mega-billion-dollar corporations. “Decent FPS” and “high res” is marketing.
 
I don't know why people continue to argue this point...
There are multiple options and configurations available at multiple (if elevated) price points...
You all get choices, so stop ragging on other people's choices, which for the most part are based on their budget and needs, and be happy with your own choices and options...

Personally I'm looking forward to the review of the RX 9060 XT 16GB to see how much of a performance upgrade I MIGHT get if I upgrade from my current ASUS 3060 Ti 8GB card...
 
This just in!... Companies will sell a product for as much as the market will bear. This whining is getting old, and trying to publicly shame them won't change a damn thing.
 
I think this is part of planned obsolescence, because in two to three years' time, if we're still alive, 8GB cards will have a tough time on the secondary market and will cease to have a purpose, ensuring everyone is moved forward with the times.
 
I think he’s addicted to not lining the pockets of mega-billion-dollar corporations. “Decent FPS” and “high res” is marketing.
Lol wow that is such a stupid thing to say.

Higher FPS and higher res aren't marketing. They're very real, tangible aspects of gaming.
 
I think he’s addicted to not lining the pockets of mega-billion-dollar corporations. “Decent FPS” and “high res” is marketing.
Lol wow, that is such a stupid thing to say.

Higher FPS and higher res aren't marketing. They're very real, tangible aspects of gaming.

No no no, he's totally right. Games don't need more than 320x200 at 30fps. CGA-compatible of course. All other specs are just greedy companies trying to fleece you.
 
Hmm, my perception of Techspot is that they let AMD get away with a lot of things but not Nvidia, and I think that's because they want to help the underdog. Remember, AMD only sells one GPU for every nine Nvidia sells. I'm not sure if that's professional, but I understand why they are like that. So I wouldn't be surprised if they don't criticise AMD for selling an 8GB variant.
Funny, because Intel sells more GPUs than AMD and Nvidia combined. Your data, as usual, is total BS.
 
My data comes from a credible source. Your data is literally fantasised in your head.

Why are you so emotionally attached to AMD?
OK, give me that "credible" source. That source is wrong anyway so use better ones.

Here is one: https://www.jonpeddie.com/news/q424-pc-gpu-shipments-increased-by-4-4-from-last-quarter/

Intel: 65%
AMD: 18%
Nvidia: 16%

FYI, Intel has sold more than AMD and Nvidia combined for over a decade in a row. And before you start to say your share is for discrete GPUs only, notice that desktop GPU share is only 31% and laptops 69%. Btw, every Nvidia laptop GPU is discrete, whereas most AMD and Intel ones are integrated.

I don't think there is any need for your "source".
 
OK, give me that "credible" source. That source is wrong anyway so use better ones.

Here is one: https://www.jonpeddie.com/news/q424-pc-gpu-shipments-increased-by-4-4-from-last-quarter/

Intel: 65%
AMD: 18%
Nvidia: 16%

FYI, Intel has sold more than AMD and Nvidia combined for over a decade in a row. And before you start to say your share is for discrete GPUs only, notice that desktop GPU share is only 31% and laptops 69%. Btw, every Nvidia laptop GPU is discrete, whereas most AMD and Intel ones are integrated.

I don't think there is any need for your "source".
I'm sorry if you genuinely think that source discredits the Steam survey. It emphatically doesn't. It's a limited snip of data from one market on one product. You know full well it doesn't prove anything; you are just desperate to make AMD look better. It's extremely transparent.

Can you answer my question: why are you emotionally attached to AMD? Like, why do you twist reality to make AMD look better than they actually are?

I have asked you that question three times on this forum. Other forums have accused you of the same thing and you just ignore it. Where does your bias originate?
 
I'm sorry if you genuinely think that source discredits the Steam survey. It emphatically doesn't. It's a limited snip of data from one market on one product. You know full well it doesn't prove anything; you are just desperate to make AMD look better. It's extremely transparent.
Wtf are you talking about? Those statistics are about the actual number of GPUs shipped per manufacturer! Actual total units shipped.

Steam survey 🤦‍♀️
Can you answer my question: why are you emotionally attached to AMD? Like, why do you twist reality to make AMD look better than they actually are?

I have asked you that question three times on this forum. Other forums have accused you of the same thing and you just ignore it. Where does your bias originate?
How about sticking to the facts? Of course, when you don't have anything else, you go for the personal stuff. Sorry, but I have too much experience.

Your source seems to be missing btw.
 