AMD Radeon 9060 XT details surface: 8GB & 16GB variants, 500W PSU recommendation

Daniel Sims

In a nutshell: Although AMD confirmed that the Radeon RX 9060 graphics cards are coming soon, the company has revealed almost nothing about them or any planned entry-level members of the RX 9000 lineup. While performance specifications remain unclear, a basic portrait of Team Red's new mainstream GPUs is starting to take shape.

VideoCardz claims to have received basic specifications for AMD's upcoming Radeon RX 9060 XT GPU from an unspecified board partner. The leak includes details on power, memory, and video output.

Prior reports indicated that the mainstream graphics card would have 8GB and 16GB variants, mirroring Nvidia's upcoming RTX 5060 Ti. Team Red's GPU continues to use GDDR6 VRAM, while Nvidia is upgrading nearly its entire stack to GDDR7. However, our RTX 5070 review indicates that the extra speed only makes a difference in limited scenarios. According to the new information, the RX 9060 XT's memory clocks at 20 Gbps on a 128-bit bus.
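For context, the bandwidth those two figures imply is simple arithmetic. A minimal sketch, assuming the standard effective-data-rate times bus-width formula (the inputs are the leaked numbers, not official specs):

```python
# Theoretical memory bandwidth from the leaked figures:
# 20 Gbps effective GDDR6 on a 128-bit bus.
data_rate_gbps = 20    # effective data rate per pin, in Gbps
bus_width_bits = 128   # leaked bus width

# GB/s = (Gbps per pin * number of pins) / 8 bits per byte
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 320 GB/s
```

That would be exactly half the 640 GB/s of the 256-bit RX 9070 series, consistent with a cut-down mainstream part.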

Although the card's exact power requirements are unclear, a 500W power supply is recommended. This is lower than AMD's 600W recommendation for the 7600 XT, but PSU recommendations vary between board partners. Unsurprisingly, the 9060 XT is expected to use a single eight-pin power connector.
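That single eight-pin connector also puts an implicit ceiling on board power. A quick sketch using the PCIe specification's per-source maximums (these are spec ceilings, not AMD's announced board power):

```python
# Maximum power available to a card with one 8-pin connector,
# per PCIe spec limits (not an official TBP figure).
slot_power_w = 75    # PCIe x16 slot maximum
eight_pin_w = 150    # one 8-pin PCIe connector maximum

max_board_power_w = slot_power_w + eight_pin_w
print(f"{max_board_power_w} W board power ceiling")  # 225 W

# Against the recommended 500 W PSU, that leaves roughly 275 W
# of headroom for the CPU and the rest of the system.
```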

The biggest remaining mystery is likely which GPU AMD has chosen for the RX 9060 series. Whether the company will cut Navi 48 down further from the standard RX 9070 or shift to Navi 44 remains unclear. However, the 9060 XT's inclusion of only three video outputs – one HDMI 2.1 port and two DisplayPort 2.1a ports – suggests that AMD is using Navi 44.

Team Red previously confirmed that it aimed to launch the RX 9060 series in the second quarter of 2025 without specifying further. The 9060 XT and possibly also the standard 9060 could emerge next month to compete with Nvidia's RTX 5060 series.

In our reviews, the RX 9070 XT and 9070 compared favorably to Team Green's RTX 5070 and 5070 Ti, ending AMD's years-long deficit in upscaling and ray tracing performance. With disappointing specs for the RTX 5060 GPUs recently leaked, their Radeon equivalents have a good chance of competing in the popular mainstream segment.

Meanwhile, Mexican GPU retailer DD Tech recently leaked the first sign of an RX 9050 graphics card. The storefront quickly removed the category listings and didn't include further details, but the mistake proves that AMD plans to answer Nvidia's upcoming RTX 5050.

 
Sure, the 9070 XT rasters as well as the 5070 Ti, and the non-XT is on par with the 5070, but both are weaker in RT and power consumption (I could also mention nVidia's software offerings since many buyers are into that too). The non-XT could also do with a $50-70 price cut to make it a better sell against the 5070. Picking the 9070 XT at $600 over the 5070 Ti at $750 is currently the only clear win for AMD.

AMD have essentially forced themselves to price the 9060 XT $50-100 under the 5060 Ti, meaning they'll likely wind up with a 9060 non-XT that is also a hard sell against both the 5060 and the XT variant if they plan to sell the 9060 for $300 (assuming price parity with the 5060).


And yes, stock of these cards is limited. It’s for that reason that this could change at any time, but it should be noted that selling out of a product typically means it is a good offering. And, if/how nVidia manages to fix the terrible mess of a launch they’ve produced remains to be seen. It seems like this botched launch is helping AMD more than anything else tbh.
 
The 16 GB version should be quite nice if these chips get the same per-CU advantage over the 7000 series that the 9070/XT are getting. The 8 GB version is pretty meh these days unless they sell it for cheap.
 
Sure, the 9070 XT rasters as well as the 5070 Ti, and the non-XT is on par with the 5070, but both are weaker in RT and power consumption
Power consumption? Highly dependent on the title. Testing from Gamers Nexus shows the non-XT basically on par with its NVIDIA counterparts (better in one title, worse in another, on par in yet another). The XT is a bit more power hungry, but not by a huge margin.

der8auer has done some undervolting testing on the XT as well, and it seems efficiency can be improved quite a bit. Might be a great card for compact builds, although I'd wait for more testing once the software for it matures a bit and more cards have been tested by more people.

(I could also mention nVidia’s software offerings since many buyers are into that too).
CUDA is basically the only argument to try and get NVIDIA at the moment, imo (or a need for RTX 4090/5090 performance, as AMD simply has no answer there. Good luck getting one, though, without going through a scalper and paying $3400+).
As far as the included software (Adrenalin / NVIDIA App) goes, I prefer AMD's, although NVIDIA is starting to make a comeback there. Hope they'll add support for more hardware tweaking in there sometime.

The non-XT could also do with a $50-70 price cut to make it a better sell against the 5070. Picking the 9070 XT at $600 over the 5070 Ti at $750 is currently the only clear win for AMD.
Every region I've seen prices for has NVIDIA at 30-70% higher prices, with ordering either disabled or pre-order only with no known date or a date many weeks away. From everything I've heard on the rumour channels, this is not expected to improve for months either.
Basically, if you want NVIDIA - get an older card. DLSS 4 is available on all of them, and the new multi-frame gen has such a niche use case it's hardly worth mentioning. The 5000 series is paper-launch unobtainium at prices that make no sense.
If you need RTX 5090 level performance you're screwed. It and the RTX 4090 are not in stock anywhere (at least in the UK) and are priced at 70%+ above their MSRP (and the MSRP is extremely high to begin with).

AMD have essentially forced themselves to price the 9060 XT $50-100 under the 5060 Ti, meaning they'll likely wind up with a 9060 non-XT that is also a hard sell against both the 5060 and the XT variant if they plan to sell the 9060 for $300 (assuming price parity with the 5060).
Agreed. The non-XT price has been rightly criticized by all the reviews, imo; it's poorly positioned against its bigger brother. AMD always needs to price a good bit under NVIDIA simply to sell. Lucky for them, NVIDIA decided to have close to no production, skyrocketing the prices. Because of that, AMD is a lot cheaper and, more importantly, can actually be bought - sites still have stock.


And yes, stock of these cards is limited. It’s for that reason that this could change at any time, but it should be noted that selling out of a product typically means it is a good offering.
I've seen multiple articles and videos quoting retail workers saying that NVIDIA's supply was and is terrible and is not expected to improve for months. So that "any time" is really "not any time soon". The same sources also mention that AMD has supplied a lot more cards - as in multitudes more - and that restocking is expected in large numbers as well.

And, if/how nVidia manages to fix the terrible mess of a launch they’ve produced remains to be seen. It seems like this botched launch is helping AMD more than anything else tbh.
Let's go through NVIDIA's problems. I'd order them by severity, but to be honest I don't know which is worst.

* Power connector melting things.
According to NVIDIA, the RTX 40 series didn't have this problem - yet they're also quoted as saying they fixed it with the RTX 50 series (fixing supposedly non-existent problems; what are they making - Apple keyboards? ;)).
There are still RTX 50 cards with the problem, so I'm guessing it won't be fixed until they completely redesign the connector.
* Black screens
Think this is fixed? My friends with NVIDIA cards stopped complaining after the first driver version that was supposed to fix it, and as I understand it, NVIDIA has released another driver with another fix since then.
* Dropping support for 32-bit CUDA leading to terrible performance in older games
This is a "won't fix" - Seems like NVIDIA has no interest in fixing this, which is a shame (see digital foundry video if interested)
* Terrible supply and prices due to it
As I mentioned earlier, this is not likely to improve much for months. For a friend of mine who's looking to get an RTX 5090, I hope prices will at least be near MSRP come Christmas, but I have my doubts.
 

-Spot on
 
Wait, did the very same news poster not just hang most of his earlier Blackwell post on the premise that 8GB sucks? It doesn't even merit a mention here. The grading on a curve could not be more blatant.

And what's this 'AMD ended its deficit in upscaling and ray tracing' stuff? Are you out of your mind right now?
 
Wait, did the very same news poster not just hang most of his earlier Blackwell post on the premise that 8GB sucks? It doesn't even merit a mention here. The grading on a curve could not be more blatant.

And what's this 'AMD ended its deficit in upscaling and ray tracing' stuff? Are you out of your mind right now?

8GB sucks above a certain price point.
 
8GB sucks above a certain price point.

I see. And is that the price point of the 9060 XT? 9060? Or 9050? All of these will be 8GB, just like their Nvidia equivalents, which the author saw fit to mention disparagingly no less than 3 times in that particular bit of prose. Must've just slipped his mind here.
 
Wait, did the very same news poster not just hang most of his earlier Blackwell post on the premise that 8GB sucks? It doesn't even merit a mention here. The grading on a curve could not be more blatant.

And what's this 'AMD ended its deficit in upscaling and ray tracing' stuff? Are you out of your mind right now?
8GB is fine if you want to play at 1080p medium with 60 FPS. The problem starts when they charge $400-500 for an 8GB card.

Plenty of people would be fine with an 8GB card if it cost $200
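To put rough numbers behind that, here is an illustrative sketch - every figure in it is an assumption chosen for the example, not something from this thread - of why resolution alone isn't what eats an 8GB card: fixed-size render targets stay small, while texture quality is the setting that actually scales.

```python
# Illustrative only: VRAM used by a deferred renderer's render targets.
# The target count and format are assumptions for the example.
def render_target_mb(width, height, num_targets, bytes_per_pixel=8):
    # e.g. RGBA16F G-buffer targets at 8 bytes per pixel
    return width * height * num_targets * bytes_per_pixel / 2**20

print(f"1080p: {render_target_mb(1920, 1080, 6):.0f} MB")  # ~95 MB
print(f"4K:    {render_target_mb(3840, 2160, 6):.0f} MB")  # ~380 MB

# Even at 4K the targets are well under 1 GB; it's high-resolution
# textures (a quality setting, not a resolution setting) that fill 8 GB,
# which is why "1080p medium" keeps an 8 GB card viable.
```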
 
Wait, did the very same news poster not just hang most of his earlier Blackwell post on the premise that 8GB sucks? It doesn't even merit a mention here. The grading on a curve could not be more blatant.

And what's this 'AMD ended its deficit in upscaling and ray tracing' stuff? Are you out of your mind right now?
When nVidia does something, it is Bad, because they are Bad, for reasons. So they must be called out at every opportunity. If AMD does the same thing, it is a Good thing, because AMD is the underdog and therefore Good.

But the only reason that nVidia outsells AMD is mindshare; people always praise them and only speak negatively of AMD. Remember, everyone likes nVidia and will never call them out.

We've always been at war with Eastasia.

8GB is fine if you want to play at 1080p medium with 60 FPS. The problem starts when they charge $400-500 for an 8GB card.

Plenty of people would be fine with an 8GB card if it cost $200
My hot take: no, it isn't fine. Polaris brought us 8GB $200 GPUs 9 YEARS ago. It's been nearly 12 since the 8GB 290Xs hit shelves. An 8GB GPU for $200 today is highway robbery. Even at medium settings at 1080p, you will have games dithering their texture quality or dropping frames due to VRAM limitations. Some games will straight up not work right (hi, Indiana Jones).

8GB should have gone extinct, or been relegated to RX 6400-tier products, a generation ago. Current consoles have 16GB of memory already, and we're less than two years away from the next generation, which will likely push that to 24 or 32 GB. Buying an 8GB GPU today would be the same as buying a 512MB GPU in 2016.

It's time for 8GB GPUs to be taken to the ranch.
 
When nVidia does something, it is Bad, because they are Bad, for reasons. So they must be called out at every opportunity. If AMD does the same thing, it is a Good thing, because AMD is the underdog and therefore Good.

But the only reason that nVidia outsells AMD is mindshare; people always praise them and only speak negatively of AMD. Remember, everyone likes nVidia and will never call them out.

We've always been at war with Eastasia.


My hot take: no, it isn't fine. Polaris brought us 8GB $200 GPUs 9 YEARS ago. It's been nearly 12 since the 8GB 290Xs hit shelves. An 8GB GPU for $200 today is highway robbery. Even at medium settings at 1080p, you will have games dithering their texture quality or dropping frames due to VRAM limitations. Some games will straight up not work right (hi, Indiana Jones).

8GB should have gone extinct, or been relegated to RX 6400-tier products, a generation ago. Current consoles have 16GB of memory already, and we're less than two years away from the next generation, which will likely push that to 24 or 32 GB. Buying an 8GB GPU today would be the same as buying a 512MB GPU in 2016.

It's time for 8GB GPUs to be taken to the ranch.
That 16GB of memory in a console is shared. Probably only half of it ends up being used by the GPU.
 
That 16GB of memory in a console is shared. Probably only half of it ends up being used by the GPU.
Only if a developer ends up using 8GB for non-graphics-related tasks.

Also, consoles run on low settings and with dithered resolutions to hit their performance targets. We've seen that, for three generations now, the console RAM amount should be considered the minimum for PC GPUs of the same era. 512MB GPUs were made obsolete during the 360 era, which only had 512MB total, and even 1GB cards were getting awfully snug by the end.

There are also efficiencies you get from having a shared RAM pool, which PCs do not benefit from, so just because something uses 8GB on console doesn't mean it will fit in 8GB on PC.

We've already seen proof that 8GB is insufficient for some modern games, and the problem is getting worse. We've seen games that reduce visual quality on 8GB cards to keep the game from crashing. TechSpot has done a few articles on it.

Buying a new 8GB GPU in 2025 is utterly foolish - no different than buying a 512MB GPU in 2016. It's an obsolete standard; for some reason people REALLY can't get over that. We had 8GB $200 GPUs 9 years ago. Paying more than that for 8GB today is highway robbery.
 
When these cards will be available at MSRP is the only question I have. It has been months since the fake release of the RTX 5000 series, and the Radeon 9000 GPUs had a similar start (although I still have a small hope they will become available soon).
These announcements are useless if the cards aren't actually sitting on store shelves.
 
Only if a developer ends up using 8GB for non-graphics-related tasks.
The Xbox Series X has 8 Zen 2 cores and reserves 2.5GB of its 16GB of memory for the OS. When I run Control on my 8-core Zen PC, it uses about 4GB of system RAM, and Hogwarts uses 8.5GB. I wouldn't expect the system RAM requirements to be much different on a console using a similar CPU, so that would leave 9.5GB for the GPU in Control and 5GB in Hogwarts. Yes, the demands for console-level graphics will be lower, but there is no situation where a console will make anywhere near 16GB available to its GPU.
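As a quick sketch of that budget arithmetic (using only the figures quoted above - the 2.5GB OS reservation and the two games' system RAM use):

```python
# GPU memory budget on a unified-memory console, per the numbers above:
# 16 GB total, 2.5 GB reserved by the OS, the remainder split between
# game-side "system RAM" use and graphics.
TOTAL_GB = 16.0
OS_RESERVED_GB = 2.5

def gpu_budget_gb(game_system_ram_gb):
    return TOTAL_GB - OS_RESERVED_GB - game_system_ram_gb

print(gpu_budget_gb(4.0))   # Control-like workload:  9.5 GB for graphics
print(gpu_budget_gb(8.5))   # Hogwarts-like workload: 5.0 GB for graphics
```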
 
If they drop a 9050 for under $200, then that would be fine for an 8GB card.
The 9060 should have at least 12GB, and the 9060 XT should be 16GB - again, this hinges on price as well.

 