AMD Radeon RX 7900 spied in leaked photos with dual 8-pin connectors

Shawn Knight

What just happened? The first purported images of AMD's upcoming Radeon RX 7000 series graphics card have hit the web courtesy of Twitter leaker HXL. The photos, from a source identified only as QQ (which is a Chinese messaging app), reveal a run-of-the-mill Radeon reference design that looks only marginally larger than the card in the side-by-side (likely an RX 6900 XT).

Tom's Hardware estimates the fans on the mystery card are likely in the 82-83mm range, making them slightly bigger than those on a reference RX 6900 XT.

A second image showing the top of the cards highlights the two eight-pin power connectors. If you recall, AMD recently confirmed upcoming RDNA 3 GPUs would not use the 12VHPWR connector that has been associated with melting RTX 4090 power adapters.

The red-striped card is also lacking a PCB backplate and has pins protruding from the rear like those commonly found on engineering samples for development purposes.

If genuine, the images could lend credence to rumors of a TDP in the 350 watt range for the new card. Each eight-pin PCIe cable can supply up to 150 watts and the PCIe slot itself can supply up to an additional 75 watts, for a theoretical total of 375 watts. Realistically, a manufacturer will likely err on the side of caution and not aim for the absolute maximum draw.
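The arithmetic behind that ceiling is simple; here is a quick back-of-the-envelope sketch (the 150 W and 75 W figures come from the article, the helper name is made up for illustration):

```python
# Theoretical board power limit from the connector configuration.
# Figures per the article: 150 W per 8-pin PCIe connector, 75 W
# from the PCIe slot itself.
EIGHT_PIN_W = 150   # max draw per 8-pin PCIe power connector
PCIE_SLOT_W = 75    # max draw from the PCIe slot

def board_power_limit(num_eight_pin: int) -> int:
    """Theoretical maximum board power for a given 8-pin count."""
    return num_eight_pin * EIGHT_PIN_W + PCIE_SLOT_W

print(board_power_limit(2))  # dual 8-pin: 375 W
```

A 350 W TDP would sit comfortably under that 375 W theoretical maximum, which matches the "err on the side of caution" point above.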

The flagship Radeon RX 7900 XTX is rumored to pack 12,288 stream processors and a 384-bit memory bus width. It could also feature a dozen 16 Gbit GDDR6 memory chips running at 20 Gbps for a total of 24 GB of VRAM with 960 GB/s of bandwidth.
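Those rumored memory figures are internally consistent, which is easy to sanity-check (constants taken from the article; the variable names are made up here):

```python
# Sanity-check the rumored memory specs: a 384-bit bus at 20 Gbps
# per pin, fed by twelve 16 Gbit GDDR6 chips.
BUS_WIDTH_BITS = 384   # rumored memory bus width
SPEED_GBPS = 20        # GDDR6 data rate per pin
CHIPS = 12             # number of memory chips
CHIP_GBIT = 16         # capacity per chip, in gigabits

bandwidth_gbs = BUS_WIDTH_BITS * SPEED_GBPS / 8  # bits/s -> bytes/s
vram_gb = CHIPS * CHIP_GBIT / 8                  # Gbit -> GB

print(bandwidth_gbs, vram_gb)  # 960.0 GB/s and 24.0 GB
```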

AMD is set to share more information about its RDNA architecture during a livestream scheduled for November 3 at 1 p.m. Pacific / 4 p.m. Eastern. We'll likely learn all of the pertinent details about the product range, launch models and rollout plans during the event. The latest scuttlebutt suggests AMD could have the first wave of cards ready for retail by the end of November.


 
Not that much bigger, so I wonder if that one is the XT or the XTX? (Lord, I hate that name, XTX.)

Also, it looks like power consumption won't be as maniacal as the web darling 4090, so I assume less heat.

Anyway, let's see how well it's going to run, and meanwhile...


 
LOL, my last AMD card used dual 8-pins... I could probably drop the RX 7900 right into my *previous* system. Sounds to me like they can't lose.
 
Dumb naming, seriously, XT and XTX. 7950 XT for the 24 GB model, 7900 XT for the 20 GB one; why is that so difficult? They could call the potential 4090 Ti challenger the 7970 XT if required.

Anyway, can't wait for the reviews; 7800 or 7900 for me this gen. Huang can shove Lovelace down his throat.
 
Unless AMD drops a performance bomb that makes FSR completely useless, and leaves nvidia sputtering on the DLSS express...
The problem with DLSS is that it's hard to implement, whereas (and I'm oversimplifying here) you can practically copy and paste FSR into a game engine.

If FSR gets good enough, developers will stop wasting time adapting their game engines for DLSS and just use FSR. The same thing happened with FreeSync: G-Sync is basically just rebadged FreeSync now because manufacturers didn't feel like paying Nvidia's premium for it, and it's EVERYWHERE because of it.

I would love to see AMD's FSR give Nvidia the middle finger and say, "DLSS is dead because you made it too proprietary."

And keep in mind, none of the consoles can use DLSS, which is arguably the market segment where it's needed most.
 
If FSR gets good enough, developers will stop wasting time adapting their game engines for DLSS and just use FSR.
That should have happened already, but remember that Nvidia pays developers to use their cr@p and even helps them with programmers.
The same thing happened with FreeSync.
I hate that one; Nvidia polluted the name with their G-Sync BS and people swear it's something else.

I would love to see AMD's FSR give Nvidia the middle finger and say, "DLSS is dead because you made it too proprietary."
As stated above, Nvidia won't allow that, and remember that you also have all these sites and writers pushing DLSS and RT like it's the second coming while conveniently ignoring the fact that they're proprietary, which results in the rabid cult members demanding DLSS.
But a man can dream that logic prevails; sadly, we know that money talks and BS walks.
 
That should have happened already, but remember that Nvidia pays developers to use their cr@p and even helps them with programmers.

I hate that one; Nvidia polluted the name with their G-Sync BS and people swear it's something else.


As stated above, Nvidia won't allow that, and remember that you also have all these sites and writers pushing DLSS and RT like it's the second coming while conveniently ignoring the fact that they're proprietary, which results in the rabid cult members demanding DLSS.
But a man can dream that logic prevails; sadly, we know that money talks and BS walks.
AMD has shown that you don't need specialized hardware to have variable refresh rate tech, ray tracing or good upscaling. When these features are PC-only and usable only on the highest end of hardware, there isn't a large financial incentive for developers, or Microsoft or Sony, to go the Nvidia route. Perhaps NV could sell a ray tracing accelerator chip like they tried with PhysX... or G-Sync... but I think history is going to be on AMD's side.

You can't beat free and open source. I don't know why I have to keep reminding people of this, but the console industry is MUCH larger than the PC gaming industry and runs entirely on AMD hardware. To have the largest market penetration, developers ARE going to have to use FSR and AMD's ray tracing, just like they are using FreeSync now.

It's probably going to be AT LEAST another two generations before ray tracing becomes anything other than an expensive novelty. That's about enough time for the next generation of consoles to get released. If the 7000 series of graphics cards is a hit, which it'll be pretty hard for it NOT to be considering how much NV can't stop pissing everyone off lately, it'll be pretty hard for FSR not to take front and center stage in the next few years.
 
AMD has shown that you don't need specialized hardware to have variable refresh rate tech, ray tracing or good upscaling.
Upscaling, no you don’t, but VRR and RT do need dedicated hardware support. In the case of the former, AMD chose to follow an industry standard; the latter is just as specialized as Intel’s and Nvidia’s. For all three vendors, the technology used is transparent to the API.
 