RX 550 PCI-E 3.0 x16 (uses x8) Graphics Cards - What's the Deal Here?

TheBigFatClown

Posts: 1,110   +495
I've looked at some RX 550 graphics cards for sale on Newegg, and some of them are described as above: "PCI-E 3.0 x16 (uses x8)". I want to understand this completely (or at least well enough to not get ripped off).

From the few cards I looked at on Newegg, it appeared there was either no difference in price between a regular x16 card and an x16 (uses x8) card, or the x16 (uses x8) version was actually priced higher.

Are the two different types of cards being offered because the RX 550 chipset is fully capable of saturating all 16 lanes under the right memory and clock speed conditions, or is the (uses x8) version being offered to save a few pennies in manufacturing cost because no configuration of the RX 550 could possibly saturate all 16 lanes of the PCI-E 3.0 interface?
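
For a rough sense of the numbers involved, here's a quick back-of-envelope calculation I put together (Python, using only the published per-lane line rates and encoding overheads for each PCI-E generation, nothing measured on an actual RX 550):

```python
# Back-of-envelope PCIe bandwidth per direction, from published specs.
# Real-world throughput is lower once protocol overhead is counted.

GENERATIONS = {
    # name: (line rate in GT/s per lane, encoding efficiency)
    "PCIe 1.x": (2.5, 8 / 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def bandwidth_gb_per_s(generation, lanes):
    """Theoretical one-way bandwidth in GB/s for a given link width."""
    rate, efficiency = GENERATIONS[generation]
    return rate * efficiency * lanes / 8  # 8 bits per byte

for generation in GENERATIONS:
    for lanes in (4, 8, 16):
        print(f"{generation} x{lanes:<2}: "
              f"{bandwidth_gb_per_s(generation, lanes):6.2f} GB/s")
```

That works out to roughly 15.75 GB/s for PCI-E 3.0 x16 and 7.88 GB/s for x8, so an x8 link on PCI-E 3.0 offers about as much bus bandwidth as a full PCI-E 2.0 x16 slot. Whether the RX 550 can ever actually push that much data over the bus is exactly what I'm trying to figure out.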

I guess my question is very similar to the question of whether the chicken or the egg came first.

Or to put it another way (this may be a different question altogether), is there a way to tweak the characteristics of the graphics card itself so that an x16 graphics card's maximum theoretical bandwidth would equal that of another graphics card using only x8 PCI-E 3.0 lanes? Not "purposefully gimping it" necessarily, but by way of logical tradeoffs? In other words, maybe because of criminally high memory prices, a card manufacturer uses less of it and more of something else to save on manufacturing costs?

What was the reason for releasing x8 graphics cards in an x16 slot to begin with? I know that AMD's new 2200G/2400G APUs only offer x8 lanes for the graphics slot, but those just got to market. How long have x8 graphics cards been around?

Anyone who can shed some light on this to help me understand it better would be much appreciated.
 
x16 vs. x8 vs. x4 is terminology that describes the PCIe interface and its pin connections. Here's a good read explaining that. As far as GPUs go, all modern GPUs are x16.
http://blog.duropc.com/2013/the-difference-between-pcie-x1-x4-x8-x16-and-x32

Where the difference came in was when you connected two GPUs together (SLI for Nvidia, Crossfire for AMD): it would throttle the dual setup to x8 because the bus lanes for the two PCIe slots were shared. Not so on modern motherboards; you can SLI and Crossfire at x16 now. That's because of advancements in motherboard PCIe bus speed and giving each slot its own bus lanes.
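
If you want to see what a given card actually negotiates in your own machine, GPU-Z will show it under "Bus Interface" on Windows. On Linux, a minimal sketch like this (assuming the standard PCI sysfs attributes are present) reads the link width and speed directly:

```python
# Minimal sketch: report the negotiated PCIe link width/speed of every
# display-class (GPU) device via the standard Linux PCI sysfs attributes.
import glob
import os

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    try:
        with open(os.path.join(dev, "class")) as f:
            pci_class = f.read().strip()
    except OSError:
        continue
    # Class 0x03xxxx = display controller, i.e. a GPU.
    if not pci_class.startswith("0x03"):
        continue
    info = {}
    for attr in ("current_link_width", "max_link_width",
                 "current_link_speed", "max_link_speed"):
        try:
            with open(os.path.join(dev, attr)) as f:
                info[attr] = f.read().strip()
        except OSError:
            info[attr] = "?"
    print(f"{os.path.basename(dev)}: "
          f"x{info['current_link_width']} of x{info['max_link_width']}, "
          f"{info['current_link_speed']} (max {info['max_link_speed']})")
```

A card wired for only 8 lanes should report a maximum link width of x8 there, even when it's sitting in a physical x16 slot.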
 
So are you saying that PCI-E 3.0 x16 graphics cards described as (uses x8) were "originally intended" to be used in a Crossfire setup? Hmmm, if that's what you're saying, then I think I understand why I've seen some x16 (uses x8) graphics cards selling for more than their x16 counterparts. Third-party resellers wanted to milk the enthusiast crowd for all they're worth. :)

Still, I think there's another reason manufacturers sell PCI-E 3.0 x16 (uses x8) cards, according to a few things I've been told lately. It may be that the card's onboard chipset and memory speed configuration can't make use of more bandwidth than an x8 PCI-E 3.0 connection provides, so it's just a matter of saving a few pennies in production costs.

To summarize, I think there are two reasons x8 graphics cards still exist on the market: both yours and mine. Yes or no?
 