Matrox's new C900 is the world's first single-slot, nine-output graphics card

Shawn Knight


Matrox on Wednesday announced the latest addition to its popular C-Series line of multi-monitor graphics cards, the Matrox C900. The card is noteworthy as it's the world's first single-slot, nine-output graphics card.

The Matrox C900 is a PCIe 3.0 x16 video card that packs 4GB of memory and nine mini-HDMI connectors, all while consuming just 75 watts. With those connectors, it can push nine 1,920 x 1,080 displays at 60Hz in either a 9x1 or 3x3 video wall configuration. Matrox says the card offers a total desktop resolution of 5,760 x 3,240 in the latter configuration.
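For reference, the 5,760 x 3,240 figure is just the per-panel 1,920 x 1,080 resolution multiplied out across the 3x3 grid. Here's a minimal Python sketch of that arithmetic (a hypothetical illustration, not any Matrox software or API):

```python
# Hypothetical helper (not part of any Matrox API): combined desktop
# resolution of a video wall built from identical panels.
def wall_resolution(cols, rows, panel_w=1920, panel_h=1080):
    """Return (width, height) in pixels for a cols x rows video wall."""
    return cols * panel_w, rows * panel_h

print(wall_resolution(3, 3))  # (5760, 3240) -- the 3x3 figure Matrox quotes
print(wall_resolution(9, 1))  # (17280, 1080) -- derived for the 9x1 layout, not a quoted spec
```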

If that's not enough, the card can be paired with another C900 card or a Matrox C680 six-output board to push 18- or 15-screen video walls, respectively.

The company also tells us the card supports digital audio over HDMI, is DirectX 12 and OpenGL 4.4 compliant, and is compatible with Matrox Mura IPX Series 4K capture and IP encoder and decoder cards.

As you've likely surmised, a card of this caliber is best suited for signage installations in retail, corporate, entertainment and hospitality environments. Matrox also says it'd be ideal for a control room video wall in security, process control and transportation environments.

Matrox said its C900 will be publicly demonstrated at ISE 2016 in Amsterdam from February 9 through February 12. The card will be available sometime in the second quarter, we're told. No word yet on pricing.


 
Imagine an implementation with mini-DP connections and proper daisy-chain support throughout the whole setup; it might be able to push even more displays... After recently (and personally) seeing how many airports use digital signage now, or will be converting other aspects of the airport to digital signage, these cards could be a great thing, assuming the price isn't "insane" relative to the cost of another system/card solution.
 
Before opening the article, I was curious as to what market they were aiming at, particularly after reading the mini-HDMI part. Nice to see Matrox still kicking!
 
I'm guessing it's stupid to ask if it can play Crysis at those settings... :D
It's powered by AMD graphics, so why not? Someone did play Crysis on a six-display Eyefinity setup.

"someone did play crysis..." is hardly a definitive answer.
AMD makes several GPUs. The fact that Matrox uses an AMD chip doesn't automatically mean it can play Crysis, at least not at a respectable framerate. Matrox cards are built for industrial and commercial uses like kiosks and signage, so they trade gaming performance for durability.

Notice how, unlike graphics cards built for gaming, the heatsink and fan on Matrox cards are relatively small? That's because their speed has been cut back considerably. They don't have all the graphics-processing bells and whistles that normal gaming cards do: fewer texture units, fewer stream processors, lower fill rates, lower clock speeds, etc.
 
True, but AMD drivers are far better in any version; it would be playable at very, very low settings even on a fanless card. Crysis is almost 10 years old by now =P
 
You realize that the Matrox C680 sells for north of $750.... The C900 will probably be closer to $1,000 - and sports 4GB of VRAM...

You can get a Titan X for the same price with 12GB... or you could go 4-way Crossfire/SLI with pretty decent cards...

Not to say that the C900 is overpriced - it's clearly NOT for gaming... I bring this up just to show that if you bought this card for gaming purposes, you'd be insane...
 

I'd still like to see how it would perform, just because it would be entertaining.
 

"amd drivers are far better in any version". You must be joking. In my experience - over 20 years - their drivers even for their own onboard chips suck big time!
 