Another company relaunches the GeForce GT 730 amid graphics card crisis

midian182

What just happened? Another graphics card has launched on the market, albeit with little fanfare. The good news is that availability is likely to be excellent; the bad news is that it’s a GeForce GT 730, a card that first arrived back in 2014.

As we’re all too familiar, the global chip shortage continues to make the once-simple task of buying a graphics card a nightmare, with even previous-gen products selling for ridiculously high prices. It’s resulted in some companies turning to (much) older cards to meet demand.

In Asus’ case, it’s the GeForce GT 730, part of the 700 series that first arrived in 2013 and has covered Fermi, Kepler, and Maxwell architectures. This particular model uses the GK208 (Kepler) GPU. There are 384 CUDA cores, a 902 MHz boost clock in gaming mode with a 927 MHz OC boost, and 2GB of 5 Gbps GDDR5. The memory operates across a 64-bit bus, offering a total bandwidth of 40.10 GB/s.
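For the curious, that bandwidth figure is just the effective per-pin data rate multiplied by the bus width. A quick back-of-the-envelope check (the published 40.10 GB/s reflects an effective rate a touch above 5 Gbps):

```python
# Peak memory bandwidth = per-pin data rate (Gb/s) x bus width (bits) / 8 bits per byte
def memory_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    return data_rate_gbps_per_pin * bus_width_bits / 8

# GT 730 (GK208, GDDR5): 5 Gbps on a 64-bit bus -> ~40 GB/s
print(memory_bandwidth_gb_s(5.0, 64))  # 40.0
```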

You’re obviously not going to be playing the latest games on this card, but it could satisfy certain use cases. It comes with four HDMI 1.4b ports with HDCP 2.2 support and can output resolutions reaching 4K, so those looking for something that can facilitate multi-monitor work setups may be tempted.

Additionally, the Asus GeForce GT 730’s tiny size (single slot, 5.8 x 4.1 x 0.7 inches) and 38W power draw (supplied entirely by the PCIe slot) should make it ideal for small form factor PCs/HTPCs - Asus recommends a 300W PSU.

While Nvidia will end support for Kepler in an upcoming GeForce driver, it will still provide LTSB (long-term support branch) R470 drivers with support for kernel revisions, OS updates, and bug fixes until 2024.

No word on price or availability yet, but seeing as a five-year-old GTX 1050 Ti often costs more than twice its original MSRP these days, don’t expect a bargain.

With Nvidia predicting that the chip shortage will last "the majority" of 2022, more companies are releasing older cards. MSI also brought back the GT 730 in June.

h/t: @momomo_us


 
I recently retired my GTX 780 to the shed after using it for seven years, as it's inadequate for the modern day. And these clowns are selling the GT 730.

 
I can see now why Linus recently reviewed the GTX 780 Ti.

We may be seeing a return of the GTX 780 Ti with 6 GB of memory.


Who knows, we may even see Fermi back.
 
Why do they remake these hopelessly useless cards? Why not bring back Pascal, like the 1070 or 1080 or something like that?
They aren’t “hopelessly useless” at all. Graphics cards aren’t only used to play games on. This card can support 4 monitors. No APU can do that; in my office our older workstations have Quadros in them for this exact purpose - to support 4 monitors.
 
So, what is it about supply constraints that makes the 730 feasible, but not a 780 or a 1080, etc. ?

No point churning out expensive 561 mm² GK110 chips in 2021.
Its 3D performance and features are obsolete.

On the other hand, a few runs of the cheap old 28nm TSMC process will get you a ton of 87 mm² GK208Bs, which are still useful for low end 2D multihead cards. That may just make sense given the lack of capacity on more advanced fabbing nodes.
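To put rough numbers on that, here's a minimal sketch using the common gross-dies-per-wafer approximation (it ignores yield, scribe lines and edge exclusion, so treat the figures as ballpark only):

```python
import math

# First-order estimate of gross (candidate) dies per wafer:
# wafer area / die area, minus a correction for partial dies at the edge.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(561))  # GK110:  ~97 candidate dies per 300 mm wafer
print(dies_per_wafer(87))   # GK208B: ~741 candidate dies per 300 mm wafer
```

So before yield even enters the picture, one wafer of GK208Bs gives roughly seven to eight times as many chips as a wafer of GK110s.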
 
I'm keeping a Zotac 710 as a backup GPU.
I use a GT730 every day as primary adapter in a Linux workstation.
Card is fine for that purpose.

The machine also has an RTX 2070 installed, passed through to a Windows instance running in a VM, which puts Steam Big Picture on a TV.
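For anyone curious what that looks like, here's a minimal sketch of launching a KVM/QEMU guest with a GPU handed over via VFIO. The PCI address and disk image name below are placeholders rather than my actual config, and it assumes the IOMMU is enabled and the card is already bound to the vfio-pci driver (most real setups go through libvirt instead):

```python
import subprocess

# Hypothetical QEMU launch with a GPU passed through via vfio-pci.
qemu_cmd = [
    "qemu-system-x86_64",
    "-enable-kvm",
    "-machine", "q35",
    "-cpu", "host",
    "-m", "8G",
    "-device", "vfio-pci,host=0000:01:00.0",   # placeholder PCI address of the GPU
    "-drive", "file=windows.qcow2,if=virtio",  # placeholder guest disk image
]
subprocess.run(qemu_cmd, check=True)
```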
 
I recently retired my GTX 780 to the shed after using it for seven years, as it's inadequate for the modern day. And these clowns are selling the GT 730.

Worse are the clowns buying it, just because it's an Nvidia card.
 
Worse are the clowns buying it, just because it's an Nvidia card.
The people buying it are people who need a cheap graphics adapter that supports 4 monitors. Before this you needed an expensive and/or out-of-stock GPU to do that. So Nvidia are providing consumers with an alternative option. In the U.K. at least there is no sign of any stock of AMD cards with 4-monitor support outside of their expensive pro cards. Even the GT 1030 does not support quad displays.

This would be a “pro-consumer” practice from Nvidia for users who want 4 displays but don’t need expensive GPUs. In my office we run screens with browsers on them that have Splunk plugins to display data. This is an old solution done on old Quadros, but if we were building the solution today this GT 730 would be ideal, as the GPU isn’t being used for anything 3D.

I know in your case it’s a difficult concept to get your head round - Nvidia doing a good thing. But restarting this line only saves users money. These cards cost less than a full price 3D game.
 
So, what is it about supply constraints that makes the 730 feasible, but not a 780 or a 1080, etc. ?
Same as if there were no constraints. You don't just make any GPU you want any time you want. 730s were chosen in this case because they were already being made for other companies and they just had to put orders in.
 
The people buying it are people who need a cheap graphics adapter that supports 4 monitors. Before this you needed an expensive and/or out-of-stock GPU to do that. So Nvidia are providing consumers with an alternative option. In the U.K. at least there is no sign of any stock of AMD cards with 4-monitor support outside of their expensive pro cards. Even the GT 1030 does not support quad displays.

This would be a “pro-consumer” practice from Nvidia for users who want 4 displays but don’t need expensive GPUs. In my office we run screens with browsers on them that have Splunk plugins to display data. This is an old solution done on old Quadros, but if we were building the solution today this GT 730 would be ideal, as the GPU isn’t being used for anything 3D.

I know in your case it’s a difficult concept to get your head round - Nvidia doing a good thing. But restarting this line only saves users money. These cards cost less than a full price 3D game.

But why would Nvidia cater to the audience that represents the smallest percentage of the market? At least I'm assuming that. How many people need to hook up 4 monitors to 1 video card?

I'm not going to complain about it, but this move isn't for gamers, and I thought that's the audience that's hurting the most because of miners...
 
But why would Nvidia cater to the audience that represents the smallest percentage of the market? At least I'm assuming that. How many people need to hook up 4 monitors to 1 video card?

I'm not going to complain about it, but this move isn't for gamers, and I thought that's the audience that's hurting the most because of miners...
The audience hurting the most from miners is gamers, but there are a few other victims along the way - anyone who needs more capable GPUs. Nvidia don’t only cater to gamers. I have no idea how many people need this solution; it is a niche. I imagine they saw a demand and filled it, maybe after complaints from hardware suppliers, and Nvidia probably make a reasonable profit margin even if it’s small amounts. Even if they don’t, it’s a cheap way to keep their partners happy (or mitigate their frustration).

We have quite a few machines with 4 monitors connected as wallboards in the Network Operations Center I work in; they all have specific GPUs for this. I can imagine a few basic monitoring solutions in other businesses using such a setup. Maybe display boards for very basic retail use?
 
Limiting it to a single-monitor system, is this junk any better than the HD 4000 IGP in my Intel i3-3225?
Sure, for a number of reasons. These cards provide dedicated VRAM, while your IGP takes a chunk of your system memory. They allow you to use modern Nvidia drivers and come with a decent H.264 decoder. There are versions with VGA/HDMI/DisplayPort to fit one's needs. And this GPU is still about twice as fast in 3D, for what it's worth.
 
" chip shortage will last "the majority" of 2022" great news
IDK why anybody was expecting better. TSMC & Samsung are building fabs in the USA, in Arizona & Texas respectively.
Intel has threatened to build a "super duper mega fab" at a cost of some 125 billion dollars.

Given the severity of the current chip shortage, I can't picture the situation abating until at least one (or possibly more) of these comes online, which is supposed to happen toward the end of 2022 - god, weather, covid, and seismic activity permitting.
 
My main GPU is still a GT 730, though I mainly play games from before 2018. I can say it's very nice that this one is not a Fermi-based GT 730 and that it's a GDDR5 version. My GDDR5 card is almost 30% more powerful than a DDR3 version.
 
But why would NVidia cater to the audience who represents the smallest percentage of the market? At least I'm assuming that. How many people need to hook up 4 monitors to 1 videocard?
These kinds of GPUs are more popular than you think. At the major retailer I worked at, these GPUs outsold the gaming cards in volume. In most cases buyers just needed something to drive multiple screens that would fit in a small prebuilt. And when it comes to low-profile, single-slot cards, your choices are really limited.
 