Nvidia: RTX 3060 won't be good for Ethereum mining

nanoguy

Why it matters: As miners and gamers fight to get their hands on the latest graphics cards, Nvidia is taking on the task of trying to please both crowds. By making mining-specific silicon and nerfing the upcoming RTX 3060 graphics card's ability to mine efficiently, the company hopes to quell criticism that it caters only to the crypto crowd.

The crypto craze is having a big impact on the availability of graphics cards, to the point where miners are snapping up laptops equipped with Nvidia's RTX Ampere GPUs to support their ever-growing operations. Some are even actively trying to tick off the gaming community by bragging about having things like mobile mining farms in the trunk of their expensive sports cars, RGB and all.

GPU manufacturers AMD and Nvidia are usually quick to respond to this type of demand, but the situation this time around is vastly different from what we saw a few years ago. Emerging trends like working and studying from home, as well as people looking to entertain themselves during lockdowns, have put additional pressure on the tech industry's supply chain, leading to a lot of scalping and higher prices for computer hardware.

Gamers are understandably upset about it, but cryptocurrency mining has helped some businesses that have been affected by the pandemic to stay afloat. On the other hand, Nvidia and AMD made huge profits riding the mining waves, so they haven't felt the need to do much about it. And board partners like Zotac celebrating the issue only adds insult to injury.

That changes today, as Nvidia announced it will take two steps to ensure that both miners and gamers can get what they want, or so they claim. The company will essentially nerf the upcoming GeForce RTX 3060 in terms of hash rate for popular mining algorithms, in the hopes that it will become less attractive for miners and in turn put more cards "in the hands of gamers."

According to Nvidia, drivers will be able to detect when, say, an Ethereum mining algorithm is running and throttle the card to lower the mining efficiency to 50 percent. These performance restrictions will be applied to both Windows and Linux drivers, which means that even using custom Linux distributions like NiceHash OS or Hive OS won't work to restore full compute capabilities for mining purposes.
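Nvidia hasn't detailed what exactly the driver looks for, so purely as an illustration of the concept, a detect-and-cap routine might look something like the hypothetical sketch below. Every name and threshold in it is invented; it only shows the general idea of spotting an Ethash-like workload and halving its throughput.

```python
# Purely illustrative sketch, not Nvidia's actual implementation. All names here
# (WorkloadSample, looks_like_ethash, allowed_hashrate) are invented.

from dataclasses import dataclass

@dataclass
class WorkloadSample:
    vram_working_set_gb: float    # memory the kernels keep touching (the Ethash DAG is several GB)
    random_read_ratio: float      # fraction of memory reads that are non-sequential
    kernel_repeat_rate_hz: float  # how often the same compute kernel is re-launched

def looks_like_ethash(s: WorkloadSample) -> bool:
    """Ethash hammers a multi-gigabyte DAG with random reads, over and over."""
    return (s.vram_working_set_gb > 3.0
            and s.random_read_ratio > 0.9
            and s.kernel_repeat_rate_hz > 10.0)

def allowed_hashrate(s: WorkloadSample, full_rate_mhs: float) -> float:
    """Hash rate the driver would permit for this workload."""
    return full_rate_mhs * 0.5 if looks_like_ethash(s) else full_rate_mhs

# Hypothetical example: a card capable of ~45 MH/s unrestricted gets held to ~22.5
print(allowed_hashrate(WorkloadSample(4.2, 0.95, 30.0), 45.0))
```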

The second announcement is the Nvidia Cryptocurrency Mining Processor, which is not an entirely new idea. Some of Nvidia's board partners have sold mining-specific graphics cards in the past, and those were essentially the same as their consumer counterparts, save for the absence of video outputs and using a different driver.

Nvidia says CMP HX products will feature silicon that has no video outputs and is optimized for mining efficiency. The cards will be available from AIB partners in four configurations ranging from 6 GB to 10 GB of VRAM and can achieve up to 86 MH per second in Ethereum mining algorithms. Power consumption ranges from 120 watts for 26 MH per second to 320 watts for 86 MH per second. Of course, miners are able to get significantly better numbers with tuning of individual cards, but that takes additional time and effort.
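Dividing out Nvidia's quoted figures gives a rough sense of efficiency in MH/s per watt (these are the company's numbers from above, not measurements):

```python
# Quick arithmetic on the figures quoted above
entry_config = {"mhs": 26, "watts": 120}
top_config   = {"mhs": 86, "watts": 320}

for name, cfg in (("entry", entry_config), ("top", top_config)):
    print(f"{name}: {cfg['mhs'] / cfg['watts']:.3f} MH/s per watt")
# entry: 0.217 MH/s per watt
# top:   0.269 MH/s per watt
```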

At this point, there is no way to predict if Nvidia's strategy will be advantageous for consumers.

The RTX 3060 is likely to be Nvidia's top-selling RTX 3000 series graphics card, so limiting its mining appeal is a welcome move, even if it's a software lock that can be circumvented in time. And it doesn't solve the problem for people interested in buying higher-end Ampere graphics cards.

Releasing mining-specific cards will likely reduce demand for consumer versions, but it will also produce a large amount of e-waste when mining is no longer profitable on them. But the real question is whether miners will bother buying them when gaming graphics cards can be resold to recoup some of the investment once they can no longer turn a profit.


 
That's the opposite. I'm talking about miners hacking the handicapped cards to hash at full output.
I didn't see that part right away, they really buried the lede on this one

Either way, as with the mining-only cards, the laziness of manufacturers will be their undoing: if they use the same silicon and circuitry for the mining and non-mining chips, determined parties will find the workarounds.

Regardless, if Nvidia thinks baking usage-based restrictions into how people use their cards is acceptable, they've got another thing coming
 
Colour me unimpressed. Unless they apply the same nerf to the whole line-up, I couldn't care less. I would not get a 3060 even if there was plenty of supply at RRP.

What about the deafening silence from AMD? Any idea if they'll do something similar? I highly doubt it, though.
 
Make it so GPUs can't be used for this crypto nonsense
Impossible. Telling at a firmware level what software is being run on a piece of silicon is unobtainium, especially since software is continuously updated and firmware updates cannot be forced. Any restrictions in drivers/software can be reversed, and unsigned drivers used instead. Not to mention Linux exists.

Creating a GPU that physically can't mine would require gimping GPGPU functions, like the ones used for ray tracing and general shading operations. The resulting GPU would perform terribly in many games.

This is just a PR move from Nvidia; it won't accomplish anything. The only way to stop this would be a concerted effort by AIBs and retailers to limit mining purchases, and that would fly in the face of their main goal, which is to sell hardware. Limiting their own hardware sales, and thus profit, so gamers can buy cards instead is an own goal that nobody is willing to score, especially in the middle of a global pandemic that is upsetting every economic system on the globe.
 
Impossible. Telling at a firmware level what software is being run on a piece of silicon is unobtainium, especially since software is continuously updated and firmware updates cannot be forced. Any restrictions in drivers/software can be reversed, and unsigned drivers used instead. Not to mention Linux exists.

Creating a GPU that physically can't mine would require gimping GPGPU functions, like the ones used for ray tracing and general shading operations. The resulting GPU would perform terribly in many games.

This is just a PR move from Nvidia; it won't accomplish anything. The only way to stop this would be a concerted effort by AIBs and retailers to limit mining purchases, and that would fly in the face of their main goal, which is to sell hardware. Limiting their own hardware sales, and thus profit, so gamers can buy cards instead is an own goal that nobody is willing to score, especially in the middle of a global pandemic that is upsetting every economic system on the globe.
...Did you read the article? It says that the driver can detect what algorithms are running and limit them. Not impossible, and I don't see why they would claim something that can so easily get fact-checked by the tech community.

How effective it will be at handicapping gaming cards used for mining is a different story...
 
Sounds like mostly a PR response to consumer complaints. I'm not sure they could have done much more anyway. The serious mining operations are already running, or at least capable of running, their own drivers and firmware. A mining-only card would need mining benefits in excess of the lost resale value, and even then it is ultimately still competing for wafers. Gaming consumers would see less competition for gaming models, but also fewer gaming-model cards manufactured.

There may be some mild benefit as far as scaring off some amateur would-be miners from buying any card at all I guess.
 
Impossible. Telling at a firmware level what software is being run on a piece of silicon is unobtainium, especially since software is continuously updated and firmware updates cannot be forced. Any restrictions in drivers/software can be reversed, and unsigned drivers used instead. Not to mention Linux exists.

Creating a GPU that physically can't mine would require gimping GPGPU functions, like the ones used for ray tracing and general shading operations. The resulting GPU would perform terribly in many games.

Like the others already pointed out, if it can be done via driver, it can also be done in firmware. In fact, all the driver does is talk to the firmware lol.

The firmware indeed can't detect if you're running ccminer.exe, but it doesn't need to know that. There is no other software that performs the same kind of hashing instructions over and over and over the way miners do, especially not on the GPU. It takes just an instruction pattern and a counter, and once the threshold is reached, reduce the clock speed; that's it. In the case of Ethereum, they could also just check for the existence of the DAG. It's extremely easy to detect in various ways; it's just that NV simply didn't give two f*cks about it until now.
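To make the pattern-counter idea concrete, here is a toy sketch of it. The op tags, window size, and threshold are all invented for illustration; this is not how Nvidia's firmware (or anyone else's) actually works.

```python
# Toy illustration of the "instruction pattern + counter" idea described above.

from collections import deque

WINDOW = 1000          # number of recent GPU operations to look at
THRESHOLD = 0.8        # fraction of mining-style ops that triggers throttling
MINING_OPS = {"KECCAK_ROUND", "FNV_MIX", "DAG_RANDOM_LOAD"}  # hypothetical op tags

recent_ops = deque(maxlen=WINDOW)

def observe(op_tag: str) -> str:
    """Feed in each retired op tag; return the clock policy to apply."""
    recent_ops.append(op_tag)
    if len(recent_ops) == WINDOW:
        mining_fraction = sum(o in MINING_OPS for o in recent_ops) / WINDOW
        if mining_fraction > THRESHOLD:
            return "REDUCE_CLOCKS"   # e.g. drop the memory clock until the pattern stops
    return "FULL_SPEED"
```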

And unlike drivers, firmware can't be hacked, at least not on NV cards, because the whole thing is encrypted, and to the best of my knowledge no one has done it yet, or at least hasn't revealed it.

By the way, NV was much faster to implement such measures when it hurt their profits. For example, they intentionally nerfed various instructions used in certain professional software (AutoCAD etc.) and made them run slower for certain tasks on GTX/RTX cards, so that you're forced to buy the five times more expensive Quadro counterparts where those instructions are enabled.

They could apply this mining nerf to all current cards too, not just this one new model. The fact that they don't shows just how half-hearted this effort is. Not much more than lip service. And I'll be really surprised if the driver restrictions aren't bypassed within days, if not hours, of release.
 
This is all a bunch of nonsense. My guess is that these "crypto" cards will be sold at a premium, even though they're the exact same cards with actually fewer features (no video output, for example). nVidia is pretending that they're trying to please both gamers and miners, but really they only care about their bottom line. How does that old saying go? You can help others a little and still help YOURSELF a lot?
 
Regardless, if Nvidia thinks baking usage-based restrictions into how people use their cards is acceptable, they've got another thing coming
You realize that they've been doing that for a long time, right? Quadros use the same silicon as consumer cards, but consumer cards have artificially limited FP16 performance.

Impossible. Telling at a firmware level what software is being run on a piece of silicon is unobtanium, especially since software is continuously updates and firmware updates cannot be forced. Any restrictions in drivers/software can be reversed, and the unsigned drivers used. Not to mention linux exists.

Fun fact: The Linux open-source drivers for Nvidia have awful performance (on anything newer than Kepler), because the GPUs are locked at very low clocks unless you have signed drivers.
 
Colour me uninmpressed. Unless they do the same nerf for all the line-up I couldn't care less. I would not get a 3060 no matter if there was plenty of supply at RRP.

What about the deafening silence from AMD, any idea if they'll do something similar. Highly doubt it though.

Are RX 6xxx cards used for mining, though? I keep seeing AMD mentioned in the same breath as nVidia when it comes to mining, but when you see pictures of mining farms there's hardly an RDNA2 card to be seen, if at all.

Looking at ETH hash rates, it seems like RDNA2 cards are noticeably inferior to Ampere and, price/performance-wise, a good bit behind Polaris, so it seems that AMD made sure their architecture was not particularly good at mining.

A 6900 XT has roughly the same mining performance as a 3060 Ti (62.83 MH/s @ 145 W vs 60.21 MH/s @ 120 W), so saying the two are in the same boat seems like damage control.
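Running the per-watt arithmetic on those quoted numbers (taking the figures above at face value):

```python
# Per-watt comparison using the numbers quoted above
print(f"6900 XT: {62.83 / 145:.3f} MH/s per watt")   # ~0.433
print(f"3060 Ti: {60.21 / 120:.3f} MH/s per watt")   # ~0.502
```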


 
"The more you buy, the more you save!" - Jensen
We all laughed at him, now look who's laughing: Jensen and the miners! He was talking to them, not to gamers!

Limiting the 3060 for mining is just a kick in the face for the gamer who managed to convince himself to pay the $600+ scalper price this card will have by thinking he could mine on it on the side to recoup some of the fool's tax he'll pay...

Serious miners won't be affected by this at all.

Also, VCZ already has an article about testing this card, and they say it's gimped in the BIOS, since the tester didn't have the new drivers.

Link: https://videocardz.com/newz/zotac-g...-mining-test-shows-reduced-hashrate-in-action
 
You realize that they've been doing that for a long time, right? Quadros use the same silicon as consumer cards, but consumer cards have artificially limited FP16 performance.



Fun fact: The Linux open-source drivers for Nvidia have awful performance (on anything newer than Kepler), because the GPUs are locked at very low clocks unless you have signed drivers.

And have you tried HiveOS? Can you confirm that you are correct and haven't made a mistake about Linux, mining, drivers and so on?!
 
I'm amazed that Nvidia didn't do it by design in hardware.

Yeah, I understand that they want to sell stuff. And yes, they laugh all the way to the bank when the realistic 3090 price is more like $3,000, not $1,500. Seen on Amazon Marketplace last week: a 3090 for **6,390** euros, just in case anybody wonders...

Still, there was a time when BTC was minable on GPUs, seriously, when nVidia GPUs were absolute rubbish for mining. Remember the 580, for example. AMD cards were miles and miles ahead in hash rate. Like 40x more. nVidia cards were crap at mining and drew so much power while doing it that it was just stupid. But people were perfectly happy to buy up all the 580s to game or render (however limited that was years ago).

Why does nobody design GPUs which are, on purpose, dog poop at mining? It has happened before, perhaps not entirely by design, but it did happen nevertheless. Drivers at this stage are beyond stupid. What is Leather Jacket smoking?
 
I had initially thought that the CMP HX chips were just normal GPUs, taken from the bins labelled 'DON'T USE THESE' (because of major defects in multiple GPC elements, such as the RT cores, the NVENC system, or other sections). However, this image doesn't look like any GPU that Nvidia has made recently:

[Image: promotional CMP HX die shot]


So assuming that the image is real, then that GPU has 128 TPCs - which equates to 256 SMs. The GA102 (on the right below) has just 42 TPCs/84 SMs.

[Image: CMP HX die shot alongside the GA102]

There are no geometry processors (these are at the bottom of the columns of TPCs in the GA102) and there are two sets of command processors and L2 cache partitions (the GA102 just has the one at the bottom).

It's a little similar to the GA100:

[Image: GA100 die shot]


Yes, that chip only has 64 TPCs, but they're designed to handle tensors and FP64 by the bucket load. Even so, 128 TPCs is a ridiculous amount, especially when you look at the performance figures Nvidia are giving for the 90HX - 86 MH/s at 320W.

An RTX 3090 gets around 120 MH/s at 300W or so, which suggests that if the chip layout really is like that, then it's clocked to something like 400 MHz - which doesn't make any sense at all. I think I'll stick with my original assumption that the CMP HX range is simply made from duff GA10x chips and be disappointed with the promo die shot appearing to be nothing more than an intern's vivid imagination with Photoshop.
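For what it's worth, here is the back-of-envelope arithmetic behind that clock estimate. It assumes hash rate scales with SM count times clock, which is only a crude proxy (Ethash is largely memory-bound), and a typical 3090 boost clock of roughly 1.7 GHz:

```python
# Back-of-envelope version of the estimate above; proxy assumption: hash rate ~ SMs x clock
rtx3090_sms, rtx3090_mhs, rtx3090_clock_mhz = 82, 120, 1700
cmp_sms, cmp_mhs = 256, 86   # die-shot SM count and Nvidia's quoted 90HX figure

implied_clock = rtx3090_clock_mhz * (cmp_mhs / rtx3090_mhs) * (rtx3090_sms / cmp_sms)
print(f"Implied clock: {implied_clock:.0f} MHz")   # roughly 390 MHz
```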
 
Yeah, I understand that they want to sell stuff. And yes, they laugh all the way to the bank when the realistic 3090 price is more like $3,000, not $1,500.
Other than their own, Nvidia doesn't make money from the sale of graphics cards - they sell chips to the likes of Zotac, Palit, EVGA, etc. Those third-party vendors sell directly and to retailers, and it's there where the price hikes are earning the big bucks.

Most parties, of course, are showering in money every morning: Nvidia, because they're selling every single chip they can make, and AIB partners for the same reason, albeit for cards. Retailers are the ones losing out, because while they're slapping on a big increase over the MSRP, they're not shifting enough volume.

Why nobody designs GPUs which are on purpose dog poop at mining? It has happened before, perhaps not entirely by design, but it did happen nevertheless.
Because if you did that, the GPU would be really poor at compute in general, and that is a big element in the rendering of games today. It has to be done via a combination of firmware and drivers to artificially limit capabilities, just like Nvidia does with their Quadro range for FP64 (the GeForce line of Ampere cards has a 1/64 rate, whereas the Quadros have 1/32, despite using exactly the same GPU).
 
Are RX 6xxx cards used for mining, though? I keep seeing AMD mentioned in the same breath as nVidia when it comes to mining, but when you see pictures of mining farms there's hardly an RDNA2 card to be seen, if at all.

Looking at ETH hash rates, it seems like RDNA2 cards are noticeably inferior to Ampere and, price/performance-wise, a good bit behind Polaris, so it seems that AMD made sure their architecture was not particularly good at mining.

A 6900 XT has roughly the same mining performance as a 3060 Ti (62.83 MH/s @ 145 W vs 60.21 MH/s @ 120 W), so saying the two are in the same boat seems like damage control.

The RX 6000 series seriously lacks the memory bandwidth that many toycoins want. While the 128 MB Infinity Cache is more than enough for games, it's not nearly enough for toycoins that need access to a large amount of fast memory. And that fast memory is something the 6900 XT lacks.

Yes, RDNA2 was designed to be a bad mining card. Accidentally or not.
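For scale, and assuming approximate early-2021 figures (the Ethash DAG was around 4.2 GB at the time), the working set dwarfs RDNA2's 128 MB Infinity Cache:

```python
# Rough scale comparison, using approximate early-2021 figures
dag_gb = 4.2            # Ethash DAG size at the time (approximate)
cache_gb = 128 / 1024   # RDNA2 Infinity Cache
print(f"The DAG is ~{dag_gb / cache_gb:.0f}x larger than the Infinity Cache")   # ~34x
```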
 