Ethereum Mining Benchmark

"NVIDIA Cards available: 1"

If this test was conducted with just one GPU, then the entire test is wrong and flawed. You do not mine with one GPU, period, and those who have been are just wasting money.

There is a reason people choose AMD over NVIDIA, and it boils down to cost, hashrate, and performance over time. That's something NVIDIA will likely not fix anytime soon unless their new crypto GPUs account for it, and I'm crossing my fingers they do.

These kinds of articles seem to be popping up a lot lately, and anyone who has been mining for a while is laughing at them, for good reason. I am perfectly content using several NVIDIA GPUs, but only as a last resort, or if I am selling a crypto rig off Craigslist.
 

I'm not sure you followed the point of the article; in fact, I know you didn't :D

Obviously we are benchmarking GPU performance; we're not actually mining! We failed if this wasn't blatantly obvious :(
 

You come across as super bitter.

This article seems to deliver exactly the point it intended to.
 

Nothing wrong with the article. Everyone seems to be jumping into Ethereum mining; most of them are gamers with a single powerful card thinking it's quick money, and it looks like they might be right. It's different if you are really into it and devote all your time and resources; then one card is definitely not enough. You have a point, though: I hope someone benchmarks two 1060s, 1070s, RX 570s, or RX 580s. Maybe you can enlighten us about your setup and benchmarks.

Back to the article: the GTX 1070 seems to be the sweet spot. Holy moly, look at the GTX 1080.

I believe the real winners are the people who sold used RX 470s/RX 480s for double the price. I know someone who sold his RX 470 4GB for 500 CAD (385 USD).
 
The 1080 suffers from GDDR5X latency, while the 1070 has the older GDDR5. Yes, the 1080 Ti is also running 'X' memory, but it has many more processing units, so latency isn't an issue there...

My 1080 Ti setup is going strong, although I ignore ETH entirely, with many other altcoins to choose from.
 
While the article is valuable in providing high-level mining performance, I think the meat of the article should have covered more:

-hashrate per watt at stock clocks
-hashrate per watt with an optimized overclock (boost memory speeds / drop core speeds or power usage)
-price per hashrate, stock vs. overclocked
-bonus: modded BIOS vs. stock BIOS

If that's not possible on all models, then maybe just on the common in-market cards like the AMD 5xx series and the NVIDIA 1060/1070.

The per-watt stuff is important, as inefficient cards may require a bigger total investment in pricey high-end power supplies. It also has a greater impact on the profitability model.

I believe the benefit of the Polaris cards is that they can be BIOS-modded to mine much faster with basic memory strap mods, while also cutting power usage to around 80 watts and delivering around 30 MH/s on your benchmark... all while dual-mining another coin at the same time. An unbeatable combo, and the reason they are so sought after.

I ETH mine with an RX 580 and a GTX 1070, by the way, in different systems, when they would otherwise sit idle.
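For what it's worth, those two missing metrics are trivial to tabulate once the measurements exist. A minimal sketch in Python, where every hashrate, wattage, and price figure is an illustrative placeholder rather than a measured value:

```python
# Back-of-envelope efficiency metrics for mining cards.
# All figures below are illustrative placeholders, not measurements.
cards = {
    # name: (hashrate in MH/s, board power in W, street price in USD)
    "RX 580":   (24.0, 185, 300),
    "GTX 1060": (22.0, 120, 280),
    "GTX 1070": (27.0, 150, 400),
}

def efficiency(hashrate_mhs, power_w, price_usd):
    """Return (MH/s per watt, USD per MH/s)."""
    return hashrate_mhs / power_w, price_usd / hashrate_mhs

for name, (h, w, p) in cards.items():
    per_watt, per_mhs = efficiency(h, w, p)
    print(f"{name}: {per_watt:.3f} MH/s/W, ${per_mhs:.2f} per MH/s")
```

For the per-watt column you would want measured wall power rather than rated TDP, since cards rarely mine at their TDP.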
 

Perhaps not, but unlike CrossFireX/SLI, you should get double the hash rate with 2 GPUs, triple with 3 GPUs, and so on, so you can use the chart to get an idea of scaling (i.e. 2 GTX 1060s [6GB] roughly = 1 GTX 1080 Ti).

As for AMD vs. NVIDIA... if you look at where the GPUs fall in terms of both "normal" (i.e. non-crypto-mining) performance and price brackets, I see a lot of similar performance between the AMD and NVIDIA cards: the GTX 1050 Ti is right there with the RX 460/560, the GTX 1060 is right there with the RX 470/570 and right behind the RX 480/580, etc. The only outliers are the R9 390/390X, mainly in the sense that their equivalents weren't tested (i.e. no GTX 960, 970, 980, or 980 Ti cards).

As for performance over time... the graphs also pretty much show that, unless a driver update can fix it, AMD cards are apparently not the long-term solution, as their hash rate drops over time. For example, at DAG 140 an RX 580 is almost as good as a GTX 1070, but by DAG 190 you need two of them just to equal or slightly surpass the same GTX 1070. Cost-wise, two RX 580 GPUs (pre-Ethereum craze) cost about as much as, if not more than, a GTX 1070, but nowadays each one costs as much as or more than a GTX 1070... which means that, from a price perspective, the NVIDIA GPUs are looking like the better long-term investment.
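The linear-scaling point can be made concrete: with no SLI-style overhead, a rig estimate is just a sum over cards. A toy sketch, where the per-card rates are placeholders picked to match the "2 x 1060 ≈ 1 x 1080 Ti" example, not measurements:

```python
# Mining scales (near-)linearly across GPUs: there is no SLI-style
# pairing overhead, so a rig's hashrate is roughly the sum of its cards.
SINGLE_CARD_MHS = {"GTX 1060 6GB": 22.0, "GTX 1080 Ti": 44.0}  # placeholder rates

def rig_hashrate(cards):
    """Estimate total MH/s for a list of card names, assuming linear scaling."""
    return sum(SINGLE_CARD_MHS[c] for c in cards)

# Two 1060s roughly match one 1080 Ti under these placeholder numbers.
print(rig_hashrate(["GTX 1060 6GB"] * 2))  # 44.0
print(rig_hashrate(["GTX 1080 Ti"]))       # 44.0
```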
 
It's apparent from these tests that, over time, RAM quantity will be the largest determining factor in long-term mining performance. Once the DAG eats through your RAM, your card nearly becomes a paperweight. It does look like 8GB is sufficient to sustain a long-term effort, though.
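On the "DAG eats through your RAM" point: the Ethash parameters start the dataset at 1 GiB (2^30 bytes) and grow it by 8 MiB (2^23 bytes) per epoch, so a rough capacity check is easy to script. This sketch ignores the prime-size adjustment in the real spec, so treat the outputs as approximate:

```python
# Approximate Ethash DAG size: ~1 GiB at epoch 0, growing ~8 MiB per epoch.
# (The real size is trimmed down to a prime-related bound, so this slightly
# overestimates, but it is fine for VRAM capacity planning.)
DAG_INIT_BYTES = 2**30    # 1 GiB
DAG_GROWTH_BYTES = 2**23  # 8 MiB per epoch

def dag_gib(epoch):
    """Approximate DAG size in GiB at a given epoch."""
    return (DAG_INIT_BYTES + DAG_GROWTH_BYTES * epoch) / 2**30

def epochs_until_full(vram_gib):
    """First epoch at which the DAG no longer fits in vram_gib of memory."""
    epoch = 0
    while dag_gib(epoch) <= vram_gib:
        epoch += 1
    return epoch

print(f"DAG at epoch 140: {dag_gib(140):.2f} GiB")
print(f"4 GiB card falls off around epoch {epochs_until_full(4)}")
print(f"8 GiB card falls off around epoch {epochs_until_full(8)}")
```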

I'm still not clear on why the 1080's performance is worse than the 1070's while the 1080 Ti is far better than the 1070. Clearly the 1080 Ti SHOULD outperform the 1070, as it has better memory bandwidth and more RAM; but the 1080 has the same RAM bandwidth, and its GPU is plenty fast enough to handle the throughput, so it should also be faster than the 1070, all other things being equal.

I'm wondering if NVIDIA will drop a driver update, once their mining GPUs hit the market, that patches whatever software glitch is holding the 1080 back right now. Their new headless units will depend on higher mining performance for marketability, and it is in NVIDIA's interest to ensure the P104 cards perform at peak mining capacity.

Only time will tell, I guess... and that time will probably arrive right about when the GTX 1070 supply dries up ;-)

In the meantime, it's all 1080 Tis for me going forward, for long-term mining performance.
 
The performance degradation of the RX cards is a serious blow for the mining farms that run them.
Losing up to 35% of performance over 50 DAG epochs, or 7 months or so, is no joke. AFAIK a DAG epoch is 100 hours long.
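That timeline is easy to sanity-check: at 30,000 blocks per epoch and roughly 12 seconds per block, an epoch is indeed about 100 hours, so 50 epochs works out to about 7 months:

```python
# Sanity-check the "50 epochs is 7 months or so" claim, assuming
# ~100 hours per epoch (30,000 blocks at roughly 12 s per block).
HOURS_PER_EPOCH = 100

def epochs_to_days(epochs, hours_per_epoch=HOURS_PER_EPOCH):
    """Convert a number of DAG epochs to calendar days."""
    return epochs * hours_per_epoch / 24

days = epochs_to_days(50)
print(f"50 epochs is about {days:.0f} days, or {days / 30.4:.1f} months")
```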

It looks like, for now, GP10x is the way it's meant to be mined.
The true winners are the 1070 and the 1060 6GB, while the 1080 and 1080 Ti / Titan Xp sit well under their maximum performance. That X in the VRAM must really hurt.

@dbracer
I agree about the missing hashrate/watt and hashrate/price charts, which are the most important data for profitability and ROI. But optimized overclocks and BIOS modding are very time-consuming, risky, and inconsistent across brands and versions.

@Steve
Very good article, really informative; IMO it should be part of (or a continuation of) the previous guide to Ethereum.
No benchmark on GM20x is a pity, since many are bragging about the hashrate they get with the 347.88 drivers.
I hope you have power-consumption data, like % TDP, to calculate the efficiency for a hashrate/watt graph.

PS: There is a typo in the last chart; on the last two lines the value is 0.
 
God, I can't wait for this whole thing to crash.
I think one crypto did crash earlier this week. I can't stand that these miners are buying up all the cards while we gamers get screwed by resellers! An R9 390 can go for 3 to 4 times its original price because these aholes are buying them up instead of getting a real job! Then they sell the cards without explicitly telling buyers that they were used for mining. I know two guys who unknowingly bought cards on eBay, and their cards lasted just long enough to run out of warranty coverage from eBay and/or PayPal! Sure, not all miners do this, but it only takes one or two to screw eBay gamers, and then they all pay. I also notice that Newegg is selling cards at almost double last year's prices.
 

I'm not sure you followed the point of the article; in fact, I know you didn't :D

Obviously we are benchmarking GPU performance; we're not actually mining! We failed if this wasn't blatantly obvious :(


lol, DUH... I saw it; he did not!
 

Look at the first screenshot.

You come across as super bitter.

This article seems to deliver exactly the point it intended to.


HAHAHAHA, if the point was meant to spread more misinformation. And you'd better believe I am bitter. Ever since these *****s on Wall Street started spreading this BS, the quality of farms has decreased. At this point I am just running 3 rigs on my own.

READ http://wccftech.com/ethereum-mining-gpu-performance-roundup/
READ http://www.legitreviews.com/best-gpu-ethereum-mining-nvidia-amd-tested_195229

...if you want actual, legitimate information.
 
Nice article. For those unsatisfied with single-card results, here are 2 x GTX 1060 6GB on ethminer:
http://imgur.com/KAx8LZT

Everyone has their own preferences on what's useful in mining benchmarks, as already mentioned, but if the article were that in-depth, they would only be able to cover a few cards at most without taking all week.

I, for one, like to see GPU teardowns, VRM location/cooling, load temp measurements, etc.
But for a simple "this is how these cards perform and what you can expect in the future", the article was great. Even if the ethminer hashrates on tweaked cards are a few MH/s higher, you can still get a rough idea of the kind of drop to expect no matter what your cards are running at, and the article delivers in that regard.

Cheers
 

I'm not sure you followed the point of the article; in fact, I know you didn't :D

Obviously we are benchmarking GPU performance; we're not actually mining! We failed if this wasn't blatantly obvious :(


Yes, it is very much not accurate :). Sorry if it comes off as mean, but there are a bunch of these lone wolves joining pools I'm in and killing my returns. The single-GPU issue aside, accurate results would require a month of testing, as the hashrate and power consumption increase each day.

It also doesn't help that I am a long-time NVIDIA investor, and Wall Street analysts who have NO CLUE what they are talking about are setting NVIDIA up for an inevitable brick wall when GPU sales don't live up to the hype :(.
 

Oh, and I forgot to add: the main reason GDDR5 hinders GTX performance is the bandwidth usage along with the amount of voltage needed. (This is not a knock on NVIDIA; it's actually the primary reason they're so powerful in the gaming market. But their GDDR5 tech was never meant for continuous performance at low power usage.) We'll have to see what Micron's GDDR6 looks like.

This article explains it a bit better: http://blog.logicalincrements.com/2017/02/types-vram-explained-hbm-vs-gddr5-vs-gddr5x/ AMD uses HBM and NVIDIA uses GDDR5/GDDR5X.
 
"NVIDIA Cards available: 1"
You do not mine with one GPU, period, and those who have been are just wasting money.

Wrong. I'm earning £3.30 per day with an i7-3930K and a GTX 1060 6GB, at a cost of 63p per day in electricity. You also say NVIDIA doesn't utilize power well; I disagree. You can tweak miner configs and the NVIDIA control panel to find a great balance.

Now, you may say that amount is nothing, but that's free money that will gain interest as cryptos gain value, and it will keep doing so until I buy more GPUs.
 

It's not "free money". How much did the rig cost you, and would you have it if you weren't mining? If so, have you accounted for the fact that it'll burn out quicker because of mining, and thus will need replacing sooner and cost more as a result?
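That objection can be put in numbers: net daily profit has to amortize the hardware before anything is "free". A minimal sketch using the £3.30/day revenue and 63p/day electricity figures quoted above; the £300 card price is a placeholder, and difficulty growth, coin-price swings, and wear are deliberately ignored:

```python
# "Free money" only starts after the hardware pays for itself: compute the
# payback period from daily revenue, daily electricity cost, and up-front cost.
def payback_days(daily_revenue, daily_power_cost, hardware_cost):
    """Days of mining needed to recoup the hardware.

    Ignores difficulty growth, coin-price swings, and wear.
    Returns None if the rig never breaks even.
    """
    net = daily_revenue - daily_power_cost
    if net <= 0:
        return None
    return hardware_cost / net

# GBP figures from the post above; the 300 GBP card price is a placeholder.
days = payback_days(3.30, 0.63, 300.0)
print(f"~{days:.0f} days to break even")  # ~112 days, before difficulty rises
```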
 
I have an RX 580 8GB. I just happened to be building a new PC the day the RX 580 was released. After a week or two, I saw that the card I picked was going up in price due to the "cryptocurrency bubble". I never knew anything about mining previously, other than hearing the term (I actually still don't, lol). So I did a Google search on how to get started; it pointed me to NiceHash. I installed it and hit start. I have been letting it run when the PC would otherwise be idle (during the day when I'm at work and while I'm sleeping). Since it's not full time, I probably make $15-$20/wk. I remember thinking, "wow, this is great, FREE MONEY!" There were a couple of little things to get used to... my bachelor pad/studio apartment seemed like it was always hot and sticky. No biggie, I'll just crank the A/C...
Well, a month later, my stance on FREE MONEY is beginning to change. I have received 2 Bitcoin payments into my wallet, each of about $33 ($66.xx total). I have also just received my electric bill: $228.54! My last electric bill was $44, and that was with no mining and no A/C.
My thinking today is much different. NiceHash gives you an estimate of daily profits. When I started, it was telling me I would average about $4.50 a day. Since then it has gone down to $2.50 a day. I don't understand if this is because the difficulty is getting harder or the price of the coins is dropping, but I'm coming to the conclusion that "free money" doesn't exist. I haven't decided what to do yet. I'm leaning towards either selling the RX 580 8GB for whatever profit I can squeeze out, or stopping mining altogether during the heat of the summer and maybe looking at it again around the time it cools off and I need to heat my tiny apartment anyway.
 
So wrong. Most miners mod their AMD cards; you should take that into consideration, because a 10-minute task can boost your 19 MH/s RX 470 to 28 MH/s.
 

Agreed: performance per watt is the most important parameter for a lot of miners out there.
 


AMD fixed their drivers: no more DAG slowdown.
 