Nvidia's GTX 2070 and 2080 cards could be unveiled next month, launch late April

I have no problem with Nvidia mildly tweaking Pascal, then just shrinking it on a refined process. Dump those 5,000+ Volta cores from the Titan V onto a chip, minus the deep learning stuff, for the high-end GPUs.

So you get a couple of cards: one equal to and one a bit faster than the GTX 1080 Ti at the top of the range. GTX 1080 performance becomes mid-range at 150 W TDP or whatever, while the GTX 1060 refresh becomes low end at ~100 W.

All this sounds perfectly great to me, because it's not like AMD have anything good coming for the next 12 months, let's face it!

The only thing anyone really wants is for the prices to reflect a normal generational shift (i.e. they all drop down by a big chunk), which they probably won't.
 
Hmm, I did a bit of reading about that. I would actually say it's absurd to suggest that DX12 would not have existed without Mantle, but yes, it does appear that a lot of the groundwork for Vulkan and DX12 was done by Mantle. It actually looks more like AMD tried to make these developments proprietary to AMD, which is a bit of a **** move. Still, Mantle was killed off well over 3 years ago; it's hardly a recent innovation. One thing I can think of, though, is FreeSync. I love FreeSync, mostly because it's free. Adaptive sync tech for me isn't a game changer or anything spectacular, but at least AMD don't charge for it!
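For context, the whole point of Mantle/DX12/Vulkan is cutting the CPU cost per draw call so the CPU stops being the bottleneck. A toy back-of-envelope with completely made-up per-call costs, just to show the shape of it:

```python
# Made-up illustrative figures; real per-draw-call CPU costs vary hugely by driver, engine and scene.
frame_budget_ms = 1000.0 / 60          # ~16.7 ms per frame at 60 fps
cpu_share_for_draws = 0.5              # assume half the frame's CPU time goes to issuing draw calls

old_api_cost_ms = 0.050                # assumed driver/runtime cost per draw call, "DX11-style"
explicit_api_cost_ms = 0.010           # assumed cost per call with a thin, explicit API

budget = frame_budget_ms * cpu_share_for_draws
print(f"old-style API: ~{budget / old_api_cost_ms:.0f} draw calls per frame")
print(f"explicit API:  ~{budget / explicit_api_cost_ms:.0f} draw calls per frame")
# The explicit model also lets those calls be recorded on multiple CPU threads,
# which is the other half of the pitch.
```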

Vulkan had all of its groundwork done by Mantle; DirectX 12 had almost everything. It's quite easy to prove: Microsoft had Mantle-capable hardware in the Xbox One. If Microsoft really had something like DirectX 12 in mind, they would surely have used it with the Xbox One immediately. Yet Microsoft didn't say a word about anything DirectX 12 related when the Xbox One launched. Microsoft used Mantle to create DirectX 12, and only well after the Xbox One launch did they say the console would get DirectX 12 later. So first came the Xbox One, then came DirectX 12. If Microsoft had developed DirectX 12 without Mantle, they would have launched it with the Xbox One.

Mantle, proprietary? Name some examples of AMD's proprietary technologies from recent years.

Agreed about Freesync.

All this sounds perfectly great to me, because it's not like AMD have anything good coming for the next 12 months, let's face it!

7nm Vega? Current Vega's weakest point is low clock speed, and TSMC's 7nm process brings AMD on par with Nvidia's manufacturing tech.
 
Vulkan had all of its groundwork done by Mantle; DirectX 12 had almost everything. It's quite easy to prove: Microsoft had Mantle-capable hardware in the Xbox One. If Microsoft really had something like DirectX 12 in mind, they would surely have used it with the Xbox One immediately. Yet Microsoft didn't say a word about anything DirectX 12 related when the Xbox One launched. Microsoft used Mantle to create DirectX 12, and only well after the Xbox One launch did they say the console would get DirectX 12 later. So first came the Xbox One, then came DirectX 12. If Microsoft had developed DirectX 12 without Mantle, they would have launched it with the Xbox One.

Mantle, proprietary? Name some examples of AMD's proprietary technologies from recent years.

Agreed about Freesync.



7nm Vega? Current Vega's weakest point is low clock speed, and TSMC's 7nm process brings AMD on par with Nvidia's manufacturing tech.
Why do you want more examples of AMD's proprietary tech? Are you trying to claim that Mantle wasn't proprietary? It was AMD trying to corner the market, like any large money-grabbing corporation would.

I do think you're naive if you think DX12 wouldn't exist without Mantle. The same goes for Vulkan, which really was the successor to OpenGL. Both took big parts from Mantle, but crediting AMD with the existence of those APIs is only something a proper AMD fan would attempt. Really, it would be more accurate for AMD to thank Microsoft and the Vulkan dev team for incorporating tech that favours its cards.

You say you agree about FreeSync, but I don't think you do; really, AMD just innovated on price. Nvidia were first up with adaptive sync and they implemented it better; AMD just copied it, probably to stay competitive. I don't think G-Sync is worth the premium, though, and I do think FreeSync is a great free addition.

And Hardreset, we all know you bleed for AMD for some reason. In fact, why do you love that company so much?
 
Why do you want more examples of AMD's proprietary tech? Are you trying to claim that Mantle wasn't proprietary? It was AMD trying to corner the market, like any large money-grabbing corporation would.

I do think you're naive if you think DX12 wouldn't exist without Mantle. The same goes for Vulkan, which really was the successor to OpenGL. Both took big parts from Mantle, but crediting AMD with the existence of those APIs is only something a proper AMD fan would attempt. Really, it would be more accurate for AMD to thank Microsoft and the Vulkan dev team for incorporating tech that favours its cards.

AMD has been very favourable to open source, so I would like to know about AMD's proprietary tech. If you cannot name any, why do you think Mantle was supposed to be proprietary? Especially when AMD gladly gave it away and shared the source code immediately :cool:

As I said, Vulkan took everything from Mantle. Everything. And as for Microsoft, it would just have been stupid to launch the Xbox One without DirectX 12 if they really had something similar.

You say you agree about FreeSync, but I don't think you do; really, AMD just innovated on price. Nvidia were first up with adaptive sync and they implemented it better; AMD just copied it, probably to stay competitive. I don't think G-Sync is worth the premium, though, and I do think FreeSync is a great free addition.

And Hardreset, we all know you bleed for AMD for some reason. In fact, why do you love that company so much?

Adaptive sync = FreeSync. Same thing, different names. Nvidia have never supported adaptive sync. Also, AMD did not copy anything, as AMD's tech works without a proprietary scaler (which of course costs money).

I suggest you do your homework before claiming I favour AMD.
 
I'm very excited to see the new line of Nvidia cards.

Can't wait to see what the replacements for the 1080 Ti and the Titan Xp are.

I'll trade my TitanXp for cash towards the next Titan just like I did before.
 
:) Only a fool would buy an exhausted graphics card that has run at full load 24 hours a day while being suffocated like a canned sardine; the buyer would be lucky if it kept working for a week or two before the final gasp.
As long as you keep thermals in check, the idea that you can work a card to death is a myth. Also, all of my cards are underclocked and undervolted: a 30% reduction in power with less than a 5% loss in hash rate.
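Quick back-of-envelope on why that trade-off pays off (illustrative numbers only, not readings from any specific card):

```python
# Illustrative numbers only; not measurements from any specific card.
stock_power_w  = 250.0   # assumed stock board power (W)
stock_hashrate = 30.0    # assumed stock hash rate (MH/s)

tuned_power_w  = stock_power_w  * (1 - 0.30)   # ~30% less power from the undervolt/underclock
tuned_hashrate = stock_hashrate * (1 - 0.05)   # ~5% lower hash rate

stock_eff = stock_hashrate / stock_power_w     # MH/s per watt
tuned_eff = tuned_hashrate / tuned_power_w

print(f"stock: {stock_eff:.3f} MH/s per W")
print(f"tuned: {tuned_eff:.3f} MH/s per W")
print(f"efficiency gain: {(tuned_eff / stock_eff - 1) * 100:.0f}%")   # roughly +36% MH/s per watt
```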

I cook my 1080ti and 1070 at 69C all week. The secret to properly baking a GPU is to slow-cook it at a very low temperature over a long period of time so that the die just falls off the PCB when the used buyer gets it. If you do what most gamers do, and leave it running at 82 degrees all the time, the card comes out tough and unsavory. Learned this from a pit master.
 
AMD has been very favourable to open source, so I would like to know about AMD's proprietary tech. If you cannot name any, why do you think Mantle was supposed to be proprietary? Especially when AMD gladly gave it away and shared the source code immediately :cool:

As I said, Vulkan took everything from Mantle. Everything. And as for Microsoft, it would just have been stupid to launch the Xbox One without DirectX 12 if they really had something similar.



Adaptive sync = FreeSync. Same thing, different names. Nvidia have never supported adaptive sync. Also, AMD did not copy anything, as AMD's tech works without a proprietary scaler (which of course costs money).

I suggest you do your homework before claiming I favour AMD.
G-Sync is a form of adaptive sync that's generally specified higher. It's like saying a Mercedes and a BMW aren't the same thing; sure they aren't, but they are both still cars. I would pick G-Sync over FreeSync because it does work better, but I wouldn't pay a premium for either.

I have done my reading, and Mantle was open source, but to suggest that Nvidia adopting Mantle wouldn't have given AMD some sort of advantage is absurd. In fact, reading between the lines, it's obvious that Nvidia didn't support it because it doesn't give Nvidia cards an advantage, and that's important when you are a giant corporation with shareholders. Many game developers had already snubbed Mantle. Once many of the benefits of Mantle were employed by Vulkan and DX12, Nvidia would have been at a disadvantage ignoring them, so they are supported, and AMD cards tend to feel the benefit.

You didn't answer the question I most wanted answered, though: why all the love for AMD? They are just another chip corporation. I've owned cards from both manufacturers over the years; more have been AMD/ATI cards than Nvidia parts. However, in 2018 I really don't see why anyone outside of the mining community would opt for an AMD card over an Nvidia card, mostly due to pricing. As I mentioned earlier, I have recently acquired a 1080 Ti, and I would have opted for a Vega 64 if it had been available for a reasonable price, but it's not and won't be for a while yet, I don't think. One thing that does bother me about AMD recently was the Vega fiasco: a remarkably overpriced Frontier Edition for early access, and then the fake launch price introduced to get reviewers to produce reviews lauding the card for its value for money. Actions like this don't leave a good taste in the mouth. Still, if their cards can be had for a decent price then I wouldn't hesitate to buy them. Nvidia aren't much better, and even if they were, only the fans and journalists care about this sort of behaviour.

Would you pick a Vega 64 over a 1080ti at today’s pricing?
 
7nm Vega? Current Vega's weakest point is low clock speed, and TSMC's 7nm process brings AMD on par with Nvidia's manufacturing tech.

7nm Vega is supposed to be sampling in 2018, but the word is it's not a consumer series. It doesn't look due for at least 6 months either, and when they do drag it out the door it'll probably end up compute-focused.

Assuming the best case and they DO dish out a 7nm consumer part late this year: an overclocked Vega 64 Liquid struggles against a mere overclocked GTX 1080 in games. It still loses on the tests I have seen. Forget about the present top-end 1080 Ti! How much clock speed do you think they could even gain on 7nm for mass-produced regular consumer parts? Certainly not much beyond a max-OC liquid-cooled Vega 64, one imagines.

So even if we wildly assume best-case scenarios (a consumer part is actually planned, it won't be delayed, it samples well), it still isn't going to put AMD in the game much above mid-range any time soon. Seems very likely to me that GTX 1080 performance or so will end up mid-range by the summer.
 
G-Sync is a form of adaptive sync that's generally specified higher. It's like saying a Mercedes and a BMW aren't the same thing; sure they aren't, but they are both still cars. I would pick G-Sync over FreeSync because it does work better, but I wouldn't pay a premium for either.

G-Sync is not a form of adaptive sync. Both do basically the same thing but are different technologies. And generally G-Sync is not "better" or "specified higher" than FreeSync; it's just more expensive.

I have done my reading, and Mantle was open source, but to suggest that Nvidia adopting Mantle wouldn't have given AMD some sort of advantage is absurd. In fact, reading between the lines, it's obvious that Nvidia didn't support it because it doesn't give Nvidia cards an advantage, and that's important when you are a giant corporation with shareholders. Many game developers had already snubbed Mantle. Once many of the benefits of Mantle were employed by Vulkan and DX12, Nvidia would have been at a disadvantage ignoring them, so they are supported, and AMD cards tend to feel the benefit.

Nvidia wants to create its own technologies, so it's not surprising Nvidia didn't want to use Mantle. As already stated, there was no need for Mantle after DirectX 12 and Vulkan. Game developers snubbed Mantle? Game developers were the ones who wanted Mantle (y)

You didn't answer the question I most wanted answered, though: why all the love for AMD? They are just another chip corporation. I've owned cards from both manufacturers over the years; more have been AMD/ATI cards than Nvidia parts. However, in 2018 I really don't see why anyone outside of the mining community would opt for an AMD card over an Nvidia card, mostly due to pricing. As I mentioned earlier, I have recently acquired a 1080 Ti, and I would have opted for a Vega 64 if it had been available for a reasonable price, but it's not and won't be for a while yet, I don't think. One thing that does bother me about AMD recently was the Vega fiasco: a remarkably overpriced Frontier Edition for early access, and then the fake launch price introduced to get reviewers to produce reviews lauding the card for its value for money. Actions like this don't leave a good taste in the mouth. Still, if their cards can be had for a decent price then I wouldn't hesitate to buy them. Nvidia aren't much better, and even if they were, only the fans and journalists care about this sort of behaviour.

Would you pick a Vega 64 over a 1080ti at today’s pricing?

What was "fake launch price"? So if AMD sells cards to retailers for price X and retailer asks price 2X, then it's AMD's fault that price is high? It just happened because AMD cards are so awesome for mining and prices tend to get high if there is much demand.

I wouldn't pick any mid end or high end graphic card on today's pricing.

7nm Vega is supposed to be sampling in 2018, but the word is it's not a consumer series. It doesn't look due for at least 6 months either, and when they do drag it out the door it'll probably end up compute-focused.

Assuming the best case and they DO dish out a 7nm consumer part late this year: an overclocked Vega 64 Liquid struggles against a mere overclocked GTX 1080 in games. Forget about the present top-end 1080 Ti! How much clock speed do you think they could even gain on 7nm for mass-produced regular consumer parts? Certainly not much beyond a max-OC liquid-cooled Vega 64, one imagines.

So even if we wildly assume best-case scenarios (a consumer part is actually planned, it won't be delayed, it samples well), it still isn't going to put AMD in the game much above mid-range any time soon. Seems very likely to me that GTX 1080 performance or so will end up mid-range by the summer.

You seem to forget that 14nm LPP is not suited to high clock speeds, and probably not to big-die chips either. I doubt the same applies to TSMC's 7nm process. Even if it does, Nvidia will have the same problems.
 
I cook my 1080ti and 1070 at 69C all week. The secret to properly baking a GPU is to slow-cook it at a very low temperature over a long period of time so that the die just falls off the PCB when the used buyer gets it. If you do what most gamers do, and leave it running at 82 degrees all the time, the card comes out tough and unsavory. Learned this from a pit master.
You, sir, win all of my internetz for the month
 
G-Sync is not a form of adaptive sync. Both do basically the same thing but are different technologies. And generally G-Sync is not "better" or "specified higher" than FreeSync; it's just more expensive.



Nvidia wants to create its own technologies, so it's not surprising Nvidia didn't want to use Mantle. As already stated, there was no need for Mantle after DirectX 12 and Vulkan. Game developers snubbed Mantle? Game developers were the ones who wanted Mantle (y)



What was "fake launch price"? So if AMD sells cards to retailers for price X and retailer asks price 2X, then it's AMD's fault that price is high? It just happened because AMD cards are so awesome for mining and prices tend to get high if there is much demand.

I wouldn't pick any mid end or high end graphic card on today's pricing.



You seem to forget that 14nm LPP is not suited to high clock speeds, and probably not to big-die chips either. I doubt the same applies to TSMC's 7nm process. Even if it does, Nvidia will have the same problems.
Are you genuinely unaware of the Vega launch fiasco? AMD told reviewers and vendors that the cards were $550 at launch with a $100 rebate from AMD to the vendors, making them $450. This rebate was terminated by AMD, I believe on day one of Vega's release, conveniently after the reviews went out. Watch OC3D's Vega 64 review to find out more. I do find it amusing that you have already tried to make the point that if it's not specifically AMD's fault then it's OK. No it's not: if the price hike wasn't AMD's fault, consumers are still getting a bad deal. Only fans care whose fault it is. Although in this case it was absolutely AMD's fault; they specifically intended to mislead reviewers. Tiny Tom Logan from OC3D is not the only reviewer to express frustration about it.

Oh, and G-Sync is better. I have a FreeSync monitor, and FreeSync only kicks in from 45 fps upwards; it has an upper limit too, but I never found it (AMD cards don't go fast enough at 4K, it seems!). G-Sync, by contrast, is on from frame zero. This definitely makes G-Sync the superior tech, especially as its benefits are often felt at lower frame rates.
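For what it's worth, the floor behaviour I'm describing can be sketched roughly like this. This is only a conceptual sketch of repeating frames below a panel's variable-refresh floor (the way AMD's LFC and the G-Sync module are usually described as behaving), and the 48-144 Hz window is a made-up example, not my monitor's spec:

```python
# Conceptual sketch only; not AMD's or Nvidia's actual driver logic.
PANEL_MIN_HZ = 48.0    # assumed variable-refresh floor of the panel
PANEL_MAX_HZ = 144.0   # assumed ceiling

def refresh_for_fps(fps: float) -> float:
    """Pick a panel refresh rate for a given frame rate."""
    if fps >= PANEL_MIN_HZ:
        # Inside the window: refresh simply tracks the frame rate, capped at the ceiling.
        return min(fps, PANEL_MAX_HZ)
    # Below the floor: repeat each frame enough times that the effective
    # refresh lands back inside the supported window (frame doubling/tripling).
    multiplier = 2
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    return fps * multiplier

for fps in (30, 40, 60, 100, 160):
    print(f"{fps:>3} fps -> refresh {refresh_for_fps(fps):.0f} Hz")
```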

Finally, a fair number of game developers snubbed Mantle before Nvidia did. Why spend money developing for an API that only a tiny percentage of the market has access to? Google it; it's probably one of the reasons Nvidia felt they could ignore it at the time. I had AMD cards back then and I can't remember more than one or two titles employing it, and it didn't mean much as my CPU was up to the task.

So, Hardreset, which card would you pick at today’s pricing? Vega 64 or a 1080ti?
 
You seem to forget that 14nm LPP is not suited to high clock speeds, and probably not to big-die chips either. I doubt the same applies to TSMC's 7nm process. Even if it does, Nvidia will have the same problems.

I didn't forget; you're just not addressing the actual point. A Vega 64 at 1750 MHz core still can't beat even an overclocked GTX 1080 in games, and at that point it's drawing 500 watts versus under 300 for the Nvidia card.
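The clocks-versus-power point is basically the usual dynamic-power relationship: power scales roughly with frequency times voltage squared, and chasing clocks on a given process usually means more voltage too. A rough illustration with made-up figures, not actual Vega or GTX 1080 measurements:

```python
# Rough dynamic-power scaling: P is roughly proportional to clock * voltage^2.
# Illustrative values only, not real measurements of any card.
base_power_w = 300.0   # assumed board power at the baseline clock and voltage

def scaled_power(clock_mult: float, volt_mult: float) -> float:
    """Dynamic power grows ~linearly with clock and ~quadratically with voltage."""
    return base_power_w * clock_mult * volt_mult ** 2

# Chasing ~15% more clock on the same process usually needs extra voltage too (say +15%):
print(f"{scaled_power(1.15, 1.15):.0f} W")   # ~456 W: power climbs far faster than performance
# A node shrink that holds the higher clock at the old voltage claws a lot of that back:
print(f"{scaled_power(1.15, 1.00):.0f} W")   # ~345 W
```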

If you think 7nm's power characteristics are going to let Vega's core speed climb so far beyond this that it challenges anything above Nvidia's mid-range lineup (coming sooner this year), you are living in la-la land. It'll still probably end up costing more to produce as well.

It'll just be a re-run of GTX 1060 vs RX 480: 7nm Vega would get thoroughly stomped sales-wise by Nvidia, assuming, as I pointed out, it even makes it to market for Joe Consumer...

AMD are absolutely nowhere in 2018 for consumer graphics.
 
Are you genuinely unaware of the Vega launch fiasco? AMD told reviewers and vendors that the cards were $550 at launch with a $100 rebate from AMD to the vendors, making them $450. This rebate was terminated by AMD, I believe on day one of Vega's release, conveniently after the reviews went out. Watch OC3D's Vega 64 review to find out more. I do find it amusing that you have already tried to make the point that if it's not specifically AMD's fault then it's OK. No it's not: if the price hike wasn't AMD's fault, consumers are still getting a bad deal. Only fans care whose fault it is. Although in this case it was absolutely AMD's fault; they specifically intended to mislead reviewers. Tiny Tom Logan from OC3D is not the only reviewer to express frustration about it.

There was no fiasco; it's all Nvidia fanboy talk. AMD built up plenty of cards for launch, and they actually sold at the price AMD quoted. Too bad miners immediately bought everything, and when that launch stock was gone, AMD raised the price.

I cannot see any problem there. Because there isn't.

Oh, and G-Sync is better. I have a FreeSync monitor, and FreeSync only kicks in from 45 fps upwards; it has an upper limit too, but I never found it (AMD cards don't go fast enough at 4K, it seems!). G-Sync, by contrast, is on from frame zero. This definitely makes G-Sync the superior tech, especially as its benefits are often felt at lower frame rates.

FreeSync technology has no 45 Hz minimum limit. The problem is your monitor, not FreeSync.

Finally, a fair number of game developers snubbed Mantle before Nvidia did. Why spend money developing for an API that only a tiny percentage of the market has access to? Google it; it's probably one of the reasons Nvidia felt they could ignore it at the time. I had AMD cards back then and I can't remember more than one or two titles employing it, and it didn't mean much as my CPU was up to the task.

So, Hardreset, which card would you pick at today’s pricing? Vega 64 or a 1080ti?

Snubbed, how? Using the same logic, even today many game developers snub Vulkan and DirectX 12, and some even snub DirectX 11. Again, Nvidia wants to use its own technologies. No wonder they rarely use anything AMD (or Intel) has developed.

I didn't forget; you're just not addressing the actual point. A Vega 64 at 1750 MHz core still can't beat even an overclocked GTX 1080 in games, and at that point it's drawing 500 watts versus under 300 for the Nvidia card.

If you think 7nm's power characteristics are going to let Vega's core speed climb so far beyond this that it challenges anything above Nvidia's mid-range lineup (coming sooner this year), you are living in la-la land. It'll still probably end up costing more to produce as well.

It'll just be a re-run of GTX 1060 vs RX 480: 7nm Vega would get thoroughly stomped sales-wise by Nvidia, assuming, as I pointed out, it even makes it to market for Joe Consumer...

AMD are absolutely nowhere in 2018 for consumer graphics.

Can't beat it in some old DirectX 11 titles or Nvidia GameWorks crap; the situation is different in modern games. The power draw is mostly down to the manufacturing tech: 14nm LPP just isn't suited to high clock speeds. You can figure out the rest yourself.

At least with 7nm the difference will be much smaller, if both Nvidia and AMD only do die shrinks. We still don't know whether Nvidia's upcoming cards are anything more than that. Vega of course costs more to produce, as it's a much more advanced card. That's why miners want it. It's that simple.

AMD sells every Vega it can produce, so there's not much AMD could do better right now sales-wise.
 
Can't beat it in some old DirectX 11 titles or Nvidia GameWorks crap; the situation is different in modern games. The power draw is mostly down to the manufacturing tech: 14nm LPP just isn't suited to high clock speeds. You can figure out the rest yourself.

At least with 7nm the difference will be much smaller, if both Nvidia and AMD only do die shrinks. We still don't know whether Nvidia's upcoming cards are anything more than that. Vega of course costs more to produce, as it's a much more advanced card. That's why miners want it. It's that simple.

AMD sells every Vega it can produce, so there's not much AMD could do better right now sales-wise.

Wrong again; it's a fallacy that AMD dominate modern games. Nineteen months after the GTX 1060 launched, it is still faster than the RX 480 on average. Why do you think the RX 580 exists while Nvidia didn't bother revising up the clocks on the GTX 1060?

https://babeltechreviews.com/rx-veg...e-overclocking-showdown-vs-the-gtx-1080-fe/3/
In these tests an overclocked GTX 1080 wins the majority against a max-overclocked RX Vega 64 Liquid, and when it does lose it's typically much closer to the AMD card. The DX12 tests are split, but when the AMD card loses, it often loses badly in the other recent games. That is how tragically CRAP AMD Vega actually is for gamers.
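When I say "on average", that's normally worked out as a geometric mean of per-game results, so one lopsided title can't drag the summary around. A toy example with made-up frame rates:

```python
from math import prod

# Made-up per-game average FPS for two cards across the same test suite.
card_a = {"Game 1": 62, "Game 2": 88, "Game 3": 54, "Game 4": 120}
card_b = {"Game 1": 58, "Game 2": 95, "Game 3": 49, "Game 4": 112}

# Geometric mean of the per-game ratios: one blowout result can't skew the
# summary the way it would in a plain arithmetic average.
ratios = [card_a[g] / card_b[g] for g in card_a]
geomean = prod(ratios) ** (1 / len(ratios))

print(f"Card A vs Card B, geometric mean of per-game ratios: {geomean:.3f}x")
```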

If shrunk to 7nm, it'll still end up being a much larger die than an equivalent Nvidia part, still use more power, and still likely struggle to clock enough higher to make a difference.

You're a crazy fanboy, that much is clear now. Vega is frankly a pile of junk for gamers, and assuming AMD actually bother with a 7nm consumer version in 6 or 9 months, it won't be competing with Nvidia's higher-end cards due out in just a few months. It'll probably struggle to find much of a spot against mid-range cards, where power efficiency and cost are key. AMD themselves likely know this, which explains why they haven't even confirmed a consumer version; it might not be financially worth it and wouldn't capture much market share.

Finally, I am sure they sell every Vega they make, but then they don't produce many Vegas, because it isn't a good business model to lose money on each card you sell...
 
Wrong again; it's a fallacy that AMD dominate modern games. Nineteen months after the GTX 1060 launched, it is still faster than the RX 480 on average. Why do you think the RX 580 exists while Nvidia didn't bother revising up the clocks on the GTX 1060?

https://babeltechreviews.com/rx-veg...e-overclocking-showdown-vs-the-gtx-1080-fe/3/
In these tests an overclocked GTX 1080 wins the majority against a max-overclocked RX Vega 64 Liquid, and when it does lose it's typically much closer to the AMD card. The DX12 tests are split, but when the AMD card loses, it often loses badly in the other recent games. That is how tragically CRAP AMD Vega actually is for gamers.

AMD seem to dominate in Vulkan and DirectX 12, except in titles stuffed with Nvidia crap (Hitman and ROTR, for example). That's exactly what I said: AMD is for modern games, Nvidia for old ones that nobody cares about.

If shrunk to 7nm, it'll still end up being a much larger die than an equivalent Nvidia part, still use more power, and still likely struggle to clock enough higher to make a difference.

You're a crazy fanboy, that much is clear now. Vega is frankly a pile of junk for gamers, and assuming AMD actually bother with a 7nm consumer version in 6 or 9 months, it won't be competing with Nvidia's higher-end cards due out in just a few months. It'll probably struggle to find much of a spot against mid-range cards, where power efficiency and cost are key. AMD themselves likely know this, which explains why they haven't even confirmed a consumer version; it might not be financially worth it and wouldn't capture much market share.

Finally, I am sure they sell every Vega they make, but then they don't produce many Vegas, because it isn't a good business model to lose money on each card you sell...

It will clock higher, relative to Nvidia, than current Vega does.

It's possible that AMD won't bother to release a 7nm Vega, but that remains to be seen. But if what you say is true (Nvidia's current offerings crush AMD's cards), then why would Nvidia be so stupid as to plan new high-end cards? Yeah, why? If your current product is awesome against the competition, then you have no need to launch anything new. So if Nvidia launches new cards to replace the current ones, then Nvidia is not as far ahead as you seem to think.

Source for the claim that AMD is right now losing money on every Vega sold?
 
There was no fiasco; it's all Nvidia fanboy talk. AMD built up plenty of cards for launch, and they actually sold at the price AMD quoted. Too bad miners immediately bought everything, and when that launch stock was gone, AMD raised the price.

I cannot see any problem there. Because there isn't.



FreeSync technology has no 45 Hz minimum limit. The problem is your monitor, not FreeSync.



Snubbed, how? Using the same logic, even today many game developers snub Vulkan and DirectX 12, and some even snub DirectX 11. Again, Nvidia wants to use its own technologies. No wonder they rarely use anything AMD (or Intel) has developed.



Can't beat it in some old DirectX 11 titles or Nvidia GameWorks crap; the situation is different in modern games. The power draw is mostly down to the manufacturing tech: 14nm LPP just isn't suited to high clock speeds. You can figure out the rest yourself.

At least with 7nm the difference will be much smaller, if both Nvidia and AMD only do die shrinks. We still don't know whether Nvidia's upcoming cards are anything more than that. Vega of course costs more to produce, as it's a much more advanced card. That's why miners want it. It's that simple.

AMD sells every Vega it can produce, so there's not much AMD could do better right now sales-wise.
Haha, FreeSync has very few limits, hence why you get monitors that only work from 30 or 45 fps upwards, etc. I should point out that once again, as a true fanboy, you are trying to deflect the blame for the lesser experience from AMD onto the monitor manufacturer, or onto me. People who just want the best experience don't care whose fault a poorer solution is. Nvidia, by contrast, require monitor manufacturers to implement no such minimums, ensuring a better user experience. Of course I don't overly value the tech; I'd rather have 45-fps-upwards FreeSync for free than pay a premium for G-Sync, but the Nvidia implementation is superior and more consistent for that money.

The fiasco was absolutely real; reviews stated a $450 price point for the Vega 64, yet it barely sold at that price at all. This wasn't something that was terribly well hidden, or are you going to try and claim that Vega 64s were available for that price for anything more than a few days at launch? Of course, it is also a lot easier to trust multiple tech reviewers on this than a proven AMD fanboy like yourself.

By "snub" I mean many game developers stated they would not, or simply did not, support Mantle. To suggest this did not happen and that all devs adopted it would be laughable. Many devs also "snub" Vulkan or DX12, but far more don't.

So which card would you buy at today’s prices? Vega 64 or a 1080 ti? Are you avoiding this question?
 
AMD seem to dominate in Vulkan and DirectX 12, except in titles stuffed with Nvidia crap (Hitman and ROTR, for example). That's exactly what I said: AMD is for modern games, Nvidia for old ones that nobody cares about.

It will clock higher, relative to Nvidia, than current Vega does.

It's possible that AMD won't bother to release a 7nm Vega, but that remains to be seen. But if what you say is true (Nvidia's current offerings crush AMD's cards), then why would Nvidia be so stupid as to plan new high-end cards? Yeah, why? If your current product is awesome against the competition, then you have no need to launch anything new. So if Nvidia launches new cards to replace the current ones, then Nvidia is not as far ahead as you seem to think.

Source for the claim that AMD is right now losing money on every Vega sold?

Seem to? Not according to those DX12 tests there; they are split. Every AMD fanboy said the same thing about Polaris, while the overwhelming evidence is that even 18 months after the RX 480 launched it still doesn't beat the GTX 1060 averaged across a massive array of modern games released since. And Vega 64 does not beat the GTX 1080, with the DX12 games being split and the gaps in them being small, whereas AMD loses hard in the comparisons on many other hugely popular titles, and many newer titles don't use DX12 at all.

DX12 itself has not been very successful. By the time any newer API is, it won't even matter; both cards will be as good as obsolete. It's an argument that didn't hold water 18 months ago and still doesn't now, especially in light of DX12's failure to take a firm hold. So saying "Nvidia for old [games] that nobody cares about" marks you out as more than a bit silly!

It's also obvious why Nvidia will launch new higher-end cards: because they can. In an investor meeting last year they stated they want to take better advantage of AMD's lack of competition with new higher-end cards than they did with Pascal. Domination is motivation enough for most corporations.

I think everyone with common sense realises Nvidia are well ahead of AMD right now and have been for several years; it's definitely not just my perception...

As for a source that AMD lose money at MSRP, they aren't going to confirm it. However, it's a widespread claim, and not without considerable merit once you start sniffing around for yourself and look at estimated manufacturing costs. It's no great shakes to 'sell out' of a product that you hardly manufacture any of, is it?

AMD lack the motivation to build many Vega cards, and the ability or motivation to flood the market with them even if they really did have full production. It isn't much of a leap to understand precisely why at this point, is it? Vega in its current guise is simply not a particularly profitable venture for AMD.

The bottom line is still this: the best-case scenario for 7nm Vega is that when (if) it arrives around the end of 2018, it competes OK on performance with Nvidia's 2018 mid-range lineup, which will probably have been out for 6 months already. At that point it won't sell very many, just like the RX 480/580 didn't, because the rule of thumb is that unless AMD are way better for the same money, the majority buy Nvidia.
 
As long as you keep thermals in check, the idea that you can work a card to death is a myth. Also, all of my cards are underclocked and undervolted: a 30% reduction in power with less than a 5% loss in hash rate.

Sounds like something a crypto miner would say... oh, one that may or may not have a bunch of used GPUs for sale in a few months.

I guess this article from a couple of weeks ago was true then... though this wasn't what I had in mind, I'll take it...

https://www.techspot.com/news/73225-nvidia-working-hard-increase-gpu-supply-remains-focused.html
 
I cook my 1080ti and 1070 at 69C all week. The secret to properly baking a GPU is to slow-cook it at a very low temperature over a long period of time so that the die just falls off the PCB when the used buyer gets it. If you do what most gamers do, and leave it running at 82 degrees all the time, the card comes out tough and unsavory. Learned this from a pit master.


That's too funny..
My dad used to say that when cooking a GPU, you put a rock in the pot with the GPU and slow-cook both; then, when the rock is soft, throw out the GPU and eat the rock. Or was that a loon he was cooking? Should work for the GPU though; both are pretty tough.
 
Of course they are, I just bought a 1080 Ti!
I know how you feel. Nvidia did this to me about 3 months after I purchased a 980 from Best Buy (the reference edition, or whatever it's called, so it has 1 fan). Now I'm sitting with a fairly weak card that runs hot no matter how much airflow I have.
 
TweakTown also said that when they talked to Nvidia, Nvidia told them to expect high graphics card prices for at least another year or more! TweakTown is just full of good news today (well, for Nvidia and AMD anyway; not so much for us).
 
Sounds like something a crypto miner would say... oh, one that may or may not have a bunch of used GPUs for sale in a few months.

I guess this article from a couple of weeks ago was true then... though this wasn't what I had in mind, I'll take it...

https://www.techspot.com/news/73225-nvidia-working-hard-increase-gpu-supply-remains-focused.html
You can hate me for having more money to buy graphics cards all you want. The fact of the matter is that I have both more money and more graphics cards than you.
 
You can hate me for having more money to buy graphics cards all you want. The fact of the matter is that I have both more money and more graphics cards than you.

I don't hate you, though I don't really like anybody, but you're an easy target... :p I have lots of graphics cards, but they go back generations, both Nvidia and ATI/AMD, maybe 30+. Had I gotten into mining back when it was just starting, I wouldn't need to work... oh, I don't need to work... unless you are really a Nigerian prince. I don't see a need to mine; it wouldn't be very profitable here, with poor bandwidth and expensive electricity.
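On the expensive electricity point, the sums are easy enough; purely made-up figures below (the wattage, tariff and daily revenue are all assumptions, since the real numbers swing around constantly):

```python
# Purely illustrative; plug in your own wattage, electricity tariff and estimated daily revenue.
card_power_w = 220.0          # assumed wall draw of one tuned card (W)
price_per_kwh = 0.25          # assumed electricity price in $/kWh (an "expensive electricity" region)
est_revenue_per_day = 2.00    # assumed gross mining revenue per card per day, in $

energy_kwh_per_day = card_power_w * 24 / 1000
cost_per_day = energy_kwh_per_day * price_per_kwh
profit_per_day = est_revenue_per_day - cost_per_day

print(f"electricity: {energy_kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}")
print(f"net per card per day: ${profit_per_day:.2f}")
# At ~$0.25/kWh the margin gets thin fast; at ~$0.10/kWh the same card looks very different.
```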

If you are a stonemason, you can't be all bad. I'm also a mason: brick, block, glass block, architectural block, ledgerock, flagstone, fieldstone. You would like it here; granite everywhere.

Been a boilermaker now for 20 years; I do some refractory work and fireplaces as well. :)
 