Nvidia's GTX 2070 and 2080 cards could be unveiled next month, launch late April

I don't hate you, though I don't really like anybody, but you're an easy target. :p I have lots of graphics cards, but they go back generations, both Nvidia and ATI/AMD, maybe 30+. Had I gotten into mining back when it was just starting, maybe. But unless you really are a Nigerian prince, I don't see a need to mine; it wouldn't be very profitable here, with poor bandwidth and expensive electricity.
I'm stepping out of character here for a moment: I make far more in passive income than you do at whatever job you do. Hate me for being a miner if you want; you're still poor.
 
Now who's hating? I edited my last post a bit, but you got a post in so quickly that you missed it. I don't consider myself poor for where I live; I take home more pension than many people here do working. I still do a little masonry work: fireplace hearths, small jobs.

I've taken home over 20 grand in a month. Let's see crypto rake in coin like that. Pffft. Get back into your character.

I obviously care about the environment more than you do; that makes me wealthier than you, and wiser. There's more to life than money and greed. I guess you wouldn't know much about that. You're only 30; my kid is older than you, and wiser.
 
Reposting an unfounded rumor first reported by Tweaktown a week ago. Not only are you guys a week behind, you added literally nothing that wasn't in the tweaktown report, which again had absolutely nothing of substance in it beyond an unverified rumor. What a waste of bandwidth. Is this really what passes for journalism in 2018?
 

You start up a new account to post and further waste bandwidth; to each their own. They own the site and can post as they see fit, and not everyone read it at Tweaktown a week ago.

Going by the date on the article, it seems it was posted only 3 days and 19 hours ago. It was posted here yesterday just after 11, so it went up here only a bit over 2 days later. How many days are in a week where you are?

By: Anthony Garreffa | Video Cards News | Posted: 3 days, 19 hours ago Read more: https://www.tweaktown.com/news/60992/nvidia-launch-new-geforce-gtx-series-during-gtc-2018/index.html
 
One thing I've learned over the years is it's a very bad idea to buy the first wave of new cards that come out every generation. Wait a month (or better yet at least two) until manufacturing faults are ironed out. You don't wanna be messing around with RMA processes when the stock isn't even there and your money is stuck in limbo. Better to wait it out and get a more refined product.
 
Are you genuinely unaware of the launch fiasco of Vega? AMD told reviewers and vendors that the cards were $550 at launch with a $100 rebate from AMD to the vendors, making them $450. This rebate was terminated by AMD, I believe on day one of Vega's release, conveniently after reviews went out. Watch OC3D's Vega 64 review to find out more. I do find it amusing that you have already tried to make the point that if it's not AMD's fault specifically then it's OK. No it's not: if the price hike wasn't AMD's fault, consumers are still getting a bad deal. Only fans care whose fault it is. Although in this case it was absolutely AMD's fault; they specifically intended to mislead reviewers. Tiny Tom Logan from OC3D is not the only reviewer to express frustration about it.

AMD said that Vega would have a reduced price at launch. All the cards sold immediately, and the rebate was no longer valid once they were sold out. Problem?

It's nothing new that some reviewers tend to be stupid (or act stupid) when trying to make headlines.

Oh, and G-Sync is better. I have a FreeSync monitor, and FreeSync only kicks in from 45 fps or more; it has an upper limit too, but I never found it, since AMD cards don't go fast enough at 4K, it seems! G-Sync, by contrast, is on from frame zero. This definitely makes G-Sync the superior tech, especially as its benefits are often felt at lower frame rates.

FreeSync supports a refresh rate range of 9-240 Hz, so there is nothing wrong with the FreeSync technology itself. You can also buy monitors whose FreeSync range goes down to 30 Hz. So it's you who made a stupid buying decision and are blaming AMD for your own stupidity.

Finally, a fair number of game developers snubbed Mantle before Nvidia did. Why spend money developing for an API that only a tiny percentage of the market has access to? Google it; it's probably one of the reasons Nvidia felt they could ignore it at the time. I had AMD cards back then, and I can't remember more than one or two titles employing it. And it didn't mean much, as my CPU was up to the task.

So, HardReset, which card would you pick at today's pricing? Vega 64 or a 1080 Ti?

Snubbed = didn't use something immediately? Mantle was replaced quite quickly by DirectX 12 and Vulkan, so it's no wonder many developers didn't have time to adopt it.

Today's pricing? It's quite hard to find any place that has Vega in stock. So what is Vega's current price?

Seem to? Not according to those DX12 tests there; they are split. Every AMD fanboy said the same thing about Polaris, while the overwhelming evidence is that even 18 months after the RX 480 it still doesn't beat the GTX 1060 averaged over a massive array of modern games. Likewise, Vega 64 does not beat the GTX 1080: the DX12 games are split and the gaps in them are small, whereas AMD loses hard in comparisons on many other hugely popular titles, and many newer titles don't use DX12 at all.

DirectX 12 tests are NOT split. AMD wins on every game that is neither AMD- nor Nvidia-optimized. The only games where Nvidia wins are either DirectX 11 games with DX12 support slapped on or games deliberately optimized for Nvidia. So AMD is way better at DX12, and just looking at the hardware, that's not surprising.

Modern games are those that use DX12, Vulkan or Mantle; other titles are not modern. Well, you can call "modern" a game that uses a graphics API derived from 2003's DirectX 9.0c (DirectX 11 titles), or a game whose physics runs at 60 Hz at most (FO4), but...

DX12 itself has not been very successful. By the time any newer API is, it won't even matter; both cards will be as good as obsolete. It's an argument that didn't hold water 18 months ago and still doesn't now, especially in light of DX12's failure to take a firm hold. So saying "Nvidia for old [games] that nobody cares [about]" marks you out as more than a bit silly!

It's also obvious why Nvidia will launch new higher-end cards: because they can. In an investor meeting last year they stated they want to take better advantage of AMD's lack of competition with new higher-end cards than they did with Pascal. Domination is motivation enough for most corporations.

I think everyone with common sense realises Nvidia is well ahead of AMD right now and has been for several years; it's definitely not my perception alone...

DX12 has not been very successful yet, but it's still much more advanced and modern than any DirectX before it. Nvidia runs old games better, but usually those games are played on years-old low-end hardware, so yes, nobody cares when talking about modern cards. As for this "obsolete" thing: I don't consider the Radeon HD 7950 obsolete, as it's way better than many current low-end cards. And that is a 6-year-old card that sold for under $200 4 years ago!

Why is that obvious? Neither AMD, Intel nor Nvidia has done that before.

Everyone with common sense realizes that AMD is technologically more advanced and years ahead of Nvidia. I said this before, and what happened? Another crypto boom caused huge demand for the technologically more advanced AMD cards. Nvidia is ahead on DirectX 11 or older titles; everywhere else AMD is dominating. No wonder Intel decided to put AMD's Vega graphics chip alongside its own CPU cores. Is Intel really so stupid that it uses "crap" Vega with its own CPU? Or is this "Vega is crap" just Nvidia fanboy dreaming? Not hard to tell.

As for the source claiming AMD loses money at MSRP, they aren't going to verify it. However, it's a widespread claim and not without considerable merit once you start sniffing around for yourself and look at estimated manufacturing costs. It's no great shakes to 'sell out' a product that you hardly manufacture any of, is it?

AMD lack the motivation to build many Vega cards, and the ability/motivation to flood the market with them even if they really had full production. It isn't much of a leap to understand precisely why at this point, is it? Vega in its current guise is simply not a particularly profitable venture for AMD.

Widespread :D Fudzilla makes a claim with no proof, other sites copy Fudzilla's claim, and so it's "widespread" *nerd*

Because it's also hard to find Nvidia high-end graphics cards, does that mean Nvidia is also losing money on every high-end GTX they sell?

Bottom line is still this: the best-case scenario for 7 nm Vega is that when (if) it arrives around the end of 2018, it competes OK on performance with Nvidia's 2018 mid-range lineup, which will probably have been out for 6 months already. At that point it won't sell very many, just like the RX 480/580 didn't, because the rule of thumb is that unless AMD is way better for the same money, the majority buy Nvidia.

You are saying the GTX 1080 Ti (or a card with equivalent speed) will be Nvidia's mid-range card next summer?
 
If they can count, they should go with 1180... Or, if they feel it's a massive jump in the process, reset the naming convention. Why is it x80? Why not just make it 118 instead of 1180? The zero seems like a waste and strictly for marketing. For marketing, call it the 11n8-generation Nvidia tier; then people will call it the n8 for short. Or the n7. Or pick a letter, idc.
 
HardReset is still proving himself to be desperately in love with AMD, as usual.

Does he know he’s humiliating himself?
 
FreeSync supports a refresh rate range of 9-240 Hz, so there is nothing wrong with the FreeSync technology itself. You can also buy monitors whose FreeSync range goes down to 30 Hz. So it's you who made a stupid buying decision and are blaming AMD for your own stupidity.
aaaawwwww! Bless ya! Really struggling to grasp this one, aren't you? :D

Don't worry, I'll break it down as much as I can:

FreeSync: the working range is left up to the monitor manufacturers to decide, so there are lots of rubbish monitors on the market as a result.
G-Sync: Nvidia enforces the working range, therefore all monitors have a decent working range. No rubbish implementations. (See the toy sketch below.)
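
To make the working-range point concrete, here is a tiny hypothetical C++ helper. It is not AMD or Nvidia code, and the 45-75 Hz panel is just an assumed example; it only sketches when a variable-refresh panel can stay engaged. Below the panel's minimum refresh, adaptive sync only keeps working if Low Framerate Compensation can repeat frames, which roughly needs the maximum refresh to be at least about twice the minimum.

```cpp
#include <iostream>

// Rough illustration only: returns true if adaptive sync can stay engaged at
// a given frame rate for a panel with the given variable-refresh window.
bool AdaptiveSyncActive(double fps, double min_hz, double max_hz) {
    if (fps >= min_hz && fps <= max_hz)
        return true;                            // inside the panel's native range
    // Low Framerate Compensation: repeat each frame so the panel still runs
    // inside its range; only possible when the range is wide enough (~2x).
    if (fps < min_hz && max_hz >= 2.0 * min_hz)
        return true;
    return false;                               // falls back to tearing or stutter
}

int main() {
    // Narrow 45-75 Hz FreeSync panel (assumed), like the one complained about above
    std::cout << AdaptiveSyncActive(40.0, 45.0, 75.0) << '\n';  // 0: too narrow for LFC
    // Wider 30-144 Hz panel
    std::cout << AdaptiveSyncActive(40.0, 30.0, 144.0) << '\n'; // 1: in range
    std::cout << AdaptiveSyncActive(20.0, 30.0, 144.0) << '\n'; // 1: LFC doubles frames
}
```

Same silicon, very different experience, purely down to the range the monitor maker chose.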

I'll pray to the AMD gods that you at least understand where @Shadowboxer is coming from, but I won't hold my breath.

Love You @HardReset

*** Damn it, no love-heart or kiss emojis. To the MODs that like deleting my posts: would you at least consider adding some love emojis? :)
 
Everyone, please pray that Nvidia and AMD both start producing mining cards! We gamers have been completely screwed by miners!
 
DX12 itself has not been very successful. By the time any newer API is, it won't even matter; both cards will be as good as obsolete. It's an argument that didn't hold water 18 months ago and still doesn't now, especially in light of DX12's failure to take a firm hold. So saying "Nvidia for old [games] that nobody cares [about]" marks you out as more than a bit silly!

DirectX 12 does work better and has been successful. It's not DirectX 12 that's the problem, but the programmers. Intel has gotten these programmers so damn lazy that they don't want to write anything past 4 cores and DX11. DX12 utilizes GPUs more efficiently, but to do that you have to write more code to use its power. Also, fully using DX12 will end up with games exceeding 15 GB in size, and companies have to pay for that; it's all about money. Look at the new game PUBG.
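
To give a feel for the "you have to write more" part, here is a minimal, hypothetical C++/D3D12 sketch. It assumes a device, a queue and an already-recorded command list exist, and the helper name SubmitAndWait is made up for illustration; it shows the explicit CPU/GPU synchronization a DX12 programmer must write themselves, the kind of bookkeeping the DX11 runtime handled behind the scenes.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Submit a closed command list and block the CPU until the GPU has finished it.
// In D3D11 the runtime tracked this implicitly; in D3D12 you do it yourself.
void SubmitAndWait(ID3D12Device* device,
                   ID3D12CommandQueue* queue,
                   ID3D12GraphicsCommandList* cmdList)
{
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists);

    // Create a fence the GPU will signal when it reaches this point in the queue.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);          // GPU sets the fence to 1 when finished
    fence->SetEventOnCompletion(1, done);   // fire the Windows event at that value
    WaitForSingleObject(done, INFINITE);    // CPU waits here
    CloseHandle(done);
}
```

Multiply that by resource barriers, descriptor heaps and memory residency and it's easy to see why studios on a tight budget stick with DX11.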
 
GTX 1180, please; 2080 is wrong.

While 2080 may be wrong, I'm almost certain Nvidia will choose that name, because from a marketing perspective it's a total hit. And Nvidia was always about marketing (in addition to cheating and rigging benchmark tests). 30% of the speed of Nvidia cards comes from working with game makers to optimize games to run worse on AMD.

When AMD provides faster RAM, they rig games to use more shaders and less RAM. When AMD introduces more shaders, they rig games to use more ROPs. Etc. They win with dirty tricks as much as with hardware.

At the same time, ATI (now AMD) was always known for having the crappiest possible drivers. Their software department should have been fired a long time ago. Even when they somehow make the drivers work, the AMD Control Panel will fill your log with errors, not to mention how slow and unstable it is.

Nvidia makes good hardware but also cheats a lot, while AMD can't make decent software.
 
That would be Nvidia, not AMD, and personally I can't wait for the launch. I want to see how badly they make AMD cry.
They just pushed back the launch and will push it back once more, so they should be released by September or October, as their current line is selling well!

https://www.techspot.com/news/73542-nvidia-unlikely-unveil-new-gaming-hardware-month.html

I don't think AMD is crying, as they are selling all they can produce and gaining market share!

https://www.techspot.com/news/73482-cryptocurrency-miners-help-amd-close-gap-nvidia-gpu.html
 
You can hate me for having more money to buy graphics cards all you want. The fact of the matter is that I have both more money and more graphics cards than you.
My Prince, I'm just going to continue hating you for the same reason I always have: "Because you're beautiful." By your leave, of course, sire.

(For those of you who are too young to get the joke, live in a vacuum, or have never owned a television):

 