Nvidia stock surges after company posts record sales

There are several games now that you can't max at 1440p with a 290X and get 60FPS.

Yep, but for the most part you can get away with disabling options that have little to no visual impact and still get a great 1440p experience: motion blur, lens flare, and anything from Nvidia (of course, that last one goes for nearly everyone).
 
1440p is great on the 390X / 290X, so I don't know what you're talking about. What proof do you have that the 390X / 290X isn't a good 1440p card?
They can barely push 60 fps at 1080p with some new games. What games are you two playing?

https://www.techspot.com/review/1006-the-witcher-3-benchmarks/page3.html

https://www.techspot.com/review/1060-metal-gear-solid-phantom-pain-benchmarks/page2.html

Can't even hit 60 fps in Arkham Knight... at 1440p.
https://www.techspot.com/review/1022-batman-arkham-knight-benchmarks/page3.html

Fallout 4 at 1440p barely hits 60 fps on the 390X. If you bought the 390X for 1440p, I guess that's fine if you only plan on maxing some new games at 1440p for a year or so...
https://www.techspot.com/review/1089-fallout-4-benchmarks/page3.html

Dying Light
https://www.techspot.com/review/956-dying-light-benchmarks/page4.html

You guys playing Pokémon Island Adventures? League of Legends?
 

I'd just like to point out the irrationality of some of those benchmarks you posted. Anti-aliasing at 1440p? Nvidia features enabled? Benchmarks are meant to test a graphics card's limits, not to be a real representation of the actual FPS you will be getting. 1440p doesn't require AA, and disabling that alone will gain you a good 20 fps. Fallout 4 has Nvidia God Rays, which tank performance; once again, they can be disabled without any visual degradation for excellent FPS.

Now, are you actually reading these benchmarks and thinking to yourself "hey, these are settings I would use in real life if I had that video card," or are you just desperately clawing for any shred of evidence? Why would any sane AMD user turn on features that give them no benefit while greatly reducing FPS?
 
Why would they be using AMD to start with?
 
I'd just like to point out the irrationality of some of those benchmarks you posted. Anti-aliasing at 1440p? Nvidia features enabled? Benchmarks are meant to test a graphics card's limits, not to be a real representation of the actual FPS you will be getting. 1440p doesn't require AA, and disabling that alone will gain you a good 20 fps. Fallout 4 has Nvidia God Rays, which tank performance; once again, they can be disabled without any visual degradation for excellent FPS.

Now, are you actually reading these benchmarks and thinking to yourself "hey, these are settings I would use in real life if I had that video card," or are you just desperately clawing for any shred of evidence? Why would any sane AMD user turn on features that give them no benefit while greatly reducing FPS?
You ignore benchmarks and reality. I can't argue with your "beliefs" about how cards should be graded at 2K and 4K.

Here is another fact about 4K. Put up real numbers or don't talk.

Since TechSpot already has the data, I went ahead and calculated the summary.
Here are the 4K framerates for the 390: 21, 36, 21, 35, 19, 27, 30, 33, 28, 45, 33, 29, 43, 15, 26, 26, 21
Here are the 4K framerates for the 970: 19, 35, 23, 39, 16, 24, 27, 29, 29, 37, 30, 24, 36, 16, 24, 24, 18
In the 17 games tested, the 390 averaged 28.70 FPS, while the 970 averaged 26.47 FPS. That makes the 390 8.42% faster on average at 4K. Note also that it's a review from last July, so it's using relatively old drivers. TechPowerUp's most recent summary says the 390 is on average 12.89% faster at 4K (normalizing the 390's "97%" score to 100% and scaling the 970's 86% score in the same proportion). So I wouldn't exactly say they are evenly matched.

Source
https://www.techspot.com/community/...s-for-every-budget.223244/page-2#post-1520898
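As a sanity check, the averages above can be recomputed directly from the per-game framerates quoted in the post; a quick sketch (the hundredths place can differ slightly depending on where you round, which is why 28.70 vs. 28.71 comes out either way):

```python
# Per-game 4K framerates as quoted from the TechSpot data above.
r9_390 = [21, 36, 21, 35, 19, 27, 30, 33, 28, 45, 33, 29, 43, 15, 26, 26, 21]
gtx_970 = [19, 35, 23, 39, 16, 24, 27, 29, 29, 37, 30, 24, 36, 16, 24, 24, 18]

# Mean FPS across the 17 games for each card.
avg_390 = sum(r9_390) / len(r9_390)    # ~28.71 FPS
avg_970 = sum(gtx_970) / len(gtx_970)  # ~26.47 FPS

# Average advantage of the 390 over the 970 at 4K, in percent.
advantage = (avg_390 / avg_970 - 1) * 100  # ~8.4%

print(f"390: {avg_390:.2f} FPS, 970: {avg_970:.2f} FPS, 390 is +{advantage:.1f}% faster")
```

Computing the percentage from the unrounded averages gives ~8.44%; the 8.42% in the post comes from dividing the already-rounded 28.70 by 26.47.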
 

There are a couple of problems with your post that make me question whether you were following the conversation at all. We were talking about the 290X / 390X, not the 390. Nice try at cherry-picking results, though. You also neglected to mention that the benchmarks you gathered your data from used different models of GTX 970s and R9 390s, which adds variability not attributable to the stock cards.

Here, how about this

http://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-6.html

One link, so it's impossible to cherry-pick like you tried three or four times, and it's from a top tech website. The R9 390X beats the GTX 970 at every resolution but 1080p (which is really irrelevant, because they both get over 60 fps there).
 

The site you linked shows five games; four of the five don't hit 60 fps at 1440p.
 

You really cannot follow a conversation, eh? This is the last time I'm going to repeat myself; if I have to do it again, I'm just going to ignore you.

BENCHMARK. This means pushing the settings to the max regardless of the impact of each setting. In real life, smart users know to disable a couple of settings that make no difference to the graphics and give them more FPS.

You're obviously grasping at straws, as anyone who's ever PC gamed would know this. Tell me, does anti-aliasing make sense at 4K? Is 2x MSAA worth half your frames? Do you know how hard it even is to see jaggies at 4K?
 
No, but he is talking about 1440p... NOT 4K.
At 1440p, anti-aliasing is actually quite nice to have on...
 
Why are you so excited about a top-of-the-line card beating a second-tier card anyway? It does pitifully against the top of the competition, where it's supposed to compete.
 
No, but he is talking about 1440p... NOT 4K.
At 1440p, anti-aliasing is actually quite nice to have on...

Subjective at best, and you're marginalizing the conversation. It wasn't only about anti-aliasing, so stop trying to make it seem that way.
 
Um... AMD already did its quarterly report. It was bad, as expected. Everyone and their mother was recommending you buy a GTX 970. The problem is, because of that, you can now get an R9 390X at the same price: twice the VRAM and much more raw power. The GTX 970 will only ever be good at 1080p gaming, and only if you plan on overclocking. People who upgraded to the 970 will need another upgrade if they plan on going above 1080p.

The GTX 970 performs perfectly fine at 2K.
 
You really cannot follow a conversation, eh? This is the last time I'm going to repeat myself; if I have to do it again, I'm just going to ignore you.

BENCHMARK. This means pushing the settings to the max regardless of the impact of each setting. In real life, smart users know to disable a couple of settings that make no difference to the graphics and give them more FPS.

You're obviously grasping at straws, as anyone who's ever PC gamed would know this. Tell me, does anti-aliasing make sense at 4K? Is 2x MSAA worth half your frames? Do you know how hard it even is to see jaggies at 4K?
I'm sorry that your own link disagreed with your point and that no one made a benchmark based on your personal settings.
 

And I'm sorry that you have resorted to fallacies to try to take a shot at me after you failed to prove your point. /Blocked
 
Everybody buy AMD/ATI!!

We need to keep Nvidia in check. Let's not let it become a monopoly.
Besides, AMD has the best bang for the buck with its 390X and R9 Fury Nano.
 
Considering that the overall leader in graphics is Intel, with about 75% of all computers running it, I wouldn't worry about Nvidia becoming a monopoly. Besides, there are already laws in place to prevent that. Ever hear of the Sherman Antitrust Act?

AMD certainly has the cheapest products. The best? I don't think so.
 