Rockstarrrr
No, no, not Crysis 3... the very first one, from 2007.
Sounds like you should create your own site and do your own review. Then we can all stop by and tell you how to do your job! Deal?
@Steven Walton
Steve.
I did not notice any comments to the contrary, so I can only assume that you tested the 1080 Ti with Asynchronous Compute DISABLED.
The GTX 1080 actually slowed down when running DX12 benchmarks, as Nvidia does not support multi-core gaming with Async Shader Pipelines or Asynchronous Compute Engines, both of which are protected by AMD patents and are a major performance multiplier when playing DX12 games that use 3D engines.
I am also going to assume that you did not DISABLE that feature on the AMD dGPU cards tested?
Please confirm that you did NOT DISABLE Async Compute!! Your silence on this will be taken as the answer that you did indeed DISABLE Async Compute.
Another great DX12 feature you did not seem to mention is benchmark scores using two dGPU cards: not in SLI or Crossfire, but via Explicit Multi-Adapter (EMA), which is NATIVE to DX12. Of course DX11 supports SLI and Crossfire, but not as well, and the game needs to be heavily coded for multi-GPU; DX11 also only allows serial execution of multi-core gaming, and that's if you can get more than 2 cores doing any work at all. DX12, again, allows multi-core gaming natively.
EMA, of course, allows DX12 to use all GPU resources available to it and offers massive scalability.
It would appear you did not compare multi-GPU scalability either.
You also did not run the Star Swarm benchmark. Why? Nor did you run the 3DMark API Overhead feature test. Again, WHY?
Not really a comprehensive review now is it?
Wow. Gee...just $500? That's still more expensive than most modern full computers.
I was thinking the same thing. The 980 Ti was the hottest card going for a long time and still competes well against the 10-series cards.
The 4K performance is impressive.
Now you can get a single GPU that's 'somewhat affordable' for 4K gaming, or 2K @ 100+ Hz.
1440p widescreen curved monitors, typically 34" diagonal, demand roughly half the GPU power of 4K. This is about as close to 2K gaming as you can get in today's market, I think. You can drive one of those monitors at ultra graphical quality quite well with a GTX 1080.
Problem: there aren't any G-sync compatible 1440p widescreen curved monitors on the market, yet. They're all Freesync compatible. AMD doesn't offer any GPUs yet that can drive them well enough, but that should change when Vega gets here.
Rumor has it that Samsung's quantum dot monitors, including their 34" widescreen 1440p monitor, will be available for G-sync later this year. But that's only a rumor. I don't think Samsung has made any official announcements for it yet.
For gamers wanting to get into 1440p widescreen today, they're kinda screwed. You can drive one with an nVidia Pascal card, but you won't get any display sync. You can get display sync with an AMD GPU, but the frame rates will be terrible. Or you can wait, either for Vega to get to the market, or for monitor vendors to offer G-sync in 1440p widescreen.
Be sure to pair G-sync monitors with an nVidia card, or Freesync monitors with an AMD card. If you neglect the proper pairing, you'll lose out on display sync, and that's a really nice technology for gamers to have.
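Putting numbers on the "about half the GPU power" estimate above: pixel count is only a rough proxy for GPU load, but assuming the common 3440x1440 ultrawide and 3840x2160 UHD resolutions, the ultrawide pushes about 60% as many pixels as 4K:

```python
# Pixel counts for the two resolutions discussed above
ultrawide = 3440 * 1440   # 34" 1440p ultrawide
uhd_4k = 3840 * 2160      # 4K UHD

ratio = ultrawide / uhd_4k
print(f"{ultrawide:,} vs {uhd_4k:,} pixels -> {ratio:.0%} of the 4K load")
```

So "about half" is in the right ballpark, leaning a bit higher.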
"Wow. Gee...just $500? That's still more expensive than most modern full computers."
And for the cost of a nice steak dinner you can buy 50 Taco Bell tacos. But they're not even in the same league, so the comparison is meaningless.
"Compared to the GeForce GTX 1080 Founders Edition board, the GTX 1080 Ti's power subsystem has been substantially enhanced. If you recall, all GeForce GTX 10 Series GPUs were equipped with a dual-FET power supply on both the GPU and memory, which provided cleaner power to these components compared to prior GPUs. This improved power efficiency, reliability, and overclocking. For the GTX 1080 Ti Founders Edition we've incorporated a 7-phase 2x dual-FET power design that's capable of supplying up to 250 amps of power to the GPU."
That's not how things work, mate: you don't get "clean power" just because you have a "dual-FET" configuration. "Clean power" usually means either reduced reactive power in an AC system (not the case here, since this is a DC supply if I'm not mistaken) or power supplied by a renewable source like solar panels. Efficiency has nothing to do with clean power. Also, you cannot have "250 amps of power": amperes are units of current, and 250 A is a pretty big value, so the power bus would look interestingly large. Maybe you meant 250 W (watts)?
These words come from Nvidia engineers; they know more than I do, so I took them at their word.
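For what it's worth, the amps-versus-watts dispute comes down to P = V x I arithmetic. A GPU core rail runs at roughly 1 V (the exact voltage is an assumption here; around 1.0 V is typical for Pascal-era GPUs), so a VRM rated for 250 A of current delivers on the order of 250 W of power:

```python
# Power = voltage x current; GPU core rails sit near 1 V,
# so the amp and watt figures come out numerically close.
core_voltage = 1.0   # volts, assumed typical Pascal-era core voltage
vrm_current = 250.0  # amps, the figure Nvidia quotes

power_watts = core_voltage * vrm_current
print(f"{vrm_current:.0f} A at {core_voltage:.1f} V -> {power_watts:.0f} W")
```

In other words, "250 amps" and "~250 W board power" can both be right at the same time, because the current is delivered at a much lower voltage than the 12 V input.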
"Why no reviews comparing to the 980 Ti? I need to know if I should upgrade or wait a gen."
The answer is simple: nobody cares about the GTX 980 Ti anymore; it's at least a century old in tech terms. As for your decision to upgrade, only you can make that... possibly with a little help from your bank manager.
"Why no reviews comparing to the 980 Ti? I need to know if I should upgrade or wait a gen."
The GTX 980 Ti performs roughly on par with the 1070, so you can compare from there. Good luck!
"And what would I use this card for? Oh! Must be gaming, 'cause there are all the gaming benchmarks."
Nvidia's GTX line comprises GPUs for budget, mainstream, and enthusiast PC gamers; hence the gaming benchmarks.
"For the same money, I could have a sparkly new iPhone or Samsung Galaxy Android phone. Or I could buy groceries for my family for almost 2 months. Or gasoline to drive my car for a couple of months. Or... Or... Or... Frivolous! Or maybe I never learned to appreciate the value of gaming."
A Hyundai Sonata starts at $21,950, a BMW 5-series starts at $51,200, and a Porsche Panamera starts at $85,400. All three vehicles have four doors, wheels, an engine of some sort, and all provide transportation. They differ in their performance.
"How about rendering of solid objects to be fabricated, another task that likes the best graphics and the fastest CPU? Or other relevant non-trivial non-gamer benchmarks?"
If you're interested in those benchmarks, you should probably be looking at the Quadro and FirePro lines. If you're a "hobbyist" trying to save money with the GTX line, there are likely forums with users like yourself where you can get this information once the cards have been in the wild for a while.