Upcoming GeForce GTX 1070 is faster than Titan X for $379

They are enthusiasts, and they do their research more than anyone.
TL;DR they knew Pascal was coming and are always looking for an excuse to upgrade.
I did my research when I decided to purchase a 780.

The Ti came out a few months later, with no warning.

Much more powerful than the 780, released for the same price. I was burned so hard. It was the first ever x80 class GPU I purchased; my first real enthusiast piece of hardware. And it took months of saving.

I'll stay salty over it my entire life.
 
"Founders Edition" sounds to me like it will have specially binned chips. But $379 for a 1070? That is even more expensive the GTX 970. With the continued price gouging, I hope the 900 series drop in price. I will scoop up a 970 if we see fire sale pricing.
 
Holy Cow, $350 for faster than a titan x?!!

Must feel pretty crappy for those guys that bought titan X cards just a few months ago to hear they can get more for 1/3 the price now.
Not really. No one buys a Titan on a whim. They are enthusiasts, and they do their research more than anyone.
TL;DR they knew Pascal was coming and are always looking for an excuse to upgrade.

I highly doubt it will be $379. I have seen this sort of thing before. Remember the Surface was supposed to come out at $199, but instead it was sold for $449. I hope this $379 is true, but I am not holding my breath. However, this should shake things up for the red team.

The X70 GPUs are always on par with or superior to the previous generation's flagship GPU, and are always priced between $300 and $400. There's nothing unexpected here except the fact you'll probably need to upgrade your CPU to avoid bottlenecking the 1070. The 6700K is the fastest CPU for gaming and it already holds back the Titan X. The 1080 will be horribly bottlenecked by today's CPUs and the 1070 will be in a similar position as the Titan X. If you're not running a Core i7 4790K or 6700K at 4.4GHz or higher, neither GPU will benefit you.
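To put the bottleneck argument in plain numbers, here's a rough sketch with made-up figures (the fps caps below are purely hypothetical, not measurements of any real CPU/GPU pairing):

```python
# Hypothetical numbers only, to illustrate the CPU-bottleneck argument.
cpu_fps_cap = 90    # frames/s the CPU can prepare (game logic + draw calls)
gpu_fps_cap = 140   # frames/s the GPU could render if it were never kept waiting

actual_fps = min(cpu_fps_cap, gpu_fps_cap)   # the slower side sets the pace
gpu_busy = actual_fps / gpu_fps_cap          # fraction of the time the GPU has work
print(f"~{actual_fps} fps, GPU busy ~{gpu_busy:.0%}")
# -> ~90 fps, GPU busy ~64%: a faster GPU changes nothing until the CPU catches up
```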
 

Digital Foundry did a comprehensive look into CPU performance and it matches my own experience. My 4690K was pretty much maxed out paired with a GTX 970. The 1070 is basically twice as fast. No way my CPU can feed enough data to a Titan X, 1070, or 1080. A faster CPU will be necessary unless you're cool with 100% CPU usage and 60% GPU usage out of your system.
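If you want to sanity-check this on your own rig, a quick-and-dirty Python sketch like the one below will show which side is pegged while a game runs (nothing official, just my own hack; it assumes psutil is installed and nvidia-smi is on your PATH):

```python
# Rough bottleneck check: sample CPU and GPU utilization once a second.
# Assumes `pip install psutil` and an NVIDIA driver that provides nvidia-smi.
import subprocess
import psutil

for _ in range(10):
    cpu = psutil.cpu_percent(interval=1)  # aggregate CPU %, averaged over 1 s
    gpu = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    ).decode().strip().splitlines()[0]    # first GPU's utilization in %
    print(f"CPU {cpu:5.1f}% | GPU {gpu}%")
# CPU stuck near 100% while the GPU sits around 60% = the CPU is the limit.
```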
 
Holy Cow, $350 for faster than a titan x?!!

Must feel pretty crappy for those guys that bought titan X cards just a few months ago to hear they can get more for 1/3 the price now.

Not really. No one buys a Titan on a whim. They are enthusiasts, and they do their research more than anyone.
TL;DR they knew Pascal was coming and are always looking for an excuse to upgrade.

I don't think many enthusiasts bought a Titan X. It was more of a fanboy card. The 980 Ti released a few weeks later in the $600 price range, something every enthusiast knew was coming. You'd have to blatantly ignore common sense to have purchased a Titan X.
 
Digital Foundry did a comprehensive look into CPU performance and it matches my own experience. My 4690K was pretty much maxed out paired with a GTX 970. The 1070 is basically twice as fast. No way my CPU can feed enough data to a Titan X, 1070, or 1080. A faster CPU will be necessary unless you're cool with 100% CPU usage and 60% GPU usage out of your system.

That's the complete opposite of what I've experienced and heard from others about CPU performance.
I'll check out the article...

Digital Foundry says GTA V is the most CPU intensive game in their test suite. And TechSpot's testing also shows you are CPU bound at 2560x1440:

https://www.techspot.com/review/991-gta-5-pc-benchmarks/page6.html

"As you can see, GTA V is quite CPU bound and this is evident by the performance gains seen when moving from the Core i3 to the Core i5 (around 20%). The Core i5-4690K was just 3fps slower than the Core i7-5960X, so it is safe to say investing in a Core i7 processor for GTA V isn't money well spent."


The only time CPU usage is mentioned is in The Witcher III, and he says ALL CPUs tested reached 100%.
 
I just hope that Nvidia doesn't forget about the previous generation when it comes to driver updates like they did in the past. One of the things I like about AMD is that even cards several generations old benefit a lot from driver updates.
 
Well, the 980 had 4GB of RAM compared to the 980 Ti's 6GB, so it would be safe to assume that the 1080 Ti will have 12GB compared to the 1080's 8GB. I think I'll be waiting for the 1080 Ti. I've waited this long with my 680, I can wait longer. I'm the type that likes to crank up view distance, so the extra memory is a big one for me.
 
That's the complete opposite of what I've experienced and heard from others about CPU performance.
I'll check out the article...

Digital Foundry says GTA V is the most CPU intensive game in their test suite. And TechSpot's testing also shows you are CPU bound at 2560x1440:

https://www.techspot.com/review/991-gta-5-pc-benchmarks/page6.html

"As you can see, GTA V is quite CPU bound and this is evident by the performance gains seen when moving from the Core i3 to the Core i5 (around 20%). The Core i5-4690K was just 3fps slower than the Core i7-5960X, so it is safe to say investing in a Core i7 processor for GTA V isn't money well spent."


The only time CPU usage is mentioned is in The Witcher III, and he says ALL CPUs tested reached 100%.

Like you said, you're even CPU bound at 1440p (which is far less likely), but at 1080p, which most people use, you're even more CPU limited. The biggest games where CPU limitation is evident are GTA V (as you said), Crysis 3 (for framerate dips), Witcher 3, and Arkham Knight (due to terrible optimization).

Check this out: even comparing older i7s at the same clock speed, you see that they're all CPU limited. If they weren't CPU limited, you wouldn't see a performance increase.
 
"It's not completely clear what extra benefits the Founder's Edition card provides..."
There was a slide shown at the event that said "up to 2GHz overclocks," so it has to be the Founders cards getting that high more easily. Also, the only thing Jensen said about the Founders cards was that they were best for overclocking.
No, and...
"Founders Edition" sounds to me like it will have specially binned chips.
...no.
The Founders Edition cards are just the reference polygonal HSF blower/shroud design. No special binning, stock factory clocks.
AIB vendor designs start at $599 MSRP (the usual MSI Gaming, EVGA ACX, Asus DC3, Gigabyte Windforce, etc.), and the reference/Founders Edition will go up against the OC'ed AIB custom cards (Asus Strix/Matrix, Gigabyte G1, EVGA SSC, etc.). Good luck with that.
I don't think many enthusiasts bought a Titan X. It was more of a fanboy card. The 980 Ti released a few weeks later in the $600 price range, something every enthusiast knew was coming. You'd have to blatantly ignore common sense to have purchased a Titan X.
Or you have a completely different workload in mind for the cards. You'd have to blatantly ignore the actual GPGPU market not to have realized this...
[image: 4u_hdca_web.png]


While a fair number of people bought the Titan X for benchmarking and the bragging rights of having the world's fastest GPU for a while, I think you underestimate that, like any hobby, assigning a cash value for the enjoyment is a flawed exercise... and not everyone sees an extra $350 expenditure as a deal breaker - especially when the card holds up well in spite of competition from the similarly priced EVGA 980 Ti Classified Kingpin and Galax HOF LN2 Ed.
Anyhow, here's a guy building his GPGPU rack. He bought eight Titan X's - you probably need to contact him and lecture him about common sense or something.
 
No, and...

...no.
The Founders Edition cards are just the reference polygonal HSF blower/shroud design. No special binning, stock factory clocks.
AIB vendor designs start at $599 MSRP (the usual MSI Gaming, EVGA ACX, Asus DC3, Gigabyte Windforce, etc.), and the reference/Founders Edition will go up against the OC'ed AIB custom cards (Asus Strix/Matrix, Gigabyte G1, EVGA SSC, etc.). Good luck with that.

Or you have a completely different workload in mind for the cards. You'd have to blatantly ignore the actual GPGPU market not to have realized this...
[image: 4u_hdca_web.png]


While a fair number of people bought the Titan X for benchmarking and the bragging rights of having the world's fastest GPU for a while, I think you underestimate that, like any hobby, assigning a cash value for the enjoyment is a flawed exercise... and not everyone sees an extra $350 expenditure as a deal breaker - especially when the card holds up well in spite of competition from the similarly priced EVGA 980 Ti Classified Kingpin and Galax HOF LN2 Ed.
Anyhow, here's a guy building his GPGPU rack. He bought eight Titan X's - you probably need to contact him and lecture him about common sense or something.

The link you provided shows the Titan X merely holding its own against GPUs $400 cheaper. Its compute performance isn't that much better than a 980 Ti's, and it doesn't have the FP64 (double-precision) performance that previous Titans had.

"assigning a cash value for the enjoyment is a flawed exercise"

You can dance around the point all day. Heck, by your logic the diamond-studded iPhone case is worth it to someone. Good for them, but purely looking at the facts, the Titan X doesn't provide a good performance/cost ratio except for a very, very small segment of the already small ultra-enthusiast class. Your point above is essentially just taking one isolated example and trying to pass it off as fact with a matter-of-fact tone. Yeah, because so many PC enthusiasts use that kind of setup. No, as stated above, it's a micro-fraction of the market.
 
Holy Cow, $350 for faster than a titan x?!!

Must feel pretty crappy for those guys that bought titan X cards just a few months ago to hear they can get more for 1/3 the price now.

Not really. No one buys a Titan on a whim. They are enthusiasts, and they do their research more than anyone.
TL;DR they knew Pascal was coming and are always looking for an excuse to upgrade.

If they really did their research more than anyone, they wouldn't buy a Titan X in the first place, especially when an overclocked GTX 980 Ti costs 50% less and is around 30% faster.
 
Like you said, you're even CPU bound at 1440p (which is far less likely), but at 1080p, which most people use, you're even more CPU limited. The biggest games where CPU limitation is evident are GTA V (as you said), Crysis 3 (for framerate dips), Witcher 3, and Arkham Knight (due to terrible optimization).

Check this out: even comparing older i7s at the same clock speed, you see that they're all CPU limited. If they weren't CPU limited, you wouldn't see a performance increase.

CPU limited doesn't mean 100% usage though.
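Rough toy example of what I mean (the busy loop is just made-up stand-in work, not anything from a real engine; assumes Python with psutil installed):

```python
# Made-up stand-in for a game whose main thread does all the per-frame work.
# Everything runs on this one thread, so it can be the limit while overall
# CPU usage on a multi-core chip stays far below 100%.
import time
import psutil  # assumes `pip install psutil`

def fake_frame():
    # busy-work standing in for game logic + draw-call submission
    x = 0
    for i in range(2_000_000):
        x += i * i
    return x

psutil.cpu_percent(interval=None)  # prime the usage counter
frames = 0
start = time.time()
while time.time() - start < 5:     # "run the game" for 5 seconds
    fake_frame()
    frames += 1
elapsed = time.time() - start

print(f"~{frames / elapsed:.1f} fps  (capped by the one busy thread)")
print(f"overall CPU usage: {psutil.cpu_percent(interval=None):.0f}%  (nowhere near 100%)")
```

One core pegged, the rest idle: the frame rate is capped by that single thread even though the overall CPU meter on an 8-thread chip reads somewhere around 13%.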
 
No, and...

...no.
The Founders Edition cards are just the reference polygonal HSF blower/shroud design. No special binning, stock factory clocks.
AIB vendor designs start at $599 MSRP (the usual MSI Gaming, EVGA ACX, Asus DC3, Gigabyte Windforce, etc.), and the reference/Founders Edition will go up against the OC'ed AIB custom cards (Asus Strix/Matrix, Gigabyte G1, EVGA SSC, etc.). Good luck with that.

Or you have a completely different workload in mind for the cards. You'd have to blatantly ignore the actual GPGPU market not to have realized this...
[image: 4u_hdca_web.png]


While a fair number of people bought the Titan X for benchmarking and the bragging rights of having the world's fastest GPU for a while, I think you underestimate that, like any hobby, assigning a cash value for the enjoyment is a flawed exercise... and not everyone sees an extra $350 expenditure as a deal breaker - especially when the card holds up well in spite of competition from the similarly priced EVGA 980 Ti Classified Kingpin and Galax HOF LN2 Ed.
Anyhow, here's a guy building his GPGPU rack. He bought eight Titan X's - you probably need to contact him and lecture him about common sense or something.

Gotcha.
 
CPU limited doesn't mean 100% usage though.

Only if the code doesn't utilize all the threads your CPU has, but modern games use 6-8 threads. This is why either TechSpot or DF has actually started recommending locked i7 CPUs over any i5. We'll know in a few weeks how current CPUs stack up. Also, Vulkan and DX12 have been designed to reduce CPU overhead. This may change things, but until it's mainstream we won't really know how true it is.
 
Only if the code doesn't utilize all the threads your CPU has, but modern games use 6-8 threads. This is why either TechSpot or DF has actually started recommending locked i7 CPUs over any i5. We'll know in a few weeks how current CPUs stack up. Also, Vulkan and DX12 have been designed to reduce CPU overhead. This may change things, but until it's mainstream we won't really know how true it is.

DX12 makes the CPU much less important in general. If it does happen to catch on, it won't matter whether you are using AMD or Intel; pretty much anything with 4 cores or more will be good.
 