Rumor: Nvidia to cut GPU prices in November to compete with AMD

September 30, 2013, 7:00 PM

According to DigiTimes, which cites "sources from graphics card players", Nvidia may be gearing up to cut the prices of its graphics cards in late November to compete with the freshly announced AMD R7 and R9 series GPUs. Nvidia may also release one or two new cards in the US$149-249 segment toward the end of this year, again to boost its competitiveness in the market.

A number of cards in AMD's new line-up are re-badges of existing HD 7000 series GPUs, presumably with little (if any) performance increase. This should allow Nvidia to strategically price its existing cards to compete on price-to-performance, especially after benchmarks of AMD's new cards have been released.

Nvidia's current GeForce 700 series features four cards: the GTX 760, GTX 770, GTX 780 and GTX Titan, occupying price points north of $249, with older GeForce 600 cards still on the market for lower price segments. It's possible that Nvidia will release new GK104- or GK106-based graphics cards for the $149-249 price segment, to compete with AMD's Radeon R9 270X and R7 260X.

Also on the table, as previously mentioned, is a new dual-GPU card based on GK110 cores, potentially labeled the GeForce GTX 790. The GK110 silicon has also yet to be fully 'unlocked', with the GTX Titan using 2,688 of 2,880 CUDA cores on the die, meaning there's still room for a new high-end single-GPU card.

Either way, this upcoming holiday season will be interesting for system builders and graphics card buyers. Although AMD's new range will be coming to market in the next month or so, it may be worth waiting until November to see how Nvidia responds.




User Comments: 29

H3llion H3llion, TechSpot Paladin, said:

Hmm either I better sell the 780, save and get the R90X, "790" or just pop another 780 inside the rig when the prices come down, hmm :/

lchu12 lchu12 said:

I might just pick up another 770 when the price goes down...

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

Hmm either I better sell the 780, save and get the R90X, "790" or just pop another 780 inside the rig when the prices come down, hmm :/

Or, be happy with what you have. lol

Shhh, don't tell anyone all I have is a single GTX 660. :/

1 person liked this | dividebyzero dividebyzero, trainee n00b, said:

Also on the table, as previously mentioned, is a new dual-GPU card based on GK110 cores, potentially labeled the GeForce GTX 790. The GK110 silicon has also yet to be fully 'unlocked', with the GTX Titan using 2,688 of 2,880 CUDA cores on the die, meaning there's still room for a new high-end single-GPU card.

Technically, GK110 has been unlocked, just not yet for GeForce branding.

The dual-GPU card could be an ideal (if any card consuming 300+ watts can be considered ideal) fit for a reduced-core part such as the "780i" rumours that seem to be doing the rounds.

Guest said:

I still have my GTX 690 from over a year ago, I don't plan to upgrade for quite some time. I mean unless some sick high resolution monitors come out for me to upgrade from my current 27 inch 2560x1440...

EEatGDL said:

I didn't know the Titan was considered 700 series since it was launched in the 600 series era, even when the top 700 series share the same chip with the Titan. I saw it like the precursor of the 700 series, a "taste" of what was next.

ensabrenoir ensabrenoir said:

Nice..... gonna get me another 670 and a 7 series for a new build....

2 people like this | dividebyzero dividebyzero, trainee n00b, said:

I didn't know the Titan was considered 700 series since it was launched in the 600 series era, even when the top 700 series share the same chip with the Titan. I saw it like the precursor of the 700 series, a "taste" of what was next.

Originally, when launched, the Titan was in a class by itself (figuratively as well as in performance and price), but with the launch of the GTX 780, which utilizes the same GPU (albeit designated GK110-400 rather than -300), it is now lumped in with the 700 series.

Xenite Xenite said:

Looking forward to picking up the 290X Titan killer. In our world of console ports, and with Mantle being, in effect, the Xbox One's low-level API, AMD is the way to go for the foreseeable future.

spydercanopus spydercanopus said:

They need to make a branded CPU. It would send shockwaves.

Obzoleet Obzoleet said:

We all knew this was going to happen... I think it's horrible... The rational side of me tells me to wait until November now, but I wanna pre-order the 290X.

1 person liked this | jsonUK jsonUK said:

Well that's good news. I've been wanting to upgrade my GTX 560 for a while, was thinking of taking the leap to a GTX 770 - and if they're slashing prices, might as well wait until then.

Not much of an AMD fan, even though they are bringing out their new range. I much prefer Nvidia drivers.

GhostRyder GhostRyder said:

They need to make a branded CPU. It would send shockwaves.

They make one already; it's the Nvidia Tegra processor, though it's mostly a tablet/mobile processor.

I'm glad they are finally thinking of cutting prices to up the competition. Recently, Nvidia has been charging extreme prices for all their GPU lines, and if the 780 and Titan drop a bit, it will definitely stiffen up the competition. Depending on who comes out with dual-GPU cards this gen (if AMD decides to wait too long again), or depending on the performance of the two, it's going to be tough for me to choose so long as the prices are competitive! I may actually jump back to Nvidia depending on the prices (though I'm still drooling over the 290X).

1 person liked this | Burty117 Burty117, TechSpot Chancellor, said:

Well, they need to drop the prices of the 780. I'm playing the Battlefield 4 beta right now and, granted, it is a beta, but the framerate is all over the shop! It goes down to below 20fps at times, then all the way back up to 70+; it's very annoying to say the least. I guess if I SLI'ed, the lowest it would go would be around 35fps.

GhostRyder GhostRyder said:

Well, they need to drop the prices of the 780. I'm playing the Battlefield 4 beta right now and, granted, it is a beta, but the framerate is all over the shop! It goes down to below 20fps at times, then all the way back up to 70+; it's very annoying to say the least. I guess if I SLI'ed, the lowest it would go would be around 35fps.

But aren't you gaming on a 2560x1440 monitor? BF4, even though it's a beta, is going to be pushing things to the limits (though if you're having issues I'm getting a little nervous; I just DLed it last night so I have yet to try it because of work today). I hope my twins can keep me above 60 for the time being at least :P.

However, I would not worry yet, Burty; I'm sure the price will come down to the 500-600 range first and then go from there!

ddg4005 ddg4005 said:

Since I recently purchased two 4GB GTX 770 cards (one for each box) I'm done with graphics cards for a while. The additional memory does help with games on a 27-inch screen at 1920x1080 (I didn't believe it before).

amstech amstech, TechSpot Enthusiast, said:

Since I recently purchased two 4GB GTX 770 cards (one for each box) I'm done with graphics cards for a while. The additional memory does help with games on a 27-inch screen at 1920x1080 (I didn't believe it before).

Nice setup, but overkill for 1080p 95% of the time.

At 1080p you will very, very rarely use more than 2GB of VRAM.

You're good to go for 1440p/1600p gaming though.

Skidmarksdeluxe Skidmarksdeluxe said:

If they drop the prices enough I may consider upgrading from my beloved Riva TNT 2. It's finally starting to show its weaknesses with my new 30" monitor. XD

Guest said:

I've still only got a 460 lol. I'm gonna wait till next year to upgrade

Guest said:

The museum phoned; they want their graphics card back.

Burty117 Burty117, TechSpot Chancellor, said:

But aren't you gaming on a 2560x1440 monitor? BF4, even though it's a beta, is going to be pushing things to the limits (though if you're having issues I'm getting a little nervous; I just DLed it last night so I have yet to try it because of work today). I hope my twins can keep me above 60 for the time being at least :p.

However, I would not worry yet, Burty; I'm sure the price will come down to the 500-600 range first and then go from there!

Yeah, true. I've found a way around it now though: I just overclock the graphics card to 1GHz on the core and set the power option to "prefer performance mode", and now it doesn't drop anywhere near as much. Hell, I'm lucky if it drops below 45fps! Much better. I think the servers have a lot to do with it though: if I go on a British server with, say, 15-20 people on Conquest I get 60fps+ 99% of the time; if I go on a 60-64 player map from Germany the frame rate drops considerably, down to around 45fps. Do the same on a lower-ping (British) server and it pretty much sits around 60fps most of the time.

Here in Britain you can get hold of a GTX 780 for £495; if they can lower it to just below £400 that would be wonderful, and I would then SLI the beast.

Have you had a chance to try the Battlefield 4 beta on the twins? xD

GhostRyder GhostRyder said:

Yeah, true. I've found a way around it now though: I just overclock the graphics card to 1GHz on the core and set the power option to "prefer performance mode", and now it doesn't drop anywhere near as much. Hell, I'm lucky if it drops below 45fps! Much better. I think the servers have a lot to do with it though: if I go on a British server with, say, 15-20 people on Conquest I get 60fps+ 99% of the time; if I go on a 60-64 player map from Germany the frame rate drops considerably, down to around 45fps. Do the same on a lower-ping (British) server and it pretty much sits around 60fps most of the time.

Here in Britain you can get hold of a GTX 780 for £495; if they can lower it to just below £400 that would be wonderful, and I would then SLI the beast.

Have you had a chance to try the Battlefield 4 beta on the twins? xD

My pair of 6990s seems to run pretty well. I just got a chance today once I fiddled with the settings and got everything at ultra and whatnot. MSI Afterburner would not display for me in-game but Fraps did, and with VSync on it registered a range of 50-60FPS. With it off I skipped around from 120-45FPS (gaming at 1080p). With 3-way Eyefinity enabled, the FPS on ultra stayed around 40-80 with VSync off, though I did not play that way for long because I had to work. I'll try enabling my overclocks to see if I can up the performance much; I normally just run the twins at the stock BIOS 2 880MHz option.

Though since this is a beta, I'm curious if they are doing the same thing with BF4 that they did in BF3, where the graphics option may have said Ultra but it was really High. At release I'll make my decision on when and if I'm going to go ahead and upgrade my rig, but if things continue the way they are, I'll be sticking with these for a bit longer.

JC713 JC713 said:

I wonder if that 770Ti is actually true. We will see.

Boilerhog146 Boilerhog146 said:

I see a third and maybe a fourth 4GB 670 coming cheap. I like the competition; good job AMD. Though I won't buy a 290X, it's a blessing anyway...

ddg4005 ddg4005 said:

Nice setup, but overkill for 1080p 95% of the time.

At 1080p you will very, very rarely use more than 2GB of VRAM.

You're good to go for 1440p/1600p gaming though.

I know it seems like overkill but I actually saw stuttering in Mass Effect and Mass Effect 3 with all the detail settings at maximum with my 2GB cards. It was something I'd never noticed on my 23-inch Samsung monitors so I deduced it to be the GTX 670s I was running in my boxes. I can also state that gaming just seems...smoother with the additional RAM; again it could just be my imagination but I don't think so.

GhostRyder GhostRyder said:

I know it seems like overkill but I actually saw stuttering in Mass Effect and Mass Effect 3 with all the detail settings at maximum with my 2GB cards. It was something I'd never noticed on my 23-inch Samsung monitors so I deduced it to be the GTX 670s I was running in my boxes. I can also state that gaming just seems...smoother with the additional RAM; again it could just be my imagination but I don't think so.

2GB of VRAM is becoming more and more of a problem with new games in this day and age. Some games start to really get hungry even at 1080p, but it's still enough for many people's needs.

Gaming beyond 1080p, it's definitely a necessity to have more than 2GB of VRAM, but that's still not a wide range of people. I'm unsure how much VRAM the Mass Effect series uses, but Mass Effect 3 was a huge game so it would not surprise me.

1 person liked this | LNCPapa LNCPapa said:

Mass Effect 1 does not use much VRAM at all, so I seriously doubt any stuttering was due to running out of memory. I game at 2560 and VERY few games use 2GB of VRAM even at the typical max settings. I keep a pretty close eye on VRAM usage via the G15 support on MSI Afterburner (it's always up and running), so I'm fairly certain about this.

I don't own ME3 so I can't say for that game, but from what I've heard it's mostly a console port so it would surprise me if it used more than 2GB also.

amstech amstech, TechSpot Enthusiast, said:

Mass Effect 1 does not use much VRAM at all so I seriously doubt any stuttering was due to running out of memory. I game at 2560 and VERY few games use 2GB of VRAM even at the typical max settings.

Agreed.

I game at 1600p and I don't remember the last time I hit my VRAM limit; maybe Metro 2033? I tweaked one setting that made no visual difference and it was fine. Newer games use more, but for 1440p/1600p and below 2GB is fine 95% of the time; VRAM is so overblown. It also depends on how picky you are, mods etc. I am an old f*ck now, so if I have to turn a setting down here or there it's fine with me; the old me, not so much.

That being said, it's almost 2014, so I wouldn't advise someone purchasing now to game at 1440p/1600p with a 2GB GPU; at least get a 3GB/4GB GPU for some specific reasons plus future-proofing.

ddg4005 ddg4005 said:

Actually, Mass Effect supports ambient occlusion, which is a bit of a resource hog, so the extra VRAM comes in handy. But I agree that games nowadays are starting to push past needing 2GB of VRAM. I think before long we'll start seeing cards that sport 4GB of VRAM like it's nothing.
