Nvidia reportedly working on GeForce GTX 750 Ti

September 11, 2013, 10:00 AM
Tags: nvidia, geforce, gpu, graphics card, gtx 750 ti, gtx 700 series

So far, Nvidia's GeForce 700 series comprises the high-end GTX 780 and GTX 770, as well as the mid-range GTX 760. But in preparation for this holiday season's blockbuster game releases, such as Battlefield 4 and Call of Duty: Ghosts, Nvidia is reportedly preparing a poster-boy GPU: the GeForce GTX 750 Ti.

The mid-range graphics card is allegedly being designed to succeed the GTX 650 Ti while also outperforming the GTX 660. It's said to be based on the same GK104 GPU found in the GTX 770 and GTX 760, though with some features stripped out to keep the price low.

MyDrivers reports that the GTX 750 Ti will come with 960 CUDA cores - fewer than the GTX 760's 1152 and the same as the GTX 660 - but with 32 ROPs (up from 24 on the GTX 660) and a 256-bit memory bus (up from 192-bit on the GTX 660). Clock speeds are also said to be higher, with a base clock of 1033 MHz, a boost clock of 1098 MHz, and an effective memory clock of 6000 MHz on GDDR5.
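As a back-of-the-envelope check, the rumored specs imply a sizable bandwidth jump over the GTX 660. The calculation below is simple arithmetic on the figures above; the GTX 660 comparison figure (6008 MHz effective) is taken from Nvidia's public spec sheet, not from this report:

```python
def memory_bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Theoretical peak bandwidth in GB/s: bus width in bytes times transfers/second."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# Rumored GTX 750 Ti: 256-bit bus, 6000 MHz effective GDDR5
print(memory_bandwidth_gbps(256, 6000))   # 192.0 GB/s
# GTX 660 for comparison (assumed reference spec): 192-bit bus, 6008 MHz effective
print(memory_bandwidth_gbps(192, 6008))   # ~144.2 GB/s
```

In other words, if the leak is accurate, the card would carry roughly a third more memory bandwidth than the GTX 660 it is said to outperform.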

It isn't known when the GeForce GTX 750 Ti will be officially released, but it's unlikely Nvidia will schedule a launch at the same time as AMD's late-September 'Volcanic Islands' GPU unveiling. If the rumors prove correct, that means we're looking at a release either in the next few weeks or in early October.




User Comments: 18

JC713 said:

The Radeon 7790 would be a 650 Ti Boost killer if it weren't for the 128-bit memory interface.

ikesmasher said:

"while out performing the GTX 660"

F UUUUUUUUU NVIDIA

I hate instantly obsolete parts.

1 person liked this | Skidmarksdeluxe said:

"while out performing the GTX 660"

F UUUUUUUUU NVIDIA

I hate instantly obsolete parts.

So we can safely say you hate all PC parts?...

dividebyzero, trainee n00b, said:

Can anyone explain why the supposed GPU-Z screen shows two different drivers (326.80 and 326.41)?

BTW: The 256-bit bus width says the GPU is a GK104, while the device ID (11C7) says it is a GK106.

/cues circus music

2 people like this | ikesmasher said:

So we can safely say you hate all PC parts?...

Exactly. It's a love/hate relationship.

Guest said:

It'd be interesting to see at what price band Nvidia is going to place the new GTX 750 Ti.

The GTX 650 Ti was a sweet deal in terms of price/performance, and the GTX 750 Ti should land around the $180 range if it wants to attract budget gamers, despite having more raw power. Just my 2 cents.

Laimis said:

Honestly, I think the current selection of Nvidia GPUs is enough. I recently scavenged myself a new PC and popped in a 760, which I chose as an 'enough' option for my 24'' 1920x1200 screen. And so far it has been more than enough to play what I like.

From what I've learned reading all the GPU reviews, if I popped in another 760, the two of them would outperform the next-generation *70 or even *80 GPU solution.

GhostRyder said:

Can anyone explain why the supposed GPU-Z screen shows two different drivers (326.80 and 326.41)?

BTW: The 256-bit bus width says the GPU is a GK104, while the device ID (11C7) says it is a GK106.

/cues circus music

Attribute it to poor support on the CPU-Z end; it's probably unable to register what the true part is. It's even naming it as two different things (760 SE or 750 Ti), so it's a mystery. For all we know, it's on the same list showing cards that don't exist for the lawlz. I don't recall seeing a 750 Ti in the list coming out this year, so like I said before, it's a mystery.

Honestly, I think the current selection of Nvidia GPUs is enough. I recently scavenged myself a new PC and popped in a 760, which I chose as an 'enough' option for my 24'' 1920x1200 screen. And so far it has been more than enough to play what I like.

From what I've learned reading all the GPU reviews, if I popped in another 760, the two of them would outperform the next-generation *70 or even *80 GPU solution.

Well, true, you can always do that, and it's a good way to get a real powerful machine while saving a buck or two. I know loads of people who do it, and I've done it in the past.

In my opinion, I wish both companies would stop with these odd-numbered cards and versions of cards. I was hoping this gen to say goodbye to Ti variants and get back to just 740, 750, 760, 770, 780, or from AMD 9750, 9770, 9850, 9870, 9950, 9970. I always found it annoying when they would make a 750 or 760 Ti, or a 7790, or things like a GHz Edition. Not that any of those cards are bad cards, they are great cards; I just like it clean-cut and straight to the point, without all the in-between cards existing.

dividebyzero, trainee n00b, said:

Attribute it to poor support on the CPU-Z end, its probably unable to register what the true part is.

Firstly, it's GPU-Z

Secondly, the latest iteration of GPU-Z released today has support for AMD's Volcanic Islands (which aren't due for release for at least six weeks) but makes no mention of what would essentially be no more than a clock-bumped 660?

Thirdly, GPU-Z would never list two different drivers for a single device.

The only area where GPU-Z could misreport would be the bus width - unlikely, since overclocking the physical bits is pretty damn unorthodox. The PCI device ID for GK104 (256-bit) is 11xx (1180 for GTX 680, 1189 for GTX 670, 1187 for GTX 760, 1184 for GTX 770, etc.), while the ID for GK106 (192-bit) is 11Cx (11C0 for GTX 660, 11C2 for GTX 650 Ti Boost, 11C6 for GTX 650 Ti, etc.)

In short, while there could be a 750 Ti in the making, it certainly isn't going to conform to that GPU-Z capture - which is almost certainly shopped, especially since there is no GPU-Z validation for it - and for what it's worth, [link]
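dividebyzero's device-ID argument above can be sketched as a tiny lookup. The ID-to-chip mapping uses only the examples listed in the comment; the helper function itself is purely illustrative:

```python
# PCI device IDs quoted in the comment above; the ID prefix encodes the GPU die.
GK104_IDS = {"1180": "GTX 680", "1189": "GTX 670", "1187": "GTX 760", "1184": "GTX 770"}  # 256-bit bus
GK106_IDS = {"11C0": "GTX 660", "11C2": "GTX 650 Ti Boost", "11C6": "GTX 650 Ti"}         # 192-bit bus

def guess_chip(device_id):
    """Rough die guess from the ID prefix: 11Cx -> GK106, other 11xx -> GK104."""
    if device_id.upper().startswith("11C"):
        return "GK106 (192-bit)"
    if device_id.startswith("11"):
        return "GK104 (256-bit)"
    return "unknown"

# The leaked screenshot's ID (11C7) lands in the GK106 range,
# contradicting its claimed 256-bit, GK104-class bus.
print(guess_chip("11C7"))  # GK106 (192-bit)
```

That internal contradiction is the core of the argument that the screenshot is fake.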

GhostRyder said:

Firstly, it's GPU-Z

Secondly, the latest iteration of GPU-Z released today has support for AMD's Volcanic Islands (which aren't due for release for at least six weeks) but makes no mention of what would essentially be no more than a clock-bumped 660?

Thirdly, GPU-Z would never list two different drivers for a single device.

The only area where GPU-Z could misreport would be the bus width - unlikely, since overclocking the physical bits is pretty damn unorthodox. The PCI device ID for GK104 (256-bit) is 11xx (1180 for GTX 680, 1189 for GTX 670, 1187 for GTX 760, 1184 for GTX 770, etc.), while the ID for GK106 (192-bit) is 11Cx (11C0 for GTX 660, 11C2 for GTX 650 Ti Boost, 11C6 for GTX 650 Ti, etc.)

In short, while there could be a 750 Ti in the making, it certainly isn't going to conform to that GPU-Z capture - which is almost certainly shopped, especially since there is no GPU-Z validation for it - and for what it's worth, [link]

I never said it was a legit post; I said it can be attributed to that. This happens pretty often, where things are misinterpreted by GPU-Z or CPU-Z; it's pretty common. However, it can just as easily be a fake, as that happens just as much. Since it was not in the official charts, it was more than likely a faked image.

Also, the Volcanic Islands "leaks" and talks have been going on a lot longer than the supposed 750 Ti, which would better explain having GPU-Z support for them over something that just appeared in the last two days.

dividebyzero, trainee n00b, said:

Also, the Volcanic Islands "leaks" and talks have been going on a lot longer than the supposed 750 Ti, which would better explain having GPU-Z support for them over something that just appeared in the last two days.

That's not how it works. GPU-Z support is linked to the release date of the cards: the closer the launch, the more likely the vendor is supplying W1zzard with the relevant information. Cards don't launch without GPU-Z support these days.

As for supposed validity, that is the whole purpose behind the actual validation process...otherwise every Tom, D1ck, or Harry would just edit the INF file and claim whatever BS they like*.

*Note that you can't falsify the PCI device ID or claim two separate drivers for the same device.

GhostRyder said:

That's not how it works. GPU-Z support is linked to the release date of the cards: the closer the launch, the more likely the vendor is supplying W1zzard with the relevant information. Cards don't launch without GPU-Z support these days.

As for supposed validity, that is the whole purpose behind the actual validation process...otherwise every Tom, D1ck, or Harry would just edit the INF file and claim whatever BS they like*.

Sigh...

The 750 Ti is not out, nor is there proof yet that it exists, as this debate is showing - hence little to no GPU-Z support.

I never said this was real. I said:

I never said it was a legit post; I said it can be attributed to that. This happens pretty often, where things are misinterpreted by GPU-Z or CPU-Z; it's pretty common. However, it can just as easily be a fake, as that happens just as much. Since it was not in the official charts, it was more than likely a faked image.

Also, the Volcanic Islands "leaks" and talks have been going on a lot longer than the supposed 750 Ti, which would better explain having GPU-Z support for them over something that just appeared in the last two days.

I said it could be faked because, as you pointed out, you can't fake things like the Nvidia driver versions that show up in GPU-Z; however, there would not be support for a card whose first leaked mention was today (if it is legit).

Firstly, it's GPU-Z

Secondly, the latest iteration of GPU-Z released today has support for AMD's Volcanic Islands (which aren't due for release for at least six weeks) but makes no mention of what would essentially be no more than a clock-bumped 660?

Thirdly, GPU-Z would never list two different drivers for a single device.

I rest my case. I'm saying that you're right, it looks faked, but there is always the chance that some of the attributes from that supposed screenshot are legit and GPU-Z simply missed something - though that's highly unlikely since, as you said, it has two different driver versions listed.

Though such an obvious mistake by the person who "faked" it seems like a hard miss, to be frank.

havok585 said:

The Radeon 7790 would be a 650 Ti Boost killer if it weren't for the 128-bit memory interface.

Bullsh1t! You need an OC'ed 7850 to beat the Ti Boost.

JC713 said:

Bullsh1t! You need an OC'ed 7850 to beat the Ti Boost.

That's what I said... if the 7790 had a 192-bit interface like the 650 Ti Boost, it would demolish it, because it has more cores and a higher clock.

1 person liked this | dividebyzero, trainee n00b, said:

That's what I said... if the 7790 had a 192-bit interface like the 650 Ti Boost, it would demolish it, because it has more cores and a higher clock.

Extremely unlikely. AMD chose a 128-bit bus because it balances the back end (16 ROPs, or more specifically, four partitions at 4 colour ops per clock). Adding another memory controller without being able to process the render just moves the bottleneck from the memory bus to raster ops.

This is precisely why the GTX 650 Ti Boost featured both an increase in bus width (from 128-bit to 192-bit) and a 50% increase in ROP count (from 16 to 24) over the original GTX 650 Ti.
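The balance point above is easy to check in rough numbers: the 650 Ti Boost's two upgrades scale in lockstep. This quick sketch uses only the figures from the comment:

```python
# GTX 650 Ti vs. GTX 650 Ti Boost, figures from the comment above:
# Nvidia widened the memory bus and grew the ROP count by the same factor,
# so neither side of the pipeline becomes the new bottleneck.
ti = {"bus_bits": 128, "rops": 16}
boost = {"bus_bits": 192, "rops": 24}

print(boost["bus_bits"] / ti["bus_bits"])  # 1.5
print(boost["rops"] / ti["rops"])          # 1.5
```

A wider bus alone (as proposed for the 7790) would leave the unchanged ROP count as the limiting factor.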

JC713 said:

Extremely unlikely. AMD chose a 128-bit bus because it balances the back end (16 ROPs, or more specifically, four partitions at 4 colour ops per clock). Adding another memory controller without being able to process the render just moves the bottleneck from the memory bus to raster ops.

This is precisely why the GTX 650 Ti Boost featured both an increase in bus width (from 128-bit to 192-bit) and a 50% increase in ROP count (from 16 to 24) over the original GTX 650 Ti.

Ah, thanks for clearing that up.
