Preliminary specs for Nvidia's rumored GTX 1180 may have been leaked

Although rumors about the release date of Nvidia's upcoming Turing-based GPUs have so far been wrong, information provided to wccftech may finally give us preliminary specifications for the rumored GTX 1180.

The card will reportedly launch with a core clock of 1.6GHz, boostable to about 1.8GHz. Its core configuration is expected to be 3584:224:64 (CUDA cores : texture units : ROPs), comparable to the GTX 1080 Ti's 3584:224:88.
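As a rough point of comparison, those figures would put the card's theoretical FP32 throughput slightly ahead of the 1080 Ti's. A back-of-the-envelope sketch, treating the leaked numbers as-is and using the 1080 Ti's published 1582MHz reference boost:

```python
# Peak FP32 throughput estimate: each CUDA core can retire one fused
# multiply-add (2 FLOPs) per clock, so peak TFLOPS = cores * clock(GHz) * 2 / 1000.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * boost_ghz * 2 / 1000

print(fp32_tflops(3584, 1.8))    # rumored GTX 1180 at its leaked ~1.8GHz boost: ~12.9
print(fp32_tflops(3584, 1.582))  # GTX 1080 Ti at its 1582MHz reference boost:   ~11.3
```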

As far as other improvements over last-generation Nvidia cards go, the 1180 will include "8-16GB" of GDDR6 memory. The memory is rated at 16Gbps per pin, a sizable leap over the GTX 1080 Ti's 11Gbps GDDR5X.
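Note that those Gbps figures are per-pin data rates; total bandwidth also depends on bus width, which the leak doesn't mention. A quick sketch of the conversion, assuming a hypothetical 256-bit bus for the 1180 against the 1080 Ti's published 352-bit bus:

```python
# Theoretical memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(16, 256))  # 16Gbps GDDR6 on an assumed 256-bit bus: 512 GB/s
print(bandwidth_gbs(11, 352))  # GTX 1080 Ti's 11Gbps GDDR5X on 352-bit: 484 GB/s
```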

The GTX 1180 will reportedly include a 400mm² die, which is considerably smaller than the 1080 Ti's 471mm² die but much larger than the GTX 1080's 314mm² die. The 1180's CUDA core count, on the other hand, will be identical to the GTX 1080 Ti's at 3584 but a considerable improvement over the 1080's 2560 CUDA cores.

Power consumption could either be slightly lower or slightly higher than the GTX 1080's, according to wccftech. The 1180's rumored spec sheet lists "170-200W" as the card's TDP. No matter where in that range the card's TDP lands, it'll consume much less power than the GTX 1080 Ti, which has a TDP of 250W.

If you're wondering how much the 1180 might cost, sources say customers should expect to pay around $699 for the new GPU. While this is unconfirmed, the pricing would make sense - Nvidia's GTX 1080 Founders Edition launched with the same MSRP.

As interesting as these specs may be, a healthy degree of skepticism may be warranted. After all, none of this information has been officially confirmed. Regardless, we'll likely learn more about Nvidia's upcoming GPUs in the coming months.


 
Launch MSRP is lowballed by about $400-$500.

That said, the 1180Ti is going to be a beast.
 
We're still waiting for a GPU that can drive 4K monitors for high-end gaming (e.g. considerably better frame rates than 60 FPS). It doesn't seem as if the GTX 1180 will get us there, if this leak is accurate.

Which is fine. Current 4K monitors are too small; it's difficult to read text on them unless you are uncomfortably close to the monitor.

Eventually, 4K monitors will be large enough, and GPUs will appear that will drive those monitors at high frame rates. But for now, 1440p monitors and GPUs to drive them at high frame rates (GTX 1080 and GTX 1080Ti) represent the sweet spot for gamers in today's market. I don't see that changing this year.

When the GTX 1180 arrives, we should see some price drops on the GTX 1080 and GTX 1080Ti GPUs. Smart gamers looking to upgrade from 1080p monitors to 1440p should consider taking advantage.
 
Personally, I still haven't found a reason to upgrade from 1080p, much less upgrade to 4K. I can only speak for myself, but what I'm hoping for from high-end video cards in future generations is consistent 144FPS gameplay at 1080p (on a 24" monitor, nothing is stretched). Right now, even with a 1080 Ti, framerate fluctuations are far too common from game to game.
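To put that in perspective, here's the frame-time budget at a few of the rates being discussed; it's just simple arithmetic, nothing card-specific:

```python
# Frame-time budget per frame at each target rate: at 144Hz a frame has to be
# ready in under ~7ms, so even a single 20ms frame shows up as a visible hitch
# no matter what the average FPS counter says.
for fps in (30, 60, 100, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")
# 30 FPS -> 33.3 ms, 60 -> 16.7 ms, 100 -> 10.0 ms, 144 -> 6.9 ms
```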

Again, just personal preference - I prefer higher FPS to sharper resolutions (Plus, I can usually crank the settings up higher and get excellent graphics and FPS), but I understand the appeal of both.
 
I get what you're saying. Frame rate fluctuations are a bit of a drag, and probably a bit more so at 1440p.

But speaking generally, my GTX 1080 does pretty well driving my 1440p monitor. I'm playing RPGs, though, and doing nothing competitive. I do see some frame rate fluctuations, but they aren't getting in the way of my games much.

If frame rates were important to success or failure in a competition, I'd go with 1080p monitors, too.

What I like about my 27" 1440p monitor: I can't see individual pixels (with 1080p, I can), and that helps with immersion. I'm getting good frame rates at max graphical fidelity on most titles. But the resolution isn't so ridiculously high that I have difficulty reading text. Frame rate fluctuations don't bother me much at this resolution.
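For the curious, the pixel-density difference behind that is easy to work out; a minimal sketch using the sizes mentioned in this thread:

```python
# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return (width_px ** 2 + height_px ** 2) ** 0.5 / diagonal_in

print(round(ppi(1920, 1080, 24)))  # 24" 1080p -> ~92 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p -> ~109 PPI
print(round(ppi(3840, 2160, 28)))  # 28" 4K    -> ~157 PPI
```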

Moving to 4K doesn't make sense for *any* gamers at this juncture, because the GPUs can't drive frame rates well enough to please us, and because the monitors are still too small for that resolution. That's why I called 1440p the 'sweet spot' for gamers.

But needs vary. For competitive gaming, the higher frame rates you can get at 1080p will surely be more important than the higher resolution.
 
Funnily enough, I actually don't play any competitive games. I just enjoy how smooth gameplay is on a G-Sync or FreeSync 144Hz display. Like you, I play primarily RPGs and other single player titles.

I guess it depends on what the majority of gamers want - higher framerates at the cost of resolution (sort of, if you get a 24" monitor it isn't noticeable) or higher resolution at the cost of framerates.

1440p is definitely a sweet spot in that it can balance the two, but it's not going to please anyone who feels strongly one way or the other. It's not demanding enough to drag you down below 60 FPS on a reasonably high-end card (a 1080 is a good example) but it's just demanding enough to put you out of range of 120-144 FPS gameplay on more demanding/modern titles.

However, if people just want to push past the traditional console 30FPS "limit" (which is changing as of late), it definitely has its place.
 
No way that MSRP is even close to what they'll sell for in the first 6 months at least. I'm guessing they'll be hard to get for less than 1k and could command even more than that.
 
That's a fair-minded take on it, I think.

It's true, my GTX-1080, when pushing a 1440p monitor, is not attaining 120-144 FPS on any titles I play. But on many titles, it's consistently over 100, and it's rarely below 80 on any of them. That's fast enough to please me, given my play style. A GTX-1080Ti should do even better.

I played on a 24" monitor at 1080p for years. I couldn't go back to it now. I'm just too happy with the larger screen real estate and higher resolution, and with frame rates that are actually better than any I experienced at 1080p (owing to having upgraded my GPU at the same time as I bought the monitor). It's true that if I returned to the smaller screen and kept the GPU the same, I could eke out more FPS, but that would be solving a problem I don't perceive.

But I am not everyone. For me, that's where the sweet spot is. For someone who thrives on fast FPS, settling for slightly lower FPS in return for better resolution/more screen real estate won't make much sense.

I've never owned a console, whether a 30 FPS console or otherwise. I suppose I'm one of those "PC Master Race" gamers that console gamers loathe so much. :p
 
Who cares, when it costs twice as much as a 980 or 780 or 680 did back when they released? It's not about the 80 at the end of the name but about the price.

At least we're finally getting rid of GDDR5.

Looks like a Titan Xp but with fewer CUDA cores and some other inferior specs, but a better architecture. So performance-wise we're looking at a Titan Xp for maybe $200 less.

Uninteresting card.
 
This is all perfectly fair. And, honestly, when next-gen cards come out, I'll probably snag a 1440p, 144Hz display then and never go back, much like I never went back when I upgraded to a 144Hz monitor, or a 60Hz monitor way back when.

I'm sure 1440p is a better experience, it's just not yet worth the trade-off for me. Hopefully it will be sometime this year.
 
So it will be around 1080 Ti performance at around 1080 Ti price. That's after Nvidia's "Founders Edition" stuff. I think I'll wait until they release the 1180 Ti with 30% more performance and pick it up at the same price as this 1180.
 
That may be too optimistic if the current price hikes due to mining continue. Both the 1080 and 980 were comparable to their Ti predecessors, but cost much less when they came out. The 1180 having the same MSRP as the 1080 Ti at launch is not a good sign.
 
Just make sure your fancy 144Hz display is also capable of 60Hz, or else you'll become one of the many whiners on the Steam forums complaining about screen tearing and not reaching their "glorious 144 fps".
 
I must admit I don't understand the insinuation here. Why must your monitor be capable of being slower to gain a faster frame rate without tearing? I thought the whole point in having "G-Sync or FreeSync" was to have a varying frequency. Which he plainly stated he was enjoying.
 
I have a 15-inch 4K screen and it's far from being "too small". It's actually the freaking best display I've had, and text is incredibly sharp and big enough to read. Ever heard of UI scaling? All modern OSes can do it without noticeable issues, except maybe for some very old legacy apps.
 
It's not about gaining a faster framerate. It's about having a smooth experience. I've seen the complaints, especially in the CoD forums, because many players are capped at 91fps and their 144Hz displays had problems delivering a smooth experience because of that, while those with a 60Hz display didn't suffer problems.

Seen it in other game forums too but can't recall their names atm.
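For anyone wondering why that feels bad, here's a rough sketch (the 91fps cap and 144Hz refresh are just the numbers from those complaints; a G-Sync/FreeSync panel avoids the problem by refreshing when each frame is ready):

```python
import math

REFRESH_HZ = 144  # fixed-refresh display
FPS_CAP = 91      # engine frame cap from the complaints above

refresh_ms = 1000 / REFRESH_HZ
frame_ms = 1000 / FPS_CAP

# With v-sync on a fixed-refresh panel, a frame becomes visible at the first
# refresh after it finishes, so each frame stays on screen for a whole number
# of refresh intervals.
visible_at = [math.ceil(f * frame_ms / refresh_ms) * refresh_ms for f in range(10)]
on_screen = [round(b - a, 1) for a, b in zip(visible_at, visible_at[1:])]
print(on_screen)  # a mix of ~6.9 ms and ~13.9 ms holds -> perceptible judder
# A G-Sync / FreeSync panel refreshes when the frame is ready instead, so every
# frame is shown for ~11 ms and the pacing stays even.
```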
 
"It's difficult to read text on them unless you are uncomfortably close to the monitor"????? Have you never heard of UI scaling on Windows? It works great on my 4K 28" screen; no problems reading the text.
 
What?! There are good 40-inch 4K TVs. Those are also good monitors. I have both a 15-inch laptop with a 4K screen and a 28-inch 4K monitor... Text is fine for me, and noticeably smoother as well.

Are there even any 4K monitors out there that can go above 60Hz? I haven't seen one yet.
 
So when will the overpriced Ti be out? This 1180 is just a little better than the old one. I want to upgrade, but if it's going to be $1000, that thing had better have enough power to hold up with new game releases for at least 5+ years.
 
Yes, I'm another one who will have to respectfully disagree with you.

I've been on 1080p for what seems like a decade, and my recent move to 3440x1440 is, aside from adding an SSD, the most noticeable change to PC usage I have experienced.

And it is not just my eyes; non-PC addicts / master-racers have seen it, and they love it. At least one of them has gone out and purchased a 21:9 as well.

I recommend 21:9 to anyone.
 
lol what's wrong, are you broke? What kind of person upgrades an $800 card every year??? Might as well pay a subscription to Nvidia and rent the performance from the cloud lol.

Calling it right now: 1180 for $899 and 1180 Ti for $999.
 
Yeah, this is the reason it's generally better to go for a FreeSync (or G-Sync, if you play for the green team) display. Otherwise, you'll see stuttering and screen tearing even with a high refresh rate display. People's mileage will vary on non-variable-sync displays.

People tend to pick up a 144Hz display without this technology and get confused when they still have issues. I know, because I was one of them initially.

Of course, some have a better experience at normal 60Hz, others have a better experience at normal 144Hz. Everybody should have a better experience with G-Sync or FreeSync, though.

I'm not judging. You can use and enjoy whatever resolution floats your boat. I know a lot of people care more about resolution than they do 120+ FPS, there's nothing wrong with that. My problem is that switching to a higher resolution display after I'm already used to 120+ FPS would just be a direct downgrade, at least until better cards come out (Doesn't look like the 1180 is going to fit the bill).

Eventually I'd like to make the switch, of course. :)
 