The RTX 3080 hits 100+ fps in many top titles at 4K with max settings and RTX on

barely... at medium settings

But now the real question is: can it run FS2020?
No, because it's a new game that was released broken, as DX11 for some odd reason. I did a NY flight and it was reserving 17 GB of VRAM, while other places are under 3 GB, which is odd.
 
Nothing stops AIB partners from releasing beefier cards with more RAM than the reference cards. It has happened before, so why not now? It's another reason to charge an arm and a leg for their bespoke designs. If people were willing to spend so much on 2080 Ti cards, that gives them plenty of room to position their cards above the Founders Edition pricing; as long as they are cheaper than the 3090, people will buy them.
 
The 3080 does not have NVLink; Nvidia did this because of the huge price jump from the 3080 to the 3090.

Super sad, because some folks like adding a second card once the next generation comes out and prices drop, to extend the life of their system.

SLI requires game-dev support though, as far as I know, and the number of upcoming games that will support it is basically zero. Nvidia quietly killed SLI some time ago; NVLink is really only used for compute and rendering.
 
So the graphics card is not even released yet, but stocks are already "limited", so you should hurry up and pre-order one. Yeah, right. Then the online stores will add $100+ to the MSRP. And the old shills will copy/paste the same post about how they always buy the card at MSRP or less. Same BS story as before.
 
Man, B3 was a performance nightmare at launch, and it took them many months to iron out even the major kinks of frame-rate drops and game crashes.

I didn't even purchase the game after watching the reviews... terrible coding all around.
Just tell them to reduce the x120 tessellation rate on underwater and other hidden geometry.
 
I've been wondering whether 8-10 GB is enough VRAM for 4K. My 2080 can use most of its 8 GB of memory at 1440p. But game engines can use the extra memory they don't need to cache things, so it's hard to tell how much is actually needed. If much bigger capacity cards are coming, say, next year, then that's a good enough excuse not to upgrade my 2080 now. Not that I really need to; at 1440p 144Hz my 2080 handles games well enough. But as seems to be the case whenever I buy a new graphics card now, I need a monitor too. So whilst 4K cards may be available at reasonable prices, reasonably priced 4K 144Hz monitors are not. I think I'll wait until I can get both a 4K 144Hz monitor and a GPU of 3080-level performance or more to power it for less than £2000 combined before I pull the trigger on 4K.
That's not the way you're meant to play. You're supposed to just buy it! Because it just works, and the more you buy, the more you save!
 
I really don't understand the need for that sub-heading. For starters, the 3080 is $699, which is definitely more expensive than both consoles. Second, it's just a GPU. Sure, recent builders can just pop the new GPU into their cases and that's that. However, on this very site, there was a survey article showing that the majority of PC gamers upgrade only every 5-7 years. That means a complete new build, and now you need another $700 minimum on top of the GPU to get decent performance. For nearly 3x the price of a new console once the rest of the build is counted, it had better blow them out of the water.
 
I really don't understand the need for that sub-heading. ...
The title was bait designed to draw in comments exactly like this.
 
I've been wondering whether 8-10 GB is enough VRAM for 4K. ...
What you're not taking into account is memory compression, as well as the speed of the memory itself.

As long as they have really fast memory with really good compression, the size won't matter nearly as much. Maybe in 2-4 years 10 GB won't be enough, but they built these cards with input from devs on what they want to do moving forward. Game development takes a long while, so anything coming out in the next two years should fit within the abilities of cards built with full knowledge of what devs were doing and asking for.
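
To put rough numbers on that idea, here's a minimal back-of-the-envelope sketch; the raw bandwidth figure is the 3080's published spec, but the compression ratios are purely illustrative assumptions (Nvidia doesn't publish an average figure, and lossless compression mainly stretches bandwidth rather than capacity):

```python
# Illustrative sketch: how effective bandwidth scales with average lossless
# compression. The 760 GB/s raw figure is the RTX 3080's published spec
# (320-bit bus, 19 Gbps GDDR6X); the compression ratios are assumptions.

def effective_bandwidth(raw_gbps: float, compression_ratio: float) -> float:
    """Effective bandwidth if data compresses by the given average ratio."""
    return raw_gbps * compression_ratio

RAW = 760.0  # GB/s, RTX 3080 reference
for ratio in (1.0, 1.2, 1.4):  # hypothetical average compression ratios
    print(f"{ratio:.1f}x compression -> ~{effective_bandwidth(RAW, ratio):.0f} GB/s effective")
```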

Not to mention that I really, truly think they are betting big on tech like DLSS blowing up, and with it, resolution along with memory size won't really matter anymore.

I can almost confidently say they've got something planned so that no game will need more VRAM than what they put in, at least not in the next couple of years.

DLSS everywhere is their next big breakthrough, and I believe a lot of these cards' secret sauce will come down to leveraging it, so the crazy results we currently see in DLSS 2.0 titles can be had anywhere.

It's not ready yet, but again, I don't think it's necessary anyway, not for a couple more years.
 
All tests were done with DLSS enabled.
As long as it's on in both scenarios I don't have a problem, but it can't be used to compare against older cards or AMD.

Which I know they aren't doing, but it just feels like it needs to be stated again.

You can obviously run a DLSS vs. no-DLSS test; it just needs to be spelled out, and Nvidia vs. the competition needs both a with-DLSS and a without-DLSS bench to paint the full picture.
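
For what it's worth, the testing matrix being described is easy to pin down. Here's a minimal sketch (the card and game lists are placeholders, not a claim about any reviewer's actual suite): every comparison gets a native-resolution baseline, and DLSS runs are reported as a separate series only on cards that support it.

```python
# Sketch of the benchmark grid the comment argues for: a native baseline
# for every card/game pair, plus a separately labelled DLSS series on the
# cards that support it. The card and game lists are placeholders.

from itertools import product

cards = ["RTX 3080", "RTX 2080 Ti", "RX 5700 XT"]   # hypothetical lineup
games = ["Control", "Death Stranding"]               # hypothetical titles
dlss_capable = {"RTX 3080", "RTX 2080 Ti"}           # AMD has no DLSS

runs = []
for card, game in product(cards, games):
    runs.append((card, game, "native"))              # apples-to-apples baseline
    if card in dlss_capable:
        runs.append((card, game, "DLSS"))            # reported separately

for run in runs:
    print(run)
```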

No matter how good it may be, there are those out there who won't accept it no matter what.

Reminds me a lot of the people who called early science demos witchcraft.
 
No, because it's a new game that was released broken, as DX11 for some odd reason. I did a NY flight and it was reserving 17 GB of VRAM, while other places are under 3 GB, which is odd.
I have a feeling that when it's run on cards without massive amounts of RAM, AMD's included, the difference between them will scale with their actual performance advantage, and memory won't have any bearing, except that the card with more memory will show the game using more.

A bench in MSFS 2020 between a 10 GB 3080 and a 24 GB 3090 should still end up with around a 20-30% difference.

If there were some major memory limitation, we should see the 3080's performance tank relative to the 3090.

If we see it coming in at half the performance or less, then yeah, there might be a problem.

We shall find out soon enough.
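
As a rough sanity check on that 20-30% figure, the expected gap follows straight from the public launch specs (a sketch, assuming performance tracks shader count and bandwidth rather than VRAM size):

```python
# Rough sanity check using public launch specs: if MSFS 2020 isn't actually
# VRAM-limited on the 3080, the 3090's lead should track its extra shader
# hardware and bandwidth, not its extra 14 GB of memory.

specs = {
    # name:      (CUDA cores, memory bandwidth in GB/s)
    "RTX 3080": (8704, 760),
    "RTX 3090": (10496, 936),
}

core_gain = specs["RTX 3090"][0] / specs["RTX 3080"][0] - 1
bw_gain   = specs["RTX 3090"][1] / specs["RTX 3080"][1] - 1
print(f"shader advantage:    {core_gain:.0%}")  # ~21%
print(f"bandwidth advantage: {bw_gain:.0%}")    # ~23%
# A gap far beyond ~20-25% would hint at a 10 GB VRAM wall instead.
```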
 
Multi-GPU has been dead since Peterson left.
The only reason a 3090 has a bridge is that it's technically a Titan made for content creators; there is zero reason for needing a second one; the bandwidth on PCIe 4 means the scaling would be garbage for the 3000 series.

Your comment doesn't seem to make sense. Are you saying (a) a bridge wouldn't be needed, so they could SLI without one, or (b) the bandwidth on PCIe 4.0 is so bad that scaling would be awful with the 3000 series? (Definitely not the case on an x16 slot.)

Your post makes no sense if it's the former, and if it's the latter it's a fairly dumb/contradictory statement, especially when the new consoles (and some Nvidia tech) pull data directly to the GPU from PCIe 4.0 SSDs.
 
No, because it's a new game that was released broken, as DX11 for some odd reason. I did a NY flight and it was reserving 17 GB of VRAM, while other places are under 3 GB, which is odd.
That's because they released it when the marketing schedule dictated, not when the game was ready.
 
I'm more worried about how it will run titles designed for next-gen consoles.

But since it's likely we'll be on the 4000 cards once the cross-gen titles get sloughed off the release schedule, it's probably a moot point anyway.
 
I've been wondering whether 8-10 GB is enough VRAM for 4K. My 2080 can use most of its 8 GB of memory at 1440p. But game engines can use the extra memory they don't need to cache things, so it's hard to tell how much is actually needed.
Even if it's only used as a cache, you will still see a performance hit when the cache is full.

I would definitely wait for a bigger-memory card to appear if buying for 1440p or above. For high-refresh 1080p it should be bulletproof.
 
I've been wondering whether 8-10 GB is enough VRAM for 4K. ...
A moment of silence for the 2080 beta testers who paid a premium for what the 3080 now delivers at less than half the price.
 
A moment of silence for the 2080 beta testers who paid a premium for what the 3080 now delivers at less than half the price.
You know, I bought my 2080 for £500 in April because of the pandemic. I was fully aware that Nvidia were going to release new GPUs this year. That was nearly 6 months ago. I feel sorry for anyone who needed a graphics card for the lockdown who waited until now to get one - 5 months of staying at home with no decent GPU? No thanks. I don't feel bad; I normally travel a lot in any given year and I've already saved about £1800 in flights, so I could easily afford a new card. I reckon I could sell my 2080 for £300 as well, so it really wouldn't cost much to get a 3080; if I sold the 2080 for £300, the outlay would be lower than the cost of a 2080 Ti back in April.

But at the end of the day the 2080 runs any game I want at pretty much max settings on my 1440p 144Hz monitor. I don't need to upgrade; I really wouldn't get much out of it, and if I did, I'd just feel the CPU bottleneck more. That said, £650 for a 3080 isn't really a huge amount of money, and that doesn't mean I don't want an upgrade; what enthusiast doesn't? But with a 2080 at 1440p I can comfortably wait until next year's cards come out, or even the year after. Also, with my current monitor, a new CPU would give me more than a new GPU.
 
I'm kinda bummed they always reference the 4K specs without also mentioning 1440p or 1080p. What's the point of 1080p 360Hz monitors if Nvidia doesn't mention how fast games can run on the card?

That said, I’m incredibly excited about the 3080, but a bit stressed about getting one before next year with the projected scarcity.
 
I'm kinda bummed they always reference the 4K specs without also mentioning 1440p or 1080p. What's the point of 1080p 360Hz monitors if Nvidia doesn't mention how fast games can run on the card?

That said, I'm incredibly excited about the 3080, but a bit stressed about getting one before next year with the projected scarcity.
They always mention the 4K specs because it's good marketing. It's where the progress is biggest, because the last-gen cards have much lower bandwidth. If you want to know the gains at lower resolutions, wait for proper independent reviews on less hand-picked games. There will still be gains, just less impressive than at 4K. It's as the leaks suggested: a solid 40-50% improvement overall.

Don't worry about the scarcity; big companies want your money. Exercise patience and you will overpay less.
 
You know, I bought my 2080 for £500 in April because of the pandemic. ...
RIP 2080 buyers KEKW
 
I've been wondering whether 8-10 GB is enough VRAM for 4K. My 2080 can use most of its 8 GB of memory at 1440p. But game engines can use the extra memory they don't need to cache things, so it's hard to tell how much is actually needed. ...

Gamers Nexus did a really good piece on that. Long story short, lots of games will allocate 100% of GPU memory but only actually use 4 to 6 GB of it at 4K. The only exception to this is Flight Sim 2020 (that I know of).
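
This is also why overlay and monitoring numbers can mislead: they read what the driver reports as allocated, not what a frame actually touches. A minimal sketch (assuming an Nvidia GPU and the pynvml package) of how those tools get their numbers:

```python
# Minimal sketch of how monitoring tools read VRAM numbers via NVML
# (requires an Nvidia GPU and the pynvml package). Note that NVML reports
# memory the driver has *allocated*; a game that grabs its whole budget
# as cache therefore looks like it "needs" all of it.

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"total:            {mem.total / 1024**3:.1f} GiB")
print(f"used (allocated): {mem.used  / 1024**3:.1f} GiB")
print(f"free:             {mem.free  / 1024**3:.1f} GiB")

pynvml.nvmlShutdown()
```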
 
I know it's all about gaming and enthusiasts, but:

- Most gaming monitors are still 1080p or 1440p, and the latter keeps growing; UHD monitors are still very, very few, small, and incredibly expensive for what they offer. The very few that are 42-inch and bigger are either OLED TVs (not monitors, and very susceptible to burn-in) or suffer from other major issues, and they are out of the question given their price-to-performance ratio. I'd rather get the most expensive 1440p/UHD projector for the money than a 65-85-inch screen that doesn't even provide an sRGB mode.

- GPUs are components used for many more things than simply gaming, even in a home entertainment environment. 40 more frames matter much more below 60 fps than above 100, and once again, UHD (not the advertised "4K") matters only on paper at the moment, as high-refresh UHD monitors are extremely limited in number, expensive, and will stay that way for the next few years. For the rest, in other words TV gaming, you need a PS or an Xbox.

And for 90% of users, a 1000-series card (such as the 1080 Ti) or anything above a 2070 will work just fine for at least the next 2-3 years on their 1440p 144Hz monitor, for rendering at home, or any other non-professional needs they may have. Ray tracing and most of the bells and whistles currently advertised are pure marketing with very little benefit to the end user. Once we see better software development support, this may change.

And once again, a 30-40% UHD (not "4K") frame-rate increase with better RT and DLSS support is good, but it is not crucial in terms of real end-user benefit, especially coming from a 2070 or better, or even a 1080 Ti, unless you only do competitive gaming or don't know where to spend your (parents') money...
 