Gears 5 at 8K resolution brings the $2,500 Nvidia Titan RTX to its knees

4K gaming is hardly a thing yet; let's pump the brakes.
If you're looking for long-term improvement, it's worth seeing how far off this stuff is, and you can't do that without running benchmarks. So sure, it may not be usable now, but you don't get anywhere by refusing to even look at it and work toward getting there.
 
8K is fine for browsing, desktop publishing, photography, and video playback; 4K and lower for interactive programs. Until they figure out how to keep Moore's law going, this is it for a while. Honestly, IMO, 4K is as good as it ever needs to be, even for the uses I just cited for 8K. My eyes will likely never enjoy the virtues of 8K before I pass on.
 
I think the article conveys some useful information. It shows that 8K won't be viable for competitive gamers until graphics cards more than double their current performance. This may take five years instead of just two or three.
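To put rough numbers on that pace (a back-of-envelope sketch of my own; the ~30% yearly gain is an assumption, not a figure from the article):

```python
# Back-of-envelope sketch (my own numbers, not the article's): how many years of
# typical GPU gains it takes to double or quadruple performance, assuming ~30%
# average improvement per year.

pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
print(f"8K pushes {pixels_8k / pixels_4k:.0f}x the pixels of 4K")

def years_to_reach(factor, yearly_gain=0.30):
    years, perf = 0, 1.0
    while perf < factor:
        perf *= 1 + yearly_gain
        years += 1
    return years

print("Years to 2x GPU performance:", years_to_reach(2))  # ~3 with these assumptions
print("Years to 4x GPU performance:", years_to_reach(4))  # ~6 with these assumptions
```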
 
One day Microsoft/Sony devs will tell us that gaming at 24 fps in 8K resolution is so cinematic. I'm glad we don't have to wait for the PS6, lol.
 
8K gaming? Honestly, wouldn't it take a screen measured in square yards (or meters) to need that kind of detail? I must be missing some basic idea - or is the industry just intent on the personal cineplex as the next necessary thing?

Yeah, they're just trying to find the next way to hype people into paying for something that has pretty much 0 practical application in an average consumer scenario.
 
4K gaming is hardly a thing yet; let's pump the brakes.
4K gaming has already been a thing for four years.
The media just spews crap about 4K gaming all the time for whatever reason... people act like games can only be played at ultra settings with every single option maxed out, when the reality is that "ultra" is a set of settings built into games to sell hardware, often providing less than 10% more visual fidelity at the cost of 50-1000% more GPU resources.
8K gaming? Honestly, wouldn't it take a screen measured in square yards (or meters) to need that kind of detail? I must be missing some basic idea - or is the industry just intent on the personal cineplex as the next necessary thing?

Yeah, they're just trying to find the next way to hype people into paying for something that has pretty much 0 practical application in an average consumer scenario.

Funny, because over the last few years they've actually been pushing 1080p and 1440p monitors, which make companies a lot more profit, rather than just coming out and saying that 4K gaming is a thing.

Honestly, I've been playing games ONLY in 4K since 2016, and I don't own a flagship card.

I used to play League of Legends, Warframe, TrackMania, World of Warships, Heroes of the Storm, and Overwatch in 4K at high settings with an RX 470.
Today I have a reference Vega 56 that I got on launch day in 2017; it's flashed with a Vega 64 BIOS and undervolted for more performance.

Honestly, I can't think of a game other than Prey that doesn't get at least 60 FPS in 4K on high settings or better.
I just fired up Forza Horizon 4 last night for the first time... turning off v-sync has me averaging around 78 fps, with peaks around 110 and minimums around 63. This is with the high preset.

I play Overwatch (ultra), Heroes of the Storm (ultra), Warframe (ultra), DiRT 4 (high, EQAA), DiRT Rally (high, EQAA), CS:GO (max settings), Cities: Skylines (high/ultra), Majesty 2 (max settings), Quake Champions (high, 85% render scale), SoulCalibur VI (max), and a few other games from time to time, and get at least 60 fps in each title.
The top titles all get a minimum of 100 FPS; from Cities: Skylines down the list, the FPS is in the 75-95 range; SoulCalibur is locked at 60 fps; and FH4 was covered above.

Just remember that these FPS figures are with a Vega 56. Today I have a 3900X, but the average FPS was basically the same with an overclocked 1700X (minimums improved with the 3900X). And I have literally three 4K monitors, so this is also while driving two other 4K desktops (with various apps scattered across them) at the same time.
 
I think the article conveys some useful information. It shows that 8K won't be viable for competitive gamers until graphics cards more than double their current performance. This may take five years instead of just two or three.
This depends on what game you want to play...
 
4K gaming has already been a thing for four years.

The media just spews crap about 4K gaming all the time for whatever reason... people act like games can only be played at ultra settings with every single option maxed out, when the reality is that "ultra" is a set of settings built into games to sell hardware, often providing less than 10% more visual fidelity at the cost of 50-1000% more GPU resources.

I know; I've owned 4K displays for almost two years now. My first setup was an i7-3770 + GTX 970, now an i7-9700K + RTX 2070. Much of it is game dependent, and I know turning down settings before cutting resolution is obviously the way to go. But in my experience with a variety of games, 4K gaming is still far from mainstream due to graphics card limitations.

If I can't play all games smoothly at 4K with reasonably high settings on a $600 GPU, we have a ways to go.
 
2560x1440 at a v-synced 60+ FPS with ultra settings is good enough for me, given present GPU and monitor hardware.

A $500 monitor and a $500 GPU will do the business there. Not cheap, but not extravagant either.

Much above this (i.e. 4K, 60+ FPS, ultra) it's a $2,000 monitor with a $1,000 GPU, and the result is still barely satisfactory.

I have a 144 Hz G-Sync 2560x1440 monitor and a 1080 Ti, both within the price points you set, and most games run at 90-120 fps on ultra (and don't turn on v-sync; FreeSync/G-Sync is vastly better).

But I agree that 8K is silly. The new consoles won't output graphics any better than a 2070 at best, and a 2070 isn't adequate for 60 fps 4K gaming by PC standards. They will simply checkerboard and upscale like they always do and call it "4K" or "8K". Hopefully 120 Hz+ TVs really start to catch on. The biggest improvement for console gaming would be a few more FPS at the minimum, not more pixels.
 
Personally, I'm skipping 8K altogether; it's just an interim step to the true holy grail, 16K gaming (ultra settings). I may have to upgrade my GTX 960 (4 GB, so future-proofed) and 2500K. Oh, and all that on a total TDP of <25 W, for Greta.
BTW, how are you connecting to this display - HDMI, DP (x2)? If the cables cost more than £29 then... WTH.
 
The RTX 2080 Ti is 0.001% of the GPU base.

As such, nobody cares about a $1,400 card. It doesn't matter who - even people who can afford it, or already own one. It is simply too limited and too overpriced to matter in any way, shape, or form. The consolation is that the 2080 Ti is the only card out there for the high end.

No point in bragging about a mythical card that maybe 13k people worldwide own. Almost laughable, actually...
 
8K gaming? Honestly, wouldn't it take a screen measured in square yards (or meters) to need that kind of detail? I must be missing some basic idea - or is the industry just intent on the personal cineplex as the next necessary thing?

Exactly so. When you're shooting at someone, or being chased by someone, you don't even notice the details at 1080p, let alone higher. That kind of resolution is for people who want to look at full-size wallpapers on wall-sized panels. Not for gaming.
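For what it's worth, here's the back-of-envelope pixels-per-degree math behind that point (the screen size, viewing distance, and ~60 ppd acuity figure are illustrative assumptions of mine, not numbers from this thread):

```python
import math

# Rough sketch: pixels per degree delivered by a 65" 16:9 screen (~1.44 m wide)
# viewed from 2.5 m, versus ~60 ppd, the usual rule of thumb for 20/20 acuity
# (one pixel per arcminute). All example numbers are assumptions.

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(name, round(pixels_per_degree(px, 1.44, 2.5)), "ppd")
# -> roughly 60, 120, and 240 ppd: even 4K already sits past the acuity limit here.
```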
 
8K does nothing for consumers; it is only a marketing ploy. Comcast doesn't even broadcast in 8K and upcharges people for 1080p. Most (if not all) of Comcast's content is 720p upscaled to HD. (FWIW: OTA TV is much cleaner, sharper, and better... and it is free. Yet Comcast charges a $10 premium for high-def.)

8K in gaming is going nowhere and is an utter joke. Any time you hear anyone mention 8K, understand it is because they don't want you to know how weak 4K gaming is.

Gaming means 144 Hz+.
 
8K does nothing for consumers; it is only a marketing ploy. Comcast doesn't even broadcast in 8K and upcharges people for 1080p. Most (if not all) of Comcast's content is 720p upscaled to HD. (FWIW: OTA TV is much cleaner, sharper, and better... and it is free. Yet Comcast charges a $10 premium for high-def.)

8K in gaming is going nowhere and is an utter joke. Any time you hear anyone mention 8K, understand it is because they don't want you to know how weak 4K gaming is.

Gaming means 144 Hz+.
Wow, you're actually using Comcast cable as a counterexample for 8K content. Good for you!
 
Wow, you're actually using Comcast cable as a counterexample for 8K content. Good for you!

No, I am illustrating that there is nothing that produces 8K, so why bother jizzing yourself over 8K...??

I am also reiterating that "gaming" means faster than the TV standard of 60 Hz... so what "GAMING" 8K monitor do you know of that does 120 Hz or 144 Hz? If it's not faster than TV standards, then it is not a "gaming" monitor.

We don't even have proper 4K gaming monitors yet, so anything 8K is just marketing fluff for headlines.
 
The output resolution might well be 8K, but the internal resolution absolutely won't be for the vast majority of "8K" games. It'll be handled the way the Xbox One and PS4 handle 1080p, or the Xbox One X and PS4 Pro handle 4K: a combination of dynamic output resolution and a lower internal resolution.
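For anyone curious, a minimal sketch of what that dynamic-resolution loop amounts to in principle (illustrative only; real engines differ in the details):

```python
# Minimal sketch of the dynamic-resolution idea (illustrative, not how any
# particular console or engine actually implements it): shrink the internal render
# scale when frames run long, grow it when there's headroom, and always upscale the
# result to the fixed output resolution that gets marketed as "4K" or "8K".

TARGET_MS = 1000 / 60            # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # e.g. never render below half the output resolution

def adjust_render_scale(scale, last_frame_ms):
    if last_frame_ms > TARGET_MS * 1.05:     # over budget: render fewer pixels
        scale *= 0.95
    elif last_frame_ms < TARGET_MS * 0.85:   # plenty of headroom: claw detail back
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# internal = (int(7680 * scale), int(4320 * scale)), then upscale to 7680x4320 output
```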

Edit:

The likes of Nvidia use 3 different chip designs for the RTX models:

TU102 - 2080 Ti, Titan RTX
TU104 - 2070 Super, 2080, 2080 Super
TU106 - 2060, 2060 Super, 2070

Plus all the relevant mobile and Quadro models too.

One wafer can only support one chip design, so a TU106 wafer will be used to generate processors for the models listed above and nothing else. Since the TU106 has a die area of 445 mm² compared to 745 mm² for the TU102, same-sized wafers (typically 300 mm in diameter) will yield more chips for the former than for the latter. So a TU106 wafer will actually generate more profit than a TU102 wafer will, despite the markup on the final graphics card, especially as the 102 is only used in a total of 4 models compared to 8 for the 106.
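As a rough illustration of that yield difference (a standard dies-per-wafer approximation applied to the die areas quoted above; it ignores defect rates and edge exclusion, so treat the outputs as ballpark figures):

```python
import math

# Rough dies-per-wafer estimate for the figures quoted above, using the common
# approximation pi*r^2/area - pi*d/sqrt(2*area). Ignores defect yield and edge
# exclusion; illustrative only.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    gross = math.pi * r**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

print("TU106 (445 mm^2):", dies_per_wafer(445))  # ~127 candidate dies
print("TU102 (745 mm^2):", dies_per_wafer(745))  # ~70 candidate dies
```

On those numbers, a 300 mm wafer gives roughly 1.8x as many TU106 candidates as TU102 ones, before any yield losses.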

Thank you for clarifying. I see things have changed then. I recall back in the day all GPUs were binned from the same wafer, but Nvidia has changed the way they do things if they have 3 tiers of chip designs.
 
You can game at 30 Hz or 60 Hz if you've got a Titan in the system, and AMD's Radeon VII can't run that high even with 16 GB of onboard memory. Let's see if the PCIe 4.0 RX 5700 XT can manage a decent 8 fps at it. A test like this ought to try every card from 1 GB to 48 GB.
Older PCs can still hang on at 1080p, and a little more. The problem is this:
If you try to run 4K-8K on an unsupported GPU/CPU you get a green screen and lag, and no movie or game will show on that screen at all. A Celeron, old Pentium, and the like simply don't have the speed - like testing MP3 playback on a 60 MHz Pentium back in the day, where it ran like a slideshow. The BIOS must support certain features, and you need HEVC support. Of course a 1 GB card lags a lot, because it was meant to run 1920x1080 and lower.
A 2 GB card can take on a bit more, but still lags at 1K-2K. If you get a 4-8 GB card you can run 8K YouTube and other stuff.
Once you get into the Titan/Ti class of GPU with 11-48 GB of VRAM, you can run it pretty well.

Problems like "I can't run that" get sorted by first testing the new PC against lower-resolution PCs. The bottleneck will be poor FPS, and overheating in some cases.
In the old days we ran on far less PC power but still got good FPS. Look at Shadow Man - heavily demanding at the time it came out, yet with today's 1080p-4K texture support it would not be that demanding to play; the system requirements were simply set too high. I played it easily on a 60 MHz Pentium with 128 MB of RAM, 128 MB of VRAM, and a good PCI sound card, and it ran nicely.
So as for saying you can't run 8K on a lesser PC: you can easily crank the settings up on a low-end PC, but the FPS will still be bad. Bottlenecks are what stop you from squeezing the last FPS out of a 1999-2004 PC.

Many other things will drag your FPS down, RAM speed among them:
500 MHz-era DDR1/DDR2 takes ages just to get going;
DDR3/DDR4 is much nicer to deal with.

Windows 3.1 can't run 8K, and Windows 95/98 SE - forget it, the hardware is too old.

Windows XP is limited to DX9.0c (DX10.1 with a patch).

Vista supports DX11 (11.1) and gives decent FPS with the latest DX11-12 GPUs.

Windows 7-10 give good FPS, but the latest BIOS must be in place for it. Expensive.
So you buy a decent PC for the kids, or just a netbook or Android device for gaming, and they get your old PC to try out.

And since what you're playing is PEGI 18+, you set restrictions on what they get to play - 3+, 16+, and only then 18+. You give them good-FPS PCs (or from around 15-18 they buy their own); let them earn money every week.
Even an 18-year-old needs a little support and friendship from dad and mum.
Take them away from the screen for a little while, and set the wireless LAN on a timer to turn off, with at least a 10-minute warning - a bedtime warning - or just give them an audiobook CD. Some people are so tired they just go to bed after work.

The upcoming Tokyo Olympics are one reason for 8K, but the feed would be downscaled to 1K in other countries, or the cameras would deliver only 1080p to TV watchers. With a PC, though, you could watch it in full 8K at 60 Hz.

Now, if you really want 8K, it is very expensive to get. First, you must have a good internet line - 30-100 Mb, or a 300 Mb-1 Gb line. In Japan they have to use a satellite dish to get 8K down; if Europe is to get it, people would need a dish, an 8K TV, and a decent PC just to watch 8K streaming. The first 8K streaming on YouTube was said to be too heavy for its time - you got green-screen lag and grey screens, because no codec could render that much information. So go find an 8K video on YouTube, set it to 8K, and see if you can run it. It's no shame if your PC from 1999 can't run it at all; the GPU, CPU, and RAM are what will let future PCs and laptops watch 8K without problems. 8K is about 33 megapixels per frame.

Links: https://www.amazon.com/dp/B07SM38P2G/ref=psdc_6459737011_t1_B07WRMR92Z?th=1 and www.altibox.no for 300-500 Mb and 1-10 Gb lines; a 10 Gb line runs about 149,000 NOK per month (fibre only, plus VAT), and it would take months before there is much to watch on it. 8K gaming would be nice, as would 8K YouTube at 300-500 Mb for watching movies, and Netflix at 4K-8K further down the road. 4K is nice too, but try before you buy a 1st-4th gen 8K TV or monitor, and get a screen faster than 30 Hz - 60-165 Hz - to play games like a pro. Then the other problem comes up: everything on the desktop is too small. Put a screen that only runs 1080p beside it, then try 2K, 4K, 5K-8K and see if it lags. Take the parts out and blow them clean with air. Now put it back together and fire up GoW 5. Now you can play every game in 8K. Congrats.
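If anyone wants to sanity-check the bandwidth side of this, a rough estimate (mine, not broadcast or YouTube specs):

```python
# Quick sanity check on 8K streaming bandwidth (my own estimate): raw 8K60
# 10-bit video versus a ballpark compressed bitrate.

width, height, fps, bit_depth, channels = 7680, 4320, 60, 10, 3
raw_bps = width * height * fps * bit_depth * channels
print(f"Uncompressed 8K60: {raw_bps / 1e9:.1f} Gbit/s")       # ~59.7 Gbit/s

# Assuming roughly 0.05 bits per pixel after HEVC/AV1-style compression (a guess,
# not a spec), the stream lands around 100 Mbit/s - hence the need for a fast line.
compressed_bps = width * height * fps * 0.05
print(f"Compressed estimate: {compressed_bps / 1e6:.0f} Mbit/s")
```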
 