> Despite its flaws, at the current stage anyway, only you seem to be excited over a dragging tech (for now). Reminds me of all the hoopla over Nvidia's "hairworks" back in the day.

Yes, 34 in total, 25 coming, only 9 are out, but that's not a handful either. Besides: Cyberpunk 2077, CoD: Cold War, Dying Light 2... please tell me some big releases that will not support RT.
> Despite its flaws, at the current stage anyway, only you seem to be excited over a dragging tech (for now). Reminds me of all the hoopla over Nvidia's "hairworks" back in the day.

Hairworks was a proprietary gimmick; RT is supported by both AMD and Nvidia and is an important feature of DirectX 12. RT does drop framerates by 30-40%, but it is playable in all games that support it on a 3080 @ 4K, albeit some drop to 30fps. At lower resolutions it's much better, and most people don't even play at 4K anyway. Calling it a tech demo is lame, but sure, improvements need to be made. An RT benchmark will be a good indicator of futureproofing, as it stresses the card a lot, and that was, actually, my point.
RT still knocks about 36% off the fps on a 3080 @ 4K.
Let's wait and see if it can actually take off. It might, or it might not. For now, people are looking for a fluid gaming experience at 4K. For you RT might be the ultimate option, but for most, it's not.
It'll take a few more generations of Nvidia cards for RT to settle in, if it's the thing at all, for both the high-end and budget cards. If RT is important, it will seep down to the budget cards sooner rather than later.
For now, it's just a tech demo.
And no, I wouldn't make a purchasing decision based on Cyberpunk. A title that I'm not waiting for anyway.
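To put the quoted percentage hits in concrete terms, here is a quick sketch of the arithmetic; the baseline framerates are hypothetical examples, not benchmark results, since the actual hit varies per game and scene:

```python
# Framerate remaining after a given percentage RT performance hit.
# Baseline fps values below are hypothetical, for illustration only.
def fps_with_rt(baseline_fps: float, drop_pct: float) -> float:
    """Return the fps left after losing drop_pct percent to ray tracing."""
    return baseline_fps * (1 - drop_pct / 100)

for baseline in (60, 90, 120):
    print(f"{baseline} fps -> {fps_with_rt(baseline, 36):.0f} fps with a 36% RT hit")
```

So a game that holds 60 fps at 4K without RT lands in the high 30s with a 36% hit, which is roughly the "some drop to 30fps" situation described above.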
> This is only definitely true in a pre-Smart Access Memory world. With SAM, PCIe bandwidth could have a VERY notable impact on the performance gain it can provide (because it provides a faster connection between the CPU and the GPU's VRAM).

That's a very good point, and definitely a factor to look for when the first benchmarks start rolling in.
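For context on why the link generation matters to SAM, the theoretical per-direction bandwidth of an x16 slot follows from the standard transfer rates (8 GT/s for Gen3, 16 GT/s for Gen4, both with 128b/130b line encoding). A quick sketch:

```python
# Theoretical per-direction bandwidth of a PCIe x16 slot.
# Both Gen3 and Gen4 use 128b/130b encoding; Gen4 doubles the transfer rate.
def pcie_x16_bandwidth_gbps(gt_per_s: float) -> float:
    """GB/s for 16 lanes at the given GT/s, after 128b/130b encoding overhead."""
    lanes = 16
    return gt_per_s * lanes * (128 / 130) / 8  # bits -> bytes

print(f"PCIe 3.0 x16: {pcie_x16_bandwidth_gbps(8):.2f} GB/s")   # ~15.75
print(f"PCIe 4.0 x16: {pcie_x16_bandwidth_gbps(16):.2f} GB/s")  # ~31.51
```

Gen4 doubles the pipe between the CPU and the GPU's VRAM, which is exactly the path SAM leans on.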
> RT does drop framerates by 30-40%, but it is playable for all games that support it on a 3080 @ 4K.

I'm not the gamer some of the rest of you here are, but doesn't essentially every option that increases image quality and/or realism lower frame rates? If frame rate were always the overarching goal, why aren't we all playing at 480p with solid-fill textures?
I'm with you. Calling raytracing no more than a gimmick is exceedingly myopic.
I also agree that RT tech is still kind of immature, given the big performance hit compared to what it offers. It's getting better, though. And calm down, man, nobody is calling it an obsolete tech; it just isn't yet practical in many of today's conditions. No one will want to activate RT in their competitive shooter, for example. You'd activate it only if your card can produce high enough FPS, and only in single-player. Even then, most people would prefer higher FPS for fluid gameplay.
And no, gaming will advance further beyond 4K and even 8K.
As for 4K, 8K, etc., I've seen some of that equipment. People don't buy PC monitors big enough to make rendering a game at 8K worthwhile. A 32" 4K panel is about 138 dpi and a 32" 8K is about 275 dpi. A young human with perfect vision (20/20 and no other eye issues) can resolve at most about 300 dpi at 8-10 inches. So an 8K display is likely the practical ceiling for PC gaming simply due to eye resolution. The more distant the display, the fewer pixels are needed. Sure, you could sit 10 inches from a 60-inch display, see the pixels, and so "need" a higher resolution, but who would want to play 10 inches from a display that large? So the larger the display gets, the further back you move, and the lower the dpi you need to maintain the illusion of not seeing pixels.
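The dpi figures above can be checked with simple geometry. A sketch, assuming standard 4K/8K resolutions, a 32-inch diagonal, and the usual ~1 arcminute-per-pixel limit for 20/20 vision:

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

def min_distance_in(panel_ppi: float) -> float:
    """Viewing distance (inches) beyond which 20/20 vision (~1 arcminute
    per pixel) can no longer resolve individual pixels."""
    # A pixel of size 1/ppi subtends 1 arcminute at distance d = (1/ppi) / tan(1').
    return (1 / panel_ppi) / math.tan(math.radians(1 / 60))

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    p = ppi(w, h, 32)
    print(f'32" {name}: {p:.0f} ppi, pixels invisible beyond ~{min_distance_in(p):.0f} in')
```

By this arithmetic a 32" 4K panel works out to ~138 ppi (not 145) and a 32" 8K panel to ~275 ppi, and the 8K pixels disappear at a bit over a foot away, which lines up with the "300 dpi at 8-10 inches" rule of thumb.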
I'm glad you mentioned dpi vs. size. It's all about optimal viewing distance and viewable area without having to pan.
A 40in 4K monitor at 18in away is not a very pleasurable viewing experience.
On an AMD x370 chipset motherboard with PCIe 3.0 slots, would it be pointless to install a PCIe 4.0 generation video card?
If RT sucks on AMD cards, that will leave a big question mark despite their (at this time alleged) top performance. Being quiet about it is not a good sign. We'll see.
You're probably the only person that gives the slightest **** about that game.
And you can still get one for over $1200!
Ray tracing is such a niche feature that it doesn't matter to me. Those who own 3080 cards have stated that they are turning ray tracing off to get better frame rates, so it is still a work in progress. I think we are two generations away from having graphics cards that can run games with that feature turned on full-time.