I am not speaking about stingy people, obviously. To them everything is expensive.
Okay, but save some money and go with a 3070 Ti. But wait, if I save some more I can get a 3080, and if I save just a bit more I can get a 3090. And if I'm going to get a 3090, a 3090 Ti isn't that much more... Make a budget and stick with it or you'll end up broke.
For example, you buy a 3060 and go crazy with DLSS 2, with its ghosting and so on. Better to save some money and buy a 3070, and have more real native FPS instead of that AI invention. OK, hopefully you understand me better now.
I disagree. I could buy a Mercedes and a 4090 if I wanted to, but there is a mindset to building wealth. Just because you have money doesn't mean you have money to spend.
Well said.
To me, it is about matching your GPU to your monitor. Why set an artificial amount instead of an actual goal or need?
Why spend $1500 on a monitor and have a $500 GPU, or vice versa?
I use a 4K, 65-inch TV as a monitor. I won't post numbers, but I spent way more than $1500 on it. I bought it with the idea of "future proofing" in mind: I'll have this display for 10 years; video cards are replaceable.
Right, that is why I said you have to set a goal or have a need.
Using DLSS and FSR in Quality mode but XeSS in Balanced mode in the first screenshot is borderline dishonest. Sorry, but I don't think it's fair: even if it doesn't matter to most informed people, it misleads a good part of your audience, and it's because I have great respect for your website that I take the time to tell you this. Even though you only need to scroll down to see the full test, it's still the "front page" image, and a lot of people don't bother scrolling down or reading the article, and you know it. I would have understood matching on FPS for a multiplayer title, but for what is primarily a single-player game, I think image quality comes first. That's only my humble opinion, though.
The main reason you enable one of these upscalers is to get a performance gain. What's the point of XeSS Quality mode looking as good as DLSS's or FSR's if it's going to be slower? When they perform the same, then Quality could be compared to Quality.
Like I said, I'm with you for a multiplayer game; in a single-player game, graphics quality comes first (as long as I get at least 60 FPS), but that's a personal thing.
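For readers following this exchange, the disagreement is really about whether to match upscaler modes by name or by performance cost. Each mode renders internally at a fraction of the output resolution before upscaling; here is a minimal sketch using the commonly cited per-axis scale factors (my assumption, not figures taken from the article):

```python
# Internal render resolution behind each upscaler mode at 4K output.
# The scale factors are the commonly cited per-axis values (assumed
# here, not taken from the article being discussed):
# DLSS/FSR/XeSS Quality = 1.5x, XeSS Balanced = 1.7x.
out_w, out_h = 3840, 2160

modes = {
    "DLSS/FSR Quality (1.5x)": 1.5,
    "XeSS Quality (1.5x)": 1.5,
    "XeSS Balanced (1.7x)": 1.7,
}

for name, factor in modes.items():
    w, h = round(out_w / factor), round(out_h / factor)
    share = 100 / factor**2  # fraction of output pixels actually rendered
    print(f"{name}: {w}x{h} (~{share:.0f}% of output pixels)")
```

Under these assumed factors, XeSS Balanced starts from roughly 35% of the output pixels versus 44% for DLSS/FSR Quality, which is why it can keep pace in FPS from a lower internal resolution; whether pairing the modes that way is fair is exactly the judgment call argued above.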
The thing is, how much is too much (even if you can afford it)?
Monitors and TVs are much cheaper than before, while GPU prices went up because of mining and stayed there, artificially. If CPU makers jump on the same train, you'll soon need to spend $1500 on each part ($1500 CPU + $1500 GPU + $1500 screen, ...).
People who set goals instead of a price are exactly what companies love:
- before, tech evolved so that for the same $500 you would get, 4 years later, twice the performance (this is just an example); likewise, I paid the same for my first 1080p LCD as I did recently for my first 4K QD-OLED 120 Hz;
- now (since mining and scalpers arrived a couple of years ago) they give you 100% more performance for a huge price increase.
So "setting goals" whatever the price is what lets companies keep raising prices.
We and you are talking about games, so "need" sends the wrong message. There is no "need" when playing games, only desire or lust.
How much is too much...? It comes down to one thing: how much performance you need. [...] So you BUILD your machine for that purpose... You have stated you play on a TV, so even a Vega 64 can play 4K games on a cheap 4K TV.
So, if you are buying hardware because you work with 3D, CAD, video editing, etc., then you need it. Gaming (unless you make or play games for a living) is not something people need.
I understand your point, but my view is that gaming used to be within reach for most people (two or three average European minimum wages for a gaming rig that would last several years); now it's only for some, or you need credit for it...
Some people buy a car just to get to work... some buy a second car to race on the weekend, because it is their hobby. Some people build a computer for non-competitive games; some build their rigs for competitive play.
The RTX 4000 series is a good performance upgrade, but unfortunately Nvidia went all in on DLSS 3, which tests have shown to be a gimmick: it creates enough motion artifacts to steer people away from it.
Yes, at native resolution Hogwarts performs all over the place and leaves the hardware underutilized. At 4K ultra settings with RT on, DLSS set to Quality, and frame generation, I get an average of 120 FPS with dips into the mid 90s. The game is still pretty and very playable with DLSS 3, although native 4K with Nvidia's DLAA anti-aliasing is slightly superior in image quality, at the cost of frame rates varying from 40 to 90 FPS.
I have a 4070 Ti, playing on a 120Hz TV at 1440p ultra settings with RT on. Not 4K, because from 4 meters away 4K is a huge waste of FPS. With frame generation I can guarantee 120 FPS in all games with this setup.
Of the three titles I tried, Darktide, A Plague Tale: Requiem, and Hogwarts, the only one that was actually a good experience in terms of latency and visual compromise was Hogwarts. I believe it is the best implementation so far, which makes me think it will improve as time goes on.
DLSS 3 is only worth it if you have over 45-50 FPS to start with. At that framerate the interpolated-frame artefacts are imperceptible unless you're really hunting for a scenario where you can sort of notice them. So if you want 100 FPS smoothness but your GPU can only render 50, it's useful; but if you want 30-60 FPS and your GPU can only manage 15-30, you're going to have a bad time with artefacts and increased latency.
The artefacts are also less noticeable with a mouse than with a controller: the camera movement is less smooth, so the ghosting is harder to spot.
In my experience, frame generation is also useful for visually mitigating micro-stutters caused by asset loading, since the interpolated frames smooth them over. They're still noticeable, but less so.
I've tested it on several games and so far I haven't found one where I'd rather turn DLSS 3 off. It's not groundbreaking, but it's a nice feature when used properly.
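To put rough numbers on the 45-50 FPS rule of thumb above, here is a minimal sketch. It assumes frame generation doubles the presented framerate while input latency stays tied to the rendered rate plus one held-back frame; real figures also depend on Reflex, driver queueing, and the game itself:

```python
# Back-of-the-envelope sketch of the trade-off described above (my own
# simplification, not measured data). Assumptions: frame generation
# doubles the presented framerate, input latency stays tied to the
# rendered rate, and the interpolator holds back one rendered frame.
def frame_gen_estimate(rendered_fps: float) -> tuple[float, float]:
    frame_time_ms = 1000 / rendered_fps
    presented_fps = rendered_fps * 2   # one interpolated frame per real frame
    latency_ms = frame_time_ms * 2     # ~1 frame to render + 1 frame held back
    return presented_fps, latency_ms

for fps in (20, 50):
    presented, latency = frame_gen_estimate(fps)
    print(f"{fps} rendered FPS -> ~{presented:.0f} presented FPS, "
          f"~{latency:.0f} ms pipeline latency")
```

Under these assumptions, doubling 50 FPS costs around 40 ms of pipeline latency, while doubling 20 FPS costs around 100 ms, which lines up with the "bad time" described above for low base framerates.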
The only real issue with frame generation for me was screen tearing. In Control and A Plague Tale: Requiem, enabling VSync in the Nvidia Control Panel fixed it without introducing much latency, but in Cyberpunk 2077 I get over 100 ms of latency with VSync, which is a deal-breaker. Capping the framerate at 58 didn't work either for some reason; it also introduced too much latency. I got it under control by forcing DLSS 2 into Quality mode, which keeps the FPS just under 120 about 90% of the time. I hope Nvidia fixes VSync with frame generation, because it would benefit a lot on displays that aren't G-Sync compatible.
I've yet to try Darktide and maybe Hogwarts, although the latter seems to have a very disappointing RT implementation and poor optimization.
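On the tearing/VSync point above: the usual reasoning for capping a few FPS below the refresh rate is to stop the swap chain from filling up, even though, as reported above, a 58 FPS cap didn't pan out with frame generation in Cyberpunk. A toy sketch of that queueing argument, with assumed numbers (the 140 and 116 FPS figures are hypothetical, not from the games mentioned):

```python
# Toy model (idealized numbers, not measurements from the games above) of
# why VSync can add latency when frame generation pushes FPS above the
# refresh rate: the GPU finishes frames faster than the display consumes
# them, so completed frames pile up in the swap chain and each queued
# frame waits a full refresh before it is shown.
refresh_hz = 120
display_interval = 1000 / refresh_hz  # ms per refresh (~8.3 ms at 120 Hz)

def queue_latency(render_fps: float, max_queued: int = 3) -> float:
    """Extra latency from frames waiting in the swap chain under VSync."""
    render_interval = 1000 / render_fps
    if render_interval >= display_interval:
        return 0.0  # GPU no faster than the display: nothing piles up
    return max_queued * display_interval  # queue fills to its cap

print(f"uncapped ~140 FPS: +{queue_latency(140):.0f} ms from the queue")
print(f"capped at 116 FPS: +{queue_latency(116):.0f} ms from the queue")
```

This is only the standard argument for why a cap just below refresh can avoid VSync back-pressure; as the comment above shows, frame generation can interact with caps in less predictable ways.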