Shadow of the Tomb Raider unable to maintain 60fps on a GeForce RTX 2080 Ti

With RTX off, the scenes looked like they had been deliberately made uglier to make RT look better. I don't trust those demos; with RTX off, Battlefield V looked worse than BF1 on Ultra on my friend's 1080 Ti...
 
It seems ray tracing will be the new PhysX: only top-tier cards will be able to pull it off (eventually). I don't wish for that to happen, because the tech is very interesting and it actually allows for cool stuff, but I don't see myself being able to enable it in the near future, since the most I will be able to afford (or be willing to pay for) is an RTX 2070.
 
Go watch the Battlefield V demo again and look at the bus windows. DICE had 2 weeks to implement RT; all things considered, that is damn good. I wouldn't judge by SotTR, as the developers went on Twitter stating it is extremely unoptimized and casting more rays than needed (a rough ray-budget sketch below shows why that matters).
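To put "more rays than needed" in perspective, here is a back-of-the-envelope ray budget. The rays-per-pixel counts are illustrative assumptions, not measured values from either game; the ~10 gigarays/s peak is Nvidia's own marketing figure for the 2080 Ti, and real frames also pay for shading and denoising, so these are ceilings, not predictions:

# Rough ray-budget ceiling, assuming ray casting were the only cost.
GIGARAYS_PER_SEC = 10e9  # Nvidia's quoted peak for the RTX 2080 Ti

def max_fps(width, height, rays_per_pixel):
    # Upper-bound frame rate from the ray budget alone.
    rays_per_frame = width * height * rays_per_pixel
    return GIGARAYS_PER_SEC / rays_per_frame

print(max_fps(1920, 1080, 2))   # lean budget:  ~2411 fps ceiling at 1080p
print(max_fps(1920, 1080, 20))  # wasteful:      ~241 fps ceiling at 1080p
print(max_fps(3840, 2160, 20))  # wasteful, 4K:   ~60 fps ceiling

A wasteful per-pixel ray count eats the entire budget before shading and denoising even get a turn, which is why "extra rays" is such a plausible explanation for the demo's frame rates.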

Source for the 2 weeks claim? Are you saying Nvidia worked on this architecture for years and based its entire presentation on 2 weeks of work? Yeah, please excuse me while I express extreme doubt.

Like I said, demos seek to show the product in the best light. It seems to be working on someone.
 

https://wccftech.com/resident-evil-2-support-nvidia-rtx/

Next time you can Google it yourself. And yes, they have further links to developers that clearly state the situation: NO ONE HAD ACCESS TO THE ARCHITECTURE UNTIL 2 WEEKS AGO. People just need to stop already; you can get quantifiable benchmarks from different devs as they are released, and 3DMark has already stated they will have an RT benchmark by the end of September. People need to stop going nuclear over something they have zero idea about.
 

First, WCCF isn't a source; it's a rumor website. Second, Digital Foundry's John Linneman isn't a source either, nor does he work for DICE. You are pulling quotes from rumor websites that quote a non-DICE employee. In other words, completely unfounded. The only thing an actual DICE employee has said is, "Sorry, can't speak about specifics yet. A bit too early." There have been zero confident claims of anything from the devs.


Um, you were the one making claims about how awesome it was going to be and how much they should be able to optimize, all without anything more than rumors. All you're saying now is what I've been telling people all along. Flipping your opinion on a dime won't change that.

So yes, everyone should follow that guideline with this launch and hope that Nvidia's RT doesn't turn out like AMD's primitive shaders.
 

"Um, you were the one making claims on how awesome it was going to be and how much they should be able to optimize, all without anything more then rumors." - Never said anything of the sort, I only stated the demos were a rush job, Techradars leak on fps however was true the 2080 Ti was on average cranking out 100fps in their testing and it lines up with NVidia's charts indirectly. I actually never flipped my opinion at al, I've been telling people to buy based on Legacy gaming performance not RT performance at all, frankly because different developers will implement it in different ways and depending on that way it could be more impactful on performance we never knew the tax on using the feature, however if it is that bad like any other tech it will mature over time Even ADOREDTV admitted that and did a nice video based around one of his sources , All I ever said is RT looks real good and you would be a complete ***** not to se the difference between the 2.

"Second Digital Foundry’s John Linneman isn't a source either nor does he work for Dice. " No you are right but Elenarie does work for Dice and if you even bothered its in black and white but you need to dig through 13 pages of fanboyism. Good luck with that, I am not going to spoon feed you do your research or gtfo.

"First WCCF isn't a source, it's a rumor website." BTW Techspot quotes WCCftech as a source very often when pointing to various industry facts......so its more than acceptable. I am not going to go out of my way finding exact pages because you are lazy, just stop already. Your deflecting from being way wrong is depressing.

Nvidia fanboys are mad at the pricing and going nuclear.
AMD fanboys are salty AF about not having their own version of RT (then again, they are still waiting for the async compute portion of their die to actually be used in more than 5 games, while still claiming Vulkan will dominate the market...).
 


Also, I want to make a point on this: AMD and Nvidia aren't stupid. They have pushed a lot of technologies out the door over the years, some better than others, and some with such an impact on performance that nobody wanted to use them, since they made a game unplayable at the budget and mainstream levels, while on the highest end they were at least tolerable, if not the best play experience. TressFX and HairWorks spring to mind, as well as early implementations of AA.

Nvidia has been working on this for what they call the past decade, and with the millions and billions invested I seriously doubt they would push out a technology that wasn't playable on some level. It is very obvious the demoed games had the technology bolted on late. Jensen even stated over and over again, "It just works." That wasn't for the consumers' benefit; he also made a comment along the lines of "It works and can be applied provided the textures are done the right way," and he was very clear about that, almost taking a jab. Also, ray tracing is very different from primitive shaders.

The claims about Battlefield V and SotTR make complete sense. I doubt anyone had a working card until 2 weeks ago; if you have been paying attention to leaks and benchmarks, you would have seen engineering samples or early Nvidia hardware being leaked around that same time. Also keep in mind the finalized driver won't be released until sometime in September, so even the preliminary numbers aren't the final numbers we should use as a reference. AdoredTV even said this about the demos and reinforced that they were in alpha or beta states with no optimization done; he is the only one I have seen actually slamming and congratulating Nvidia at the same time over this.
 
All those who get duped into paying a king's ransom for this very raw and immature new tech will pave the way for the next generation of cards' implementation and refinement, which will actually be usable in real scenarios. Definitely skipping the 2000 series. It'll kind of suck to keep my 1080 for 2 more years, but I'm as thoroughly unimpressed by Nvidia as ever, and their business practices are damn near criminal, as they have been for over a decade.
 

Yeah, it's expensive, and yeah, that sticker shock is heart-stopping. But if you look at TechRadar's report of 100+ frames at 4K in many games, I can say it lines up with Nvidia's reported performance slides. While a 35-50% framerate increase is really, really nice for 4K and 1440p ultrawide owners, I can't justify $1,200 (quick perf-per-dollar math below). However, I am going to call it: by next year, as the die process matures with fewer rejects and gets cheaper, we will most likely see a price drop to $800-$1,000; factor in the 10-series leftovers they need to sell, too. Either way, it's the 3000 series or a decent VR upgrade for me, since fewer wires equals god mode in VR, and that bulky-looking HTC battery pack for the Vive is just stupid and ugly.
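For scale, here is the rough perf-per-dollar math behind that "can't justify" call. The launch prices and the uplift midpoint are the figures floated in this thread, not official benchmarks:

# Perf-per-dollar sanity check; prices and uplift are the thread's figures.
gtx1080ti_price = 700    # approximate launch USD (assumption)
rtx2080ti_price = 1200   # approximate launch USD (assumption)
uplift = 1.40            # midpoint of the claimed 35-50% gain

print(uplift / (rtx2080ti_price / gtx1080ti_price))  # ~0.82: ~18% worse perf per dollar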
 
" “As a result, different areas of the game have received different levels of polish while we work toward complete implementation of this new technology."

This is indeed a demo, but I'll believe it when I see it. They need to go from sub-40 FPS at 1080p to 60 FPS at 4K like Nvidia claimed. You are talking about more than a 400% performance gain in ray tracing from "polish", as the dev puts it (see the quick math below). That's usually not on the level of polish; it's more on the level of "our RT code needs to be completely rewritten" or "our hardware is completely inadequate". This isn't the first report of poor RT performance either.
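The quick math, assuming RT cost scales roughly with pixels shaded per second (a simplification, but it shows the size of the jump):

# Sub-40 fps at 1080p versus the promised 60 fps at 4K.
current = 1920 * 1080 * 40   # pixels/sec today
target  = 3840 * 2160 * 60   # pixels/sec promised

print(target / current)  # 6.0: a 6x throughput jump, i.e. a 500% gain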

For now, I'm reserving judgment on these performance issues. Given the pricing, people expect these cards to do 4K at 60 FPS with RT on, at the very least.

There is no need for me to comment. You are the cynical voice we need in this day of BS.
 
That must have been embarrassing, just like that video that was shown where Lara dies about 10 times. I mean, who would even show such poor gameplay to anybody?
I am using an Intel i7-8700K @ 4.69 GHz, an Nvidia RTX 2080 Ti Founders Edition, and 16 GB of Corsair Dominator Platinum @ 3000 MHz. I am playing Shadow of the Tomb Raider in DirectX 12 at 4K/HDR @ 60 Hz (VSync) with TAA (temporal anti-aliasing), Ultra texture quality, 16x anisotropic filtering, Ultra shadow quality, HBAO+ ambient occlusion, High depth of field, Ultra level of detail, tessellation, bloom, motion blur, screen space reflections, High screen space contact shadows, Normal Pure Hair, volumetric lighting, lens flares, and screen effects all ENABLED. I am getting 50 to 75 FPS (61 with VSync).
 
There are plenty of fatalists in this world - just let them revel in their impending doom and instead pay attention to the realists who show you that there will be a tomorrow and performance will be at least as good then as it is today.
 