Battlefield V DXR Real-Time Ray Tracing Performance Tested

From what I understand that won't have any effect: RTX runs on a separate part of the GPU that normally sits idle when a game doesn't support ray tracing. That's why the 2070's GPU side starts to run underutilized with ray tracing enabled. Turn the regular settings down as far as you want; there just aren't enough "Gigarays" to go around regardless of visual settings.
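A rough way to see why the raster settings don't matter: model a frame as a raster portion (which scales with settings) plus a fixed ray-tracing portion that only the RT hardware handles. The millisecond figures below are made up purely for illustration, not measurements from the game:

```python
# Toy frame-time model: if ray tracing runs on dedicated RT cores,
# its cost stays roughly fixed no matter how low the raster settings go.
# All numbers here are illustrative, not benchmarks.

def frame_rate(raster_ms: float, rt_ms: float) -> float:
    """FPS when a frame costs raster_ms + rt_ms milliseconds."""
    return 1000.0 / (raster_ms + rt_ms)

ultra = frame_rate(raster_ms=12.0, rt_ms=18.0)  # ~33 fps
low   = frame_rate(raster_ms=4.0,  rt_ms=18.0)  # ~45 fps

# Cutting raster work by two thirds barely moves the needle,
# because the fixed RT cost dominates the frame.
print(f"ultra: {ultra:.1f} fps, low: {low:.1f} fps")
```

If the RT portion really is the long pole, no preset short of disabling DXR gets the frame time back.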

Now, do I cancel my trade-in with EVGA and just go get a 1070 Ti/1080 while they're on sale... I really need to upgrade from my 970; it struggles badly at 4K with games on my 65".

I can't recommend the GTX 1080 enough; it has been phenomenal for me since release.
At this point you can get a 2080 for the same price as the 1080 Ti; it also overclocks better and beats out the older model.
 
Thanks for the in-depth article. As many suspected, the real issue with RTX cards is the minuscule number of RT cores on them. We take high CUDA core counts for granted today, but look at how many modern graphics cards need to run games well. Why would anyone, especially NVIDIA, think so few RT cores could achieve parity in ray-traced performance? The technology warrants an entire graphics card of its own. RT cores are the future standard and should eventually replace CUDA cores entirely, so NVIDIA should have just gone ahead and done it.
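To put numbers on how lopsided the split is, here are the published core counts for the three Turing cards; the ratio falls out of Turing's layout of one RT core per SM:

```python
# Ratio of shading hardware to ray-tracing hardware on Turing.
# Core counts below are the published specs for these models.
cards = {
    "RTX 2070":    {"cuda": 2304, "rt": 36},
    "RTX 2080":    {"cuda": 2944, "rt": 46},
    "RTX 2080 Ti": {"cuda": 4352, "rt": 68},
}

for name, c in cards.items():
    print(f"{name}: {c['cuda'] // c['rt']} CUDA cores per RT core")

# Every card ships 64 CUDA cores per RT core: one RT core per SM,
# 64 CUDA cores per SM on Turing.
```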
 
It would be interesting to see if the in-game quality settings affect anything. I know the article said they didn't, but that's hard to believe.
Does playing on the low quality preset with ray tracing turned on still deliver poor frame rates? Maybe just turning off AA can help with frame rate? Where is the bottleneck? MSI Afterburner misreports CPU usage, BTW! I think the bottleneck is the CPU.
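One rough way to answer "where is the bottleneck?" without trusting overlay readouts is the classic resolution test: drop the render resolution and compare frame rates. The fps numbers below are hypothetical, just to show the logic:

```python
# Classic bottleneck heuristic: lower the resolution and see if fps rises.
# If the frame rate barely changes, the raster GPU load wasn't the limit;
# the CPU (or, with DXR enabled, possibly the RT hardware) is.
# All fps values are hypothetical.

def likely_gpu_bound(fps_high_res: float, fps_low_res: float,
                     threshold: float = 1.15) -> bool:
    """GPU-bound if lowering resolution raises fps by more than ~15%."""
    return fps_low_res / fps_high_res > threshold

print(likely_gpu_bound(fps_high_res=60.0, fps_low_res=110.0))  # True: GPU-bound
print(likely_gpu_bound(fps_high_res=58.0, fps_low_res=62.0))   # False: CPU/RT-bound
```

Note the caveat with DXR: ray count often scales with resolution too, so a flat result could also mean the RT units, not the CPU, are the wall.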
 
Ray tracing is completely useless in the current graphics generation. It is easily two graphics generations away. To play with ray tracing at 4K we will need at least three RTX 2080 Tis in SLI mode.
But ray tracing does not work over SLI; at least, it does not share the RT workload.
 
Remember, this is RTX version 1, so you'll have to wait a while for faster versions; this is basically a test release to the market.
By generation 2, 3, or 4 we can maybe run 8K-16K with this technology.
So if we wait a little for the next gen, 3xxx (or 4xxx, 5xxx, 6xxx), we might get real-time rendering the way the GeForce 3's real-time rendering did years ago.

Put it on hold and just wait. This is great and will rule the future, or maybe not, ever.
It's test version 1; stronger cards will come later. If it doesn't work out, they'll make cards without it. Customers need to wait for 4K-16K RTX rendering or get more powerful PCs.

I'm going to try out an RTX 2070 8 GB with an i7-9700, 32 GB of RAM, and an Asus Prime Z370-P II. Later, when the price drops from 11,000 NOK on www.prisguiden.no to about 3,000-5,000, I'll test out the 11 GB RTX, but not yet. I'll try out every game with DX/RTX support, like the Serious Sam 4 engine and the MechWarrior game.

Just wait; I'll get it, and you can get it later. It's expensive to upgrade everything; it all has to fit together and be tested out first.
So if you go first for the RTX 2070, later the RTX 2080, and then the Ti version, how many years until the second-gen RTX arrives?

Will the 20xx series last four months, four years, or just days? Like the RX 5xx series from AMD: they could use their own technology and make something Nvidia doesn't have.
If you hold out for the RTX 3xxx (if it gets that name), you could render in real time at 4K-16K without problems.

So yes, maybe you have the latest i9-9900K or a Ryzen AMD motherboard, and then you can brag about it.
But if the lower-end RTX 2070 works well at 1080p, why chase something better when there isn't the bandwidth for the newest games to run at full speed, like PCIe 4.0 would give?
Wait for PCIe 4.0 and gen-2 RTX GPUs? For now, PCIe 3.0 only has an inch of bandwidth headroom left.

First:
wait for RTX gen 2 (or overclock and play with your gen-1 RTX)
get PCIe 4.0
DDR5/DDR6 onboard RAM at 2xxx-4000; AMD memory, HBM2 or HBM3?
Hope something comes from AMD or Intel with enough bandwidth to run everything in 4K on your newly (ahem) bought 8K screen at 50,000 NOK. It's so expensive here; we have an enormous VAT (MVA) in Norway, so buying something can take ages.
The next-gen QUHD/8K must run well on new computers, and there must be fiber optics to support the raw bandwidth of an 8K RED cam; webcam bandwidth has to overcome the 33.1 MP barrier.

If it comes to 16K, it would need PCIe 5.0, which doubles or triples the bandwidth, and that requires a new motherboard, RAM, GPU, and new SSDs.
Remember, Battlefield V needed a 16 GB patch to run this.
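For reference, PCIe x16 link bandwidth really does roughly double each generation; the per-lane payload rates below are the standard figures after encoding overhead:

```python
# PCIe x16 bandwidth per generation.
# Per-lane payload rates in GB/s (after line-encoding overhead).
per_lane_gbs = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

for gen, rate in per_lane_gbs.items():
    print(f"PCIe {gen} x16: ~{rate * 16:.1f} GB/s")

# PCIe 5.0 x16 lands around 63 GB/s: double 4.0, quadruple 3.0.
```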

Not every level uses RTX 100%, just some places in the game. If they had made it 100%, it would be a problem to render even with two RTX 2080 Ti cards, and so on.
First, prices have to come down to a consumer level, and more games need it; maybe Windows 10 will use it in screensavers, and programs like Adobe Premiere too. It's just a start, not a sprint to first place.

Programs and games need to support RTX and other new experimental things. To say that RTX won't work on the lower-end 2070 would be a lie; it's just a lower tier that you and others can try out while waiting for better, cheaper GPUs and overclocking motherboards that support PCIe 3.0-5.0.
I'll wait for prices to fall on everything I need to upgrade. Around 10,000+ for the newest parts would be too expensive for me and others who don't have that money yet, so slow rather than express upgrading would be best.

I would try out a low-end GPU, CPU, RAM, and motherboard, and perhaps an M.2 SSD, a hybrid SSD/HDD, and a PCIe x1-x4 SSD that supports enormously more bandwidth.
Use the GTX 10xx series a little longer and come back at, say, gen 2, when PCIe 4-6 and RTX run well in 4K, 5K, 8K, and so on. In some places it would still stutter: think about pushing a 33.1 MP picture every refresh through DisplayPort 1.2 (I don't know what the future will give us in DP), or maybe HDMI 4-8, just to run a game or an Adobe program.
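The 33.1 MP figure checks out: 8K is 7680 x 4320, about 33.2 megapixels, and the arithmetic shows why DisplayPort 1.2 can't carry it uncompressed at 60 Hz (DP 1.2's video payload tops out around 17.28 Gbit/s over four HBR2 lanes):

```python
# Does uncompressed 8K60 fit through DisplayPort 1.2?
# DP 1.2 carries about 17.28 Gbit/s of video payload (HBR2, 4 lanes).

width, height = 7680, 4320   # "8K": about 33.2 megapixels
bits_per_pixel = 24          # 8-bit RGB, no HDR
refresh_hz = 60

needed_gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"8K60 needs ~{needed_gbps:.1f} Gbit/s; DP 1.2 offers ~17.28")

# Roughly 47.8 Gbit/s needed, nearly 3x what DP 1.2 can deliver,
# so uncompressed 8K60 requires a newer standard or stream compression.
```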

If you want enough bandwidth to push it through a PCIe SSD at 1 TB up and down, you'll have to wait a little for that. Right now we're trying to get the power to run at a 95 W TDP, and within 10 seconds it will run at 105 W on the newest 20xx Ti, while the GTX 10xx sits around a 95 W TDP.
If you run longer than 10 seconds, the RTX card gets too hot and has to downsample, and maybe crash to save your expensive AIO desktop.

Rendering in 4K is no problem, but if you run more than 10 seconds at 105 W you could destroy it. It's too expensive to try to overclock it. First you have to get it stable, and later keep it stable over a 100 W TDP. Dangerous, but if you want to buy a new GPU every third week, that's up to you; money doesn't grow on trees.
But get two or three GPUs when prices fall, then try it out. Get it stated in the warranty ("garanti" in Norwegian) whether you get a replacement if you overclock and ruin ("ødelagt") it, or not. (These are just thoughts about the future; look it up yourself and hope something gets through.) I can't predict the future; see Wikipedia and the other computer magazines.
Sentences are troubling me, so correcting me would miss the target.
I'll correct it myself when I see it.
 
I suppose it's like most new technologies: it will take one or two generations before it's really good for the mainstream. The fact that RTX ray-tracing mode isn't really a competitive option fits with what I read on the Battlefield 5 GPU buying guide at http://daydull.com/gaming/battlefie...ations-benchmarks-the-best-gpu-deals-for-bf5/ Just not a good deal to go for an RTX card right now. Glad to know for sure so I can buy my card now.

I'm still curious how the RTX cards will perform for 3D artists. Might be a great way for independent artists to get powerful ray-traced rendering at home.
 
I hope as many people as possible buy these cards to get volumes up and thus eventually push prices down. That will encourage both Nvidia and AMD to implement a *real* ray-tracing feature set that can keep up with the regular cores in the next card architecture.

Or the one after that...

Buy up people!
 
Either they should have cancelled the ray tracing in this title or made a better GPU that can actually run ray tracing. I don't know who did a bad job, but someone did, and I hope he gets fired. I bet even the 21XX can't run ray tracing at 4K, and since I wanted my next upgrade to do 4K, I'll have to wait for the 22XX series; I bet by then they'll introduce another gimmick to make games run like ****. They really don't want me to buy new hardware... I hope investors force them to offer something that people actually want.


People measure their sense of worth and completion by how much and what they have. Many of these people become early adopters of technologies, even ones that haven't proved their worth. Nearly $2,000 CAD (with taxes) for a 2080 Ti video card is insane; I thought $500 was too much. Same goes for cell phones. Companies realize there's a small fraction of people with this problem, so of course they cater to them first in the hopes of maximizing profit. The rest of us wait a few years until competition and newer products bring these buys down to a more reasonable price. I always buy a generation behind because I believe in good "value."
 
Oh my. Nvidia stock is already in freefall today; I wonder how much all these negative reviews will add to that.
 
I usually love this website, but for this particular article I have to ask why you ran such a limited set of benchmarks. OK, so it runs dogshit on Ultra; what about the other graphical presets? There are folks out there like myself who care more about frames than fidelity and will happily play on Medium settings with DXR on if that results in better performance. Of course, now I don't know, because for some reason you didn't benchmark those settings; you didn't benchmark much of anything, really.

From what I understand that won't have any effect: RTX runs on a separate part of the GPU that normally sits idle when a game doesn't support ray tracing. That's why the 2070's GPU side starts to run underutilized with ray tracing enabled. Turn the regular settings down as far as you want; there just aren't enough "Gigarays" to go around regardless of visual settings.

Now, do I cancel my trade-in with EVGA and just go get a 1070 Ti/1080 while they're on sale... I really need to upgrade from my 970; it struggles badly at 4K with games on my 65".
I upgraded to a used Asus Strix GTX 1070 that I bought for $260 (upgrading from a GTX 970). Sold my GTX 970 for $120, so I really just paid $140 to upgrade. :)
 
I suppose it's like most new technologies: it will take one or two generations before it's really good for the mainstream. The fact that RTX ray-tracing mode isn't really a competitive option fits with what I read on the Battlefield 5 GPU buying guide at http://daydull.com/gaming/battlefie...ations-benchmarks-the-best-gpu-deals-for-bf5/ Just not a good deal to go for an RTX card right now. Glad to know for sure so I can buy my card now.

I'm still curious how the RTX cards will perform for 3D artists. Might be a great way for independent artists to get powerful ray-traced rendering at home.
I would also love to see how RTX performs for professional use and its use cases.
 
Ray tracing is completely useless in the current graphics generation. It is easily two graphics generations away. To play with ray tracing at 4K we will need at least three RTX 2080 Tis in SLI mode.

This guy gets it. Even the next generation of graphics cards won't be able to provide smooth gameplay with RTX. Remember, here they only did reflection ray tracing; imagine the impact if they had enabled ray tracing in toto. Easily two or three generations away from being mainstream.
 
I can't recommend the GTX 1080 enough; it has been phenomenal for me since release.

Done; found an Asus GTX 1080 Strix for $500 CAD, RTX can wait. The guy I bought it off still has warranty direct from Asus and offered to let me use it if the card ever dies; good for another year or so.

At this point you can get a 2080 for the same price as the 1080 Ti; it also overclocks better and beats out the older model.

I just can't justify spending $1000 CAD right now; half of that was more acceptable and gets me a card that I want. The Strix 2080 is another $150 on top of the starting price...

I upgraded to a used Asus Strix GTX 1070 that I bought for $260 (upgrading from a GTX 970). Sold my GTX 970 for $120, so I really just paid $140 to upgrade. :)

That's a phenomenal deal on a 1070 Strix; the best I could find around where I live is about $400 CAD, so I figured for another $100 I'd get the 1080 and be good until the next-gen RTX cards come out in 2-3 years. My 970 has lasted me since it came out back in 2014; 4 years isn't a bad run, and I can stick it in my portable PC and be good for 1080p gaming.

I'm really glad I waited until some actual benchmarks came out for ray tracing; I've been on the fence about whether to dish out the money for it or pass and grab a 1080 now that they've dropped in price on the used market. Keep in mind the used-GPU economy where I live is not the best either; most people are asking $600 for entry-level 1080s, or they're miners dumping them in batches. The one I found was from a gamer who had just gotten himself a 2080 Ti and was trying to get some quick cash for his old card; it worked out perfectly for me.
 
I am truly curious about SLI benchmarks. With such a performance hit, I believe that if this workload can be distributed through SLI, it would be in Nvidia's best interest to release an RTX accelerator add-in card. These BFV tests have shown that power draw is actually down with RTX on, because the cards are not utilizing all their CUDA cores at this resolution and frame rate. If it can work like PhysX, where you just add an older Nvidia card and the workload is offloaded to it, I think it could gain traction. Maybe not with just one game out, quite yet at least, but a viable option for the future. Besides, I am a tinkerer: the more I can tinker, the happier I am.
 
I am truly curious about SLI benchmarks. With such a performance hit, I believe that if this workload can be distributed through SLI, it would be in Nvidia's best interest to release an RTX accelerator add-in card. These BFV tests have shown that power draw is actually down with RTX on, because the cards are not utilizing all their CUDA cores at this resolution and frame rate. If it can work like PhysX, where you just add an older Nvidia card and the workload is offloaded to it, I think it could gain traction. Maybe not with just one game out, quite yet at least, but a viable option for the future. Besides, I am a tinkerer: the more I can tinker, the happier I am.

The only way I could get behind the idea of a standalone ray-tracing GPU would be if it were cross-brand and cross-series compatible, so you could add RTX to any existing system. But Nvidia being Nvidia, it would only be supported with an existing RTX product, so all of three currently available cards, rendering it a complete waste of a product. Remember the PhysX add-in card and how well that worked out? Potentially this could be as big of a flop, or never see wide enough adoption to be worth making in the first place.

In reality, we'll likely just have to wait and see if game developers figure out a way to make RTX actually work with the limited throughput the available cards have; maybe it can be scaled back or applied in such a way that it doesn't cripple the game's performance entirely. Remember, this is just the first game to use it and be benchmarked; maybe it was a last-minute rush to add the feature, and/or the drivers aren't properly optimized for it yet.

As with all new technology and features, they can improve with time or become a pointless endeavor never to be mentioned again. I have a feeling ray tracing is going to stick around. It might not be practical now, but you can't deny the potential of the technology to make games look that much more visually impressive, and this extends to just about all games, not just triple-A shooters.
 
Battlefield, being a fast-paced game, isn't well suited to highlighting RTX (despite it being reflections only). That said, RTX is new technology and it's got to start somewhere. I remember the 8800 GTX being miserable first-generation hardware for Crysis, the 480/5870 for Metro 2033, and so on. The RTX cards are at least better off in comparison.

The GTX 480 and HD 5870 didn't cost $1200 :)
 
Battlefield, being a fast-paced game, isn't well suited to highlighting RTX (despite it being reflections only). That said, RTX is new technology and it's got to start somewhere. I remember the 8800 GTX being miserable first-generation hardware for Crysis, the 480/5870 for Metro 2033, and so on. The RTX cards are at least better off in comparison.


True, using ray tracing in fast-paced games isn't ideal. Gamers who play fast-paced and multiplayer games care more about consistently high frame rates than eye candy. If ray tracing can be baked in efficiently without the cost of losing frame rates, that would be great.
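Frame-time budgets show why competitive players feel this more than anyone: the same fixed ray-tracing cost hurts far more at high target frame rates. The 10 ms DXR cost below is illustrative, not a measurement:

```python
# Frame-time budgets: a fixed per-frame ray-tracing cost hurts more
# the higher your target frame rate. The 10 ms figure is illustrative.

def fps_after_cost(base_fps: float, extra_ms: float) -> float:
    """Frame rate after adding a fixed per-frame cost in milliseconds."""
    return 1000.0 / (1000.0 / base_fps + extra_ms)

print(f"{fps_after_cost(144, 10):.1f} fps")  # 144 fps -> 59.0 fps
print(f"{fps_after_cost(60, 10):.1f} fps")   # 60 fps  -> 37.5 fps
```

The same 10 ms wipes out well over half of a 144 fps budget but "only" a third of a 60 fps one, which is why slower single-player games are a better fit for early RTX.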

I think ray tracing would make more sense in games like Far Cry, or from small indie developers with their walking sims and exploration games: Kona, Outlast, Skyrim, etc.
 
It's 4K all over again: a good thing if adopted and implemented industry-wide, but it isn't enough bang for the buck to make users and developers switch to it. The majority never will, opting to wait for the next big thing before they upgrade, allowing time for the cost to fall. Hopefully Radeon will recognize this in their upcoming cards.
 
The only way I could get behind the idea of a standalone ray-tracing GPU would be if it were cross-brand and cross-series compatible, so you could add RTX to any existing system. But Nvidia being Nvidia, it would only be supported with an existing RTX product, so all of three currently available cards, rendering it a complete waste of a product. Remember the PhysX add-in card and how well that worked out? Potentially this could be as big of a flop, or never see wide enough adoption to be worth making in the first place.

In reality, we'll likely just have to wait and see if game developers figure out a way to make RTX actually work with the limited throughput the available cards have; maybe it can be scaled back or applied in such a way that it doesn't cripple the game's performance entirely. Remember, this is just the first game to use it and be benchmarked; maybe it was a last-minute rush to add the feature, and/or the drivers aren't properly optimized for it yet.

As with all new technology and features, they can improve with time or become a pointless endeavor never to be mentioned again. I have a feeling ray tracing is going to stick around. It might not be practical now, but you can't deny the potential of the technology to make games look that much more visually impressive, and this extends to just about all games, not just triple-A shooters.

Well, yes, the PhysX AIC was a flop, but they mitigated that by allowing the use of any CUDA GPU. Of course it would only work with Nvidia tech, but my comment was based on the fact that the regular CUDA cores go underutilized with RTX on, due to a bottleneck in the render pipeline. It's also quite the pipe dream if more games don't come out that utilize it. Maybe they could add some CUDA cores and make it an all-in-one. Imagine the eye candy in something like The Witcher 3: HairWorks maxed, RTX maxed, and stutter a thing of the past. It's not my fault AMD can't keep up; async compute and Vulkan just aren't enough.
 