So much for RTX: Crytek reveals real-time ray tracing demo for AMD and Nvidia hardware

Anyone notice TECHSPOT is like the only one reporting on this? Guru3D, ExtremeTech, PC Gamer, AnandTech, Tom's Hardware, etc. haven't reported it yet. I'm guessing those other sites favor, or maybe own, Nvidia stock?

Correct me if I'm wrong, but why isn't this big news?
 
Brings to mind needing a dedicated card for PhysX back in the day. The first gen of new tech is not a smart buy for most people. My 1080 and I will be plugging away until Nvidia's next release, at least.
 
A $300 GPU can do this today? Color me impressed. It took DICE and Nvidia months after launch to get a playable frame rate in BFV on a $1,200 GPU. RTX is the modern-day GameWorks/PhysX. Nvidia found a way to gimp us again. Fool me once, shame on you; fool me twice, shame on me.

DICE didn't have RTX hardware until very late in development...
 
I'd be excited if the Crysis games were actually good.
There's a very old GMod zombie-survival mod that I've played for over eight years.
Looking at the ugly old textures and player models, I keep asking myself why I still play it. It almost feels like a good game doesn't need good graphics at all.
Most of my favorite games are either very old or far from having top-tier graphics.
 
You guys are a little quick to confirm that the GI is actually as good. We have no way of knowing if this is actually calculating path tracing in real time or if it's baked. The reflections look good, but that's only half the goal. It doesn't look like true GI to me.

It says real-time ray tracing right in the video's preview image. /facepalm

They are using voxels, which is significantly more efficient and can be done using traditional compute.
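For anyone wondering what "done using traditional compute" looks like in practice, here's a minimal CPU-side sketch of marching a ray through a voxel grid. The grid size, step length, and Vec3 helper are made-up assumptions for illustration only, not anything taken from CryEngine:

```cpp
// Minimal sketch of ray marching through a voxel grid on plain compute
// (no RT cores involved). Grid size and step length are assumed values.
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

constexpr int N = 64;                       // voxel grid resolution (assumed)
std::array<bool, N * N * N> grid{};         // true = occupied voxel

bool occupied(int x, int y, int z) {
    if (x < 0 || y < 0 || z < 0 || x >= N || y >= N || z >= N) return false;
    return grid[(z * N + y) * N + x];
}

// March a ray in fixed steps; return true on the first occupied voxel hit.
bool traceRay(Vec3 origin, Vec3 dir, Vec3* hit) {
    const float step = 0.5f;                // half a voxel per step (assumed)
    Vec3 p = origin;
    for (int i = 0; i < 4 * N; ++i) {
        if (occupied((int)std::floor(p.x), (int)std::floor(p.y), (int)std::floor(p.z))) {
            *hit = p;
            return true;
        }
        p = { p.x + dir.x * step, p.y + dir.y * step, p.z + dir.z * step };
    }
    return false;                           // ray left the grid without a hit
}

int main() {
    grid[(32 * N + 32) * N + 32] = true;    // one occupied voxel in the middle
    Vec3 hit;
    if (traceRay({0.5f, 32.5f, 32.5f}, {1.0f, 0.0f, 0.0f}, &hit))
        std::printf("hit at %.1f %.1f %.1f\n", hit.x, hit.y, hit.z);
}
```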

Yup, like how the Watch Dogs E3 preview said "real in-game footage," and on release the graphics were vastly watered down. The same can be said of a dozen other games.
Chances are that the majority of the ray tracing here is pre-rendered.

Regardless, 80% of all consumer GPU sales so far have been Nvidia cards.
 
Anyone notice TECHSPOT is like the only one reporting on this? Guru3D, ExtremeTech, PC Gamer, AnandTech, Tom's Hardware, etc. haven't reported it yet. I'm guessing those other sites favor, or maybe own, Nvidia stock?

Correct me if I'm wrong, but why isn't this big news?
I first saw this story on WCCF. I haven't found it anywhere else yet, which is rather strange.
 
Still gotta give RTX a chance. I mean, the only two games to have utilized RTX technology so far are Battlefield V, with only shitty pseudo ray-traced reflections, and Metro Exodus, with the real-deal RTX global illumination. This is just some pre-rendered video.
 
Anyone notice TECHSPOT is like the only one reporting on this? Guru3D, ExtremeTech, PC Gamer, AnandTech, Tom's Hardware, etc. haven't reported it yet. I'm guessing those other sites favor, or maybe own, Nvidia stock?

Correct me if I'm wrong, but why isn't this big news?
I saw the article on Tom's, and Guru3D posted it as well. They may have posted a few hours later.

This is a very interesting demo. I want to try it out!
 
This is like a slap in the face for Nvidia. It's good that someone is trying to bring some sense to Nvidia's exorbitant, flamboyant overpricing. I'm a 980 Ti user myself, and I'm ashamed to see Nvidia's current bloated pricing trend. But I think the buyers are also to blame for supporting it.
 
Guys, ray tracing doesn't need specialized hardware like RT cores; an RT core is just an accelerator. Someone was able to run BF V with RTX on using a Titan V (I think TechSpot had news about this), and the performance drop with RTX on is worse compared to the 2080 Ti.

https://youtu.be/yHfP82FwXio

That means if a Vega 56 can ray trace at 4K 30 fps, the 2080 Ti can probably do 4K 90 fps.
 
Yup, like how the Watch Dogs E3 preview said "real in-game footage," and on release the graphics were vastly watered down. The same can be said of a dozen other games.
Chances are that the majority of the ray tracing here is pre-rendered.

Regardless, 80% of all consumer GPU sales so far have been Nvidia cards.

This is a feature being added to their engine, which is in more than a few games. Enough said.

And what part of "real-time" can you not read? GamersNexus already did a technical overview of pre-baked RT vs. real ray tracing.


Pre-baked RT is static, not dynamic, and it clearly isn't being used in the above video.

Yep, Nvidia owns 80% of the market, and it's a good thing this works on all cards. You seem to have skipped reading the article, but it specifically states it's hardware-agnostic. Even better, it should work on every 10xx-series and 20xx-series video card. I honestly don't see why you'd be against everyone being able to reap the advantages.

The only thing Nvidia has shown me is that they are willing to just slap extra half-assed features on their cards to sell them. There are multiple other ways to do ray tracing in real time, and Nvidia seemingly chose the worst.

Guys, ray tracing doesn't need specialized hardware like RT cores; an RT core is just an accelerator. Someone was able to run BF V with RTX on using a Titan V (I think TechSpot had news about this), and the performance drop with RTX on is worse compared to the 2080 Ti.


That means if a Vega 56 can ray trace at 4K 30 fps, the 2080 Ti can probably do 4K 90 fps.

That's not quite how it works. RT cores accelerate a very specific type of math used when calculating rays. This system uses voxels, and it may or may not be accelerated by the RT cores. At the very least it should be CUDA-compatible, but the RT cores could be worthless. It would actually be a lot better for the PC industry that way, as RT cores are unnecessary when you can calculate rays using compute. And notice how this method doesn't have any noise? Much better looking, and no need for a denoiser.
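For context, the "very specific type of math" RT cores accelerate is essentially huge numbers of ray/box and ray/triangle tests during BVH traversal. Below is a minimal CPU-side slab test, purely illustrative (the struct layout and test scene are assumptions); the same arithmetic can run as ordinary shader or compute code, just more slowly:

```cpp
// Ray/AABB slab test: the kind of intersection math a BVH traversal
// performs billions of times per frame. CPU-side sketch for illustration.
#include <algorithm>
#include <cstdio>

struct Ray  { float ox, oy, oz, dx, dy, dz; };   // origin + direction
struct AABB { float minx, miny, minz, maxx, maxy, maxz; };

// Returns true if the ray intersects the box (slab method).
bool hitAABB(const Ray& r, const AABB& b) {
    float tmin = 0.0f, tmax = 1e30f;
    const float o[3]  = { r.ox, r.oy, r.oz };
    const float d[3]  = { r.dx, r.dy, r.dz };
    const float lo[3] = { b.minx, b.miny, b.minz };
    const float hi[3] = { b.maxx, b.maxy, b.maxz };
    for (int axis = 0; axis < 3; ++axis) {
        float inv = 1.0f / d[axis];              // assumes non-zero direction component
        float t0 = (lo[axis] - o[axis]) * inv;
        float t1 = (hi[axis] - o[axis]) * inv;
        if (inv < 0.0f) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;           // slabs no longer overlap -> miss
    }
    return true;
}

int main() {
    Ray  r{ 0, 0, 0,  1, 0.1f, 0.1f };
    AABB b{ 5, -1, -1, 6, 1, 1 };
    std::printf("%s\n", hitAABB(r, b) ? "hit" : "miss");
}
```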
 
Anyone notice TECHSPOT is like the only one reporting on this? Guru3D, ExtremeTech, PC Gamer, AnandTech, Tom's Hardware, etc. haven't reported it yet. I'm guessing those other sites favor, or maybe own, Nvidia stock?

Correct me if I'm wrong, but why isn't this big news?
Um, because it's the weekend and they're doing other stuff in their lives. It'll still be big news on Monday.

Or maybe because it doesn't matter. It's just a tech demo with a lot of unknowns, and it won't even be available for a while.
 
Nvidia loves proprietary technology. Sure, their cores are optimized for ray tracing, but it's all just numbers, and GPUs are pretty darn good at computing numbers.

CPUs have been used for a long time to render ray-traced CGI scenes. I've done some work with 3ds Max 2011; GPUs were starting to be utilized for rendering back then, but the results weren't as good. The engines needed more work.

It's no surprise to me that Crytek is able to pull this off. RTX isn't a gimmick, but it is Nvidia's way of trying to control the future of graphics, and Crytek said: no thank you, Nvidia, we don't want your proprietary tools here!

This is why open platforms are always better. And AMD has been giving us open tech for quite a while now, with the Vulkan API and FreeSync, just to name a couple.

I'm glad to see that AMD won't be left out in the cold when it comes to ray tracing in the future. As a gaming enthusiast, I think competition is good, and hopefully AMD can compete with Navi to slow down the ridiculous price train that Nvidia has become.

Got a bit off subject, but it all ties into the future of what gaming will look like, and Nvidia wants to control it as much as possible. Crytek just showed them they won't have a monopoly on ray tracing, and it keeps GPU buying open for now and for the future, without fear of missing out on the latest graphics achievements in the gaming industry, no matter which company you prefer to buy GPUs from.

Jumping on the bandwagon too soon, since all we've seen is a tech demo that you CAN'T even download to your computer.
RTX is new but is playable in two games. While far from perfect, RTX also needs time to mature, like all things do.

Crytek has a long, long way to go before anyone knows anything, or whether it'll be any different from Nvidia's. It looks promising, but that's all it is, nothing more. It'll be a while before we all get to use it.
 
From my experience, CryEngine sucks: an overbearing, wasteful engine. I wouldn't believe anything these guys claim until you actually play it. Truth be told, CryEngine has been a venomous memory hog since its creation. Good graphics, but way too much of a power hog compared to the end result. So their saying they can pull off ray tracing with very few resources leaves me, at the very least, skeptical, especially with an early release video, very little actual information, and no real in-game footage. Sounds like more BS from them, like Crysis being anything more than eye candy. I think they're showing this long before it's finished because they're afraid Nvidia is going to get too big a head start, hoping wild promises will slow the implementation.
 
From my experience, CryEngine sucks: an overbearing, wasteful engine. I wouldn't believe anything these guys claim until you actually play it. Truth be told, CryEngine has been a venomous memory hog since its creation. Good graphics, but way too much of a power hog compared to the end result. So their saying they can pull off ray tracing with very few resources leaves me, at the very least, skeptical, especially with an early release video, very little actual information, and no real in-game footage. Sounds like more BS from them, like Crysis being anything more than eye candy. I think they're showing this long before it's finished because they're afraid Nvidia is going to get too big a head start, hoping wild promises will slow the implementation.

The original Far Cry and Crysis were years ahead in terms of graphics quality, though. I have such fond memories of playing the original Far Cry; it's quite sad that Crytek hasn't been doing well with sales of their engine. They really need a sleeper hit like PUBG to come back to popularity.
 
You guys are a little quick to confirm that the GI is actually as good. We have no way of knowing if this is actually calculating path tracing in real time or if it's baked. The reflections look good, but that's only half the goal. It doesn't look like true GI to me.
That's because there's no GI; Crytek used ray tracing only for reflections.
 
RTX 2080 Ti owners should feel betrayed, lol. Pay $1k+ for a card and then realize the trick can effectively be done with a $300 card? Crazy questions come to mind. Why do we need RT hardware if we can run it in software? Is this Nvidia's attempt to make this tech proprietary? I'd bet Nvidia will partner with Crytek to make "RT-heavy" games that gimp AMD cards (just like the tessellation stuff?). RT is cool tech nonetheless!!
 
RTX 2080 Ti owners should feel betrayed, lol. Pay $1k+ for a card and then realize the trick can effectively be done with a $300 card? Crazy questions come to mind. Why do we need RT hardware if we can run it in software? Is this Nvidia's attempt to make this tech proprietary? I'd bet Nvidia will partner with Crytek to make "RT-heavy" games that gimp AMD cards (just like the tessellation stuff?). RT is cool tech nonetheless!!
Proprietary? They rely on Microsoft's DXR, and they pushed an RT extension to Vulkan...
 
Proprietary? They rely on Microsoft's DXR, and they pushed an RT extension to Vulkan...
You realize they have their own programmable hardware (RT cores), right? If a game is programmed to use this hardware, it requires this specific hardware. Their pushing an RT extension to Vulkan is plausible, but you'd guess they didn't do it for charity?
 
You realize they have their own programmable hardware (RT cores), right? If a game is programmed to use this hardware, it requires this specific hardware. Their pushing an RT extension to Vulkan is plausible, but you'd guess they didn't do it for charity?
No, you're wrong; that's what APIs are made for: abstracting the hardware so the same software can run on different hardware designs. That's why the same game can run on both NVIDIA and AMD, or the first Unreal runs on today's hardware (which is completely different from what was available at the time).
No company does anything for charity (except real charity initiatives ;)). Do you think AMD works for charity?
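A toy sketch of the abstraction argument above: the game only talks to an interface (the role DXR or Vulkan plays), and each vendor plugs in its own backend, whether that uses dedicated RT units or plain compute. The class names here are invented for the example and are not real API types:

```cpp
// Toy hardware-abstraction example: same "game" code runs on either backend.
#include <cstdio>
#include <memory>

struct RayTracer {                       // role played by the API (e.g. DXR)
    virtual ~RayTracer() = default;
    virtual void traceScene() const = 0;
};

struct HardwareRT : RayTracer {          // vendor backend with dedicated RT units
    void traceScene() const override { std::puts("tracing on RT cores"); }
};

struct ComputeRT : RayTracer {           // vendor backend falling back to compute
    void traceScene() const override { std::puts("tracing on generic compute"); }
};

// The "game" only sees the interface, so the same code runs on either GPU.
void renderFrame(const RayTracer& rt) { rt.traceScene(); }

int main() {
    std::unique_ptr<RayTracer> gpu = std::make_unique<ComputeRT>();
    renderFrame(*gpu);                   // swap in HardwareRT and nothing else changes
}
```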
 
GPU cards with specialized ray-tracing hardware + game engines with optimized ray-tracing code = the solution

This is what RTX is... and look, as of this moment, it is a total scam.

This smells like a lawsuit coming. Nvidia lied to its investors and its customers. This was supposed to be impossible without hardware acceleration. Guess what... typical Nvidia deception.
 
RTX 2080 Ti owners should feel betrayed, lol. Pay $1k+ for a card and then realize the trick can effectively be done with a $300 card? Crazy questions come to mind. Why do we need RT hardware if we can run it in software? Is this Nvidia's attempt to make this tech proprietary? I'd bet Nvidia will partner with Crytek to make "RT-heavy" games that gimp AMD cards (just like the tessellation stuff?). RT is cool tech nonetheless!!

Nvidia's strategy was to corner AMD over its *lack* of proprietary features. Also, GameWorks implementations in games sabotage AMD hardware efficiency. The Witcher 3 was a good example: the game made AMD cards flicker due to HairWorks; however, 1-2 months after release it was running WAY better than on Nvidia hardware. Then, when AMD patched the HairWorks issue, the game ran about 25% slower. My CrossFire setup with 2x 290X was running the game at 60 FPS at 4K; after the patch, I couldn't get past 40 FPS. All this because of GameWorks... and RT and DLSS are nothing more than this: GameWorks features... closed, proprietary technology that looks worse, gimps the competition's hardware... and runs like crap anyway.

This is RTX in a nutshell.
 
Nvidia's strategy was to corner AMD over its *lack* of proprietary features. Also, GameWorks implementations in games sabotage AMD hardware efficiency. The Witcher 3 was a good example: the game made AMD cards flicker due to HairWorks; however, 1-2 months after release it was running WAY better than on Nvidia hardware. Then, when AMD patched the HairWorks issue, the game ran about 25% slower. My CrossFire setup with 2x 290X was running the game at 60 FPS at 4K; after the patch, I couldn't get past 40 FPS. All this because of GameWorks... and RT and DLSS are nothing more than this: GameWorks features... closed, proprietary technology that looks worse, gimps the competition's hardware... and runs like crap anyway.

This is RTX in a nutshell.

Lol, back to the conspiracy theory? Remember when AMD was banking on Mantle and DX12 since Nvidia didn't have async compute on Maxwell? Yeah, Nvidia fixed that with Pascal, and now Turing is even more efficient with AMD tech than AMD itself (Wolfenstein II, Far Cry 5, and Rainbow Six Siege have AMD tech written all over them). If anything is to blame, it's AMD and its stagnation; hopefully they can turn it around with Navi, though...

And RTX is here to stay. If anything, I think DXR is easier to implement correctly than the shitty DX12, which so far has one game that runs better than in DX11 (Strange Brigade, **** game though).
 