Crysis 2 tessellation...that's a good-looking cement slab

red1776

I had not seen this posted here yet.
According to The Tech Report, it seems Nvidia's influence may have led to superfluous polygons in some odd places throughout Crysis 2.

http://techreport.com/articles.x/21404

 
From what I've read this happens all over Crysis 2 once you apply this patch. There will be high poly water rendered where there is no water.
 
From what I've read this happens all over Crysis 2 once you apply this patch. There will be high poly water rendered where there is no water.

Yeah, that is covered in the article as well. The interesting thing about this is the allegation that objects which need little to no tessellation (or merely texturing) are heavily tessellated on purpose to slow down AMD cards.
 
Looks like Nvidia's hand in the cookie jar this time around...although with Crytek involved it probably isn't as cut and dried as that, I'll venture.

Still more than a little surprising that in this day and age of near-instantaneous information flow, both companies (lest we forget AMD's dubious "quality" setting about-face in Catalyst not so long ago, as an example) see the need for black ops.

In the end, it seems more like a storm in a teacup.....it's Crysis 2....the game that every man and his dog has been taking pot-shots at since (before) it hit retail...and that supposedly no one is playing. As a bench (in DX11 Ultra) it has some value as an indicator of future tessellated games, in much the same way as Heaven or 3DMark 11 do - although I do note that a certain percentage of graphics benchmark followers - call it 50% for the sake of argument - would have the running of these deemed a capital offense....I personally find them more apropos than, say, the call for Bitcoin hash or F@H testing inclusion - at least in the context of game-oriented sites.

The game in general is too freshly baked to be a mainstay of a graphics review in any case; a quick succession of patches and game revisions becomes a minefield unless the reviews clearly state what game version is being used - and a lot do not.
 
In the end, it seems more like a storm in a teacup.....it's Crysis 2....

Well, I'm assuming that's it right there. They planned on, and assumed, that Crysis 2 was going to be the benchmark measuring stick for some time to come. I don't know how many are actively playing the game, but you can tell the reviewers are in love with it post-DX11 patch. I have to admit it looks like a million bucks now, amped all the way up.
 
...but you can tell the reviewers are in love with it post DX11 patch.
I'm not so sure about that.
I think you'll find Warhead is a part of more sites' benchmark suites than Crysis 2. Metro 2033, BFBC2, AvP and Far Cry 2 certainly are, and I'd hazard a guess that Crysis 2 probably doesn't feature much more often - if at all - than a raft of other DX11 titles (F1 2010, Shogun 2, CoP, Civ V).
 
I'm not so sure about that.
I think you'll find Warhead is a part of more sites' benchmark suites than Crysis 2.

Really? I have gotten a distinctly different feel from the C2 DX11 reviews and benchmark game blurbs. I get the "best use of DX11 to date" feel. I will bet on your take, though; you always seem to have the numbers to back it up.
What I don't get is why Far Cry 2 is still part of the lineup.
 
Force of habit/inertia? ...or possibly that its variance isn't large regardless of base system config.

As it happens, I was just going on recollections from reading reviews...but since I had a spare fifteen minutes, I pulled up the latest reviews from the general sites I have bookmarked and padded the list out with a few other generally well-known sites. Of the 37* producing a review on a regular basis, here are the benchmark occurrences:
27: Metro 2033
26: AvP
24: 3dMark11
22: Unigine Heaven
15: Far Cry 2, BF:BC2
14: Vantage, Lost Planet 2
13: Mafia II (DX9 and 11), Just Cause 2
12: Crysis Warhead, Call of Pripyat, DiRT3
11: F1 2010
9: Civ 5, DiRT2
8: Shogun 2
7: Crysis 2 (DX11)
6: Battleforge, HAWX 2, Starcraft 2
5: Crysis 2 (DX9), Dragon Age 2, Resident Evil 5
4: Batman AA, CoD:Black Ops
3: CoD:MW2, Crysis, HAWX, L4D2
2: Bulletstorm, CoD4:MW, Stone Giant, UT3, Witcher 2, Wolfenstein, World in Conflict
1: Anno 1404, ArmA II, Brink, Chronicles of Riddick, FEAR3, MoH, Portal 2, Shift 2, Splinter Cell: Conviction, WoW: Cataclysm

(* bit-tech, TS, OC3D, Anand, Hexus, Hardware France, HT4U, Tech Report, Tom's, Madshrimps, OCC, OCAU, ABT, OCIA, X-bit, Tweaktown, TPU, [H], PC Per, Guru of 3D, Hardware Heaven, Hardware Canucks, Hot Hardware, Techware Labs, Overclockers.com, Legit Reviews, iXBT, eteknix, Vortez, Kitguru, Bjorn3D, Benchmark Reviews, Ninjalane, Hi Tech Legion, Neoseeker, Motherboards.org, Hardware Secrets)
 
7: Crysis 2 (DX11)

...See....I told ya :p:haha::rolleyes:
Well, there's a faulty recollection, ey?

Looks like Metro: Last Light will become the new standard then, I'm betting.
Well, that's a C&P laziness + brand allegiance list, isn't it?
 
Looks like Metro: Last Light will become the new standard then, I'm betting.
Quite probably - if the game is a financial success, garners favourable reviews, and has a predictable game benchmark included. Most sites seem to fall back on demos rather than actual gameplay, so the latter is probably paramount.
Well, that's a C&P laziness + brand allegiance list, isn't it?
It features most of the AAA titles available - if not the most diverse range of game styles. As for GPU vendor input (if that was the "brand allegiance" part), there's a fairly even split between Nvidia (M2033, Far Cry 2, LP2, Mafia II, Crysis 2, HAWX/HAWX2) and AMD (BFBC2, F1 2010, Shogun 2, DA2, CoP, DiRT2, DiRT3, AvP).

Personally, I'm waiting on someone to use the Unreal Engine 3. After seeing the Samaritan demo, that's the kind of DX11 game that would sell cards (and break more than a few current ones).
 
Personally, I'm waiting on someone to use the Unreal Engine 3. After seeing the Samaritan demo, that's the kind of DX11 game that would sell cards (and break more than a few current ones).

I watched that a few times, and while doing so was wondering if there are cards now that can run this level of ray tracing. It looks like it's going to make a 3GB+ buffer the norm.
 
Yes and no...or sort of. The demo ran on a triple GTX 580 setup - so we're probably a couple of GPU generations away from it being comfortable on an "average" enthusiast system. Samaritan's ray tracing is actually fairly rudimentary...so there's still some performance chasing to be had in future years.
 
Samaritan's ray tracing is actually fairly rudimentary..

So, was that amount of horsepower needed for tessellation (I guess I'm using that generically to cover shading as well) or physics?
In other words, what aspect(s) are responsible for the tripling of GPU power needed?
 
As agissi noted, Samaritan uses ray tracing in some of the lighting calculations (some reflections, in this case), but conventional rasterization in combination with shadows, refraction, DoF (bokeh) etc. (i.e. each individual billboard light source is individually rendered, then the reflection rays are ray traced while doing screen-space ray marching to provide the correct amount of occlusion - hope that reads OK). We're still a while away from completely ray-traced global illumination...although if compute functions are picked up by AMD as well as Nvidia then we're obviously getting closer to that point (software application following hardware implementation). Refractions - as noted by agissi - are a product of rasterization at the present time, and obviously not limited to just the UE3/Samaritan demo, since the technique is presently used in a number of already-released and near-future games, if you browse the SIGGRAPH paper (last link).

Epic's Unreal Engine 3 PowerPoint presentation (PDF)

The recently concluded SIGGRAPH 2011 (as per usual, almost completely ignored by all and sundry) had an interesting - at least for me - paper on advances in real-time rendering.
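To make the screen-space ray marching mentioned above a bit more concrete, here's a toy sketch in Python. Everything in it is made up for illustration (1-D "screen", invented depth values, a hypothetical `ray_march` function) - real SSR marches a reflected ray through a 2-D depth buffer in view space, but the step-and-compare idea is the same:

```python
# Toy 1-D screen-space ray march: step a reflected ray across a depth
# buffer and report the first pixel whose stored depth occludes the ray.
# Illustrative only - real SSR marches in 2-D screen space.

def ray_march(depth_buffer, start_x, start_depth, dir_x, dir_depth, max_steps=64):
    """Return the pixel index the ray first passes behind, or None."""
    x, d = float(start_x), float(start_depth)
    for _ in range(max_steps):
        x += dir_x
        d += dir_depth
        px = int(x)
        if px < 0 or px >= len(depth_buffer):
            return None              # ray left the screen: fall back to a cube map
        if depth_buffer[px] <= d:    # stored surface is in front of the marching ray
            return px                # sample the reflection colour from this pixel
    return None

# A flat floor at depth 5.0 with a pillar (depth 2.0) covering pixels 10-12.
depths = [5.0] * 20
for i in (10, 11, 12):
    depths[i] = 2.0

hit = ray_march(depths, start_x=2, start_depth=4.0, dir_x=1.0, dir_depth=-0.25)
print(hit)  # -> 10: the marching ray passes behind the pillar's stored depth at pixel 10
```

If the march runs off-screen, the reflection has to come from somewhere else (typically a cube map), which is one reason hybrid engines keep both paths around.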
 
I would assume it's the ray tracing, i.e. lighting calculations and refractions.

I guess I was wondering out loud in a more specific fashion after DBZ's statement about UE3's ray tracing being rudimentary - like, how far are we from Daniel Pohl's research, or for that matter, from Larrabee's failure? (I mean that insofar as it didn't get to the point it was intended, BTW :p) From what I understand of the ray tracing algorithm (admittedly very limited), the code that needs to be cracked is that in addition to the 'intersection' or object ray, the calculations from the secondary rays (reflection, refraction, and shadow) cause entirely too many calculations to make it viable as a method of replacing rasterization and/or the use of cube maps for reflections in a game program. The problem of inter-reflections also seems to be a tough nut to crack, as (from what I have read) it would also require full global illumination and all of the calculations from the entire ray trace algorithm.
I will give those a good read, Chef; maybe it will provide some answers...or at least get me up to date.
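The "entirely too many calculations" point can be put in rough numbers. A minimal sketch, assuming a classic Whitted-style tracer where every hit spawns one reflection ray, one refraction ray, and one shadow ray per light (the function and the counting model are illustrative, not any particular engine's):

```python
# Back-of-envelope ray budget for a classic Whitted-style ray tracer:
# each hit spawns one reflection ray, one refraction ray, and one shadow
# ray per light, so the secondary-ray tree grows geometrically with depth.

def rays_per_pixel(depth, lights):
    """Total rays traced for one pixel up to the given bounce depth."""
    total, frontier = 0, 1             # one primary ray to start
    for _ in range(depth + 1):
        total += frontier              # trace this generation of rays
        total += frontier * lights     # each hit fires a shadow ray per light
        frontier *= 2                  # reflection + refraction carry the tree on
    return total

for d in range(5):
    print(d, rays_per_pixel(d, lights=2))   # 3, 9, 21, 45, 93 rays per pixel
```

Multiply the depth-4 figure by a couple of million pixels at 60 frames per second and the "TMI" problem is obvious - which is why the research effort goes into culling rays and acceleration structures rather than brute force.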
 
I don't think anyone is seriously regarding ray tracing as "replacing rasterization". The gains at present (and for the foreseeable future) would dictate that both should co-exist within the game engine - as is widely the case now in various hybrid raytrace/raster engines (Pixar's RenderMan, CryEngine 3, UE3 etc.). If rasterization can provide the necessary level of world detail, then there isn't much to be gained by trying to duplicate the effect using ray tracing for anything other than a technology demonstrator/benchmark.
Might pay to check out Stanford's paper on Interactive k-D Tree GPU Raytracing (various format links >>here<<)

Ray tracing really is of most use for highly reflective/refractive surfaces. For the rest of the game world, I think you'll find that rasterization is much easier to implement, a lot less taxing on resources, and eminently more flexible to program for.
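On the k-D tree point: the whole reason for such structures is turning a per-ray linear scan over every object into a logarithmic search. A toy 1-D stand-in (sorted, non-overlapping intervals playing the part of object bounds - a real k-d tree handles the general 3-D case, and builds its index once rather than per query as this sketch cheats and does):

```python
# Why acceleration structures matter: a flat scene tests every object per
# ray; a spatial index gets the same answer with a logarithmic search.
# 1-D stand-in using sorted, non-overlapping intervals as the "objects".

import bisect

def brute_force(object_bounds, x):
    """Linear scan: one intersection test per object until a hit."""
    tests = 0
    for lo, hi in object_bounds:
        tests += 1
        if lo <= x <= hi:
            return True, tests
    return False, tests

def partitioned(sorted_bounds, x):
    """Binary search the interval starts, then one final intersection test."""
    i = bisect.bisect_right([lo for lo, _ in sorted_bounds], x) - 1
    if i >= 0 and sorted_bounds[i][0] <= x <= sorted_bounds[i][1]:
        return True, 1
    return False, 1

objects = [(i * 10, i * 10 + 5) for i in range(1000)]
print(brute_force(objects, 9995))   # (True, 1000) - every object tested
print(partitioned(objects, 9995))   # (True, 1) - one test after the search
```

Same answer, three orders of magnitude fewer intersection tests - which is essentially what the Stanford k-D tree work is chasing on the GPU.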
 
It has been my understanding that there are/were researchers and organizations attempting to replace rasterization with pure ray trace rendering by perfecting the ray trace algorithm. From what I understand, this includes complete global object and light spatiality, and eliminating the 'TMI' scenario they currently have, with the algorithm's primary and secondary rays producing superfluous calculations. Perhaps this concept has been abandoned now.
 
Yup, there is plenty of work being done to reduce the computational overhead of ray tracing - you don't have to look too far on the net to see that. Unfortunately (for gamers), a great percentage of the work is aimed at industrial graphics (movies, presentations, simulations) rather than gaming. Hardly surprising when there is a level of intransigence and distrust of anything new made available to gaming (note the addition of the bokeh filter and realistic water effects in BFBC2 if your card is CUDA capable, the hue and cry that tessellation in Metro 2033 is too OTT, et al.). You'll note that a lot of university ray trace studies and research have ties to (or at least support from) Nvidia and, to a lesser degree, Intel, through their OptiX and Embree software suites...so I think it's fair to say that ray tracing will continue to be derided by a certain section of hardware champions until (if) they get sufficient traction from their association with OptiX. Unfortunately, at this stage it is all about research - which means funding - with the payoff some time in the future, when an entry-level GPU can churn out Crysis at 100fps and the graphics houses need a reason for people to upgrade and keep buying.

Gamers in general are an oddball bunch to cater for. On one hand they call for something new, like a spoiled child, every five minutes, whilst simultaneously deriding what's placed before them. And like a great percentage of supposed technophiles, they have an inbuilt distrust and fear of anything new. Case in point: IBM's cognitive computing chip story here at TS. How quick were people here to equate cognitive computing with murderous robots?....Gee, and I thought that some people might be above the level of movie-induced hysteria. Not.Even.Close.
 