Tom Clancy's The Division: Graphics & CPU Benchmarks

Steve

Staff member

The Division is set in an open world with immersive, destructible environments based on a mid-crisis Manhattan. As an agent, the player's mission is to restore order by investigating the source of a virus. Played from a third-person perspective, this shooter features varied weaponry, from high-caliber machine guns to sticky bombs. The environment is full of objects you can use to your advantage and take cover behind during firefights.

With The Division, Ubisoft aims to take graphics fidelity to the next level by enabling new graphical features in an open world environment. The game is built on the Snowdrop Engine, which focuses on dynamic global illumination, destruction and a number of cutting-edge visual effects.

In fact, the PC version has received a full complement of visual effects, including sub-surface scattering, spot shadows, ambient occlusion, chromatic aberration, depth of field, contact shadows, wind-affected snow, parallax mapping, vignette and volumetric fog.

Read the complete article.

 
"Perhaps I am just so acclimatized to horrible game launches that mostly working games now appear flawless."

Yes, you may well be. Take some time off! Oh, and thank you for being honest :)

There really isn't any incentive for me to pay for a new game and then work for the developers for free, which is what I believe is happening with games these days. Why don't game companies pay back these early adopters by offering them "beta-tester" status with special privileges that don't necessarily have to involve discounts? Do they even say "thank you for testing our game"?

I'm still optimistic. I think the only scenario that will make me quit serious gaming altogether is if Shigeru Miyamoto joins the fray and releases a buggy game just to patch it up later. Now that would be my version of doomsday.

Seriously, this needs to stop, now, and you guys over-publicizing the positives while only vaguely mentioning the negatives is a big part of it.
 
Thanks for including some 7970 GHz benchmarks. I'm still running two of those in CrossFire at 2560x1440 and I'm generally still happy with their performance.
 
This is one of those games I was looking forward to. Then I played the demo. I can safely say this is a generic game with nothing fabulous enough to make me pay more than £10 for it. See you in the sales, Ubisoft... You've done it again. Failed, I mean. Just change your name to USuck already.
 
This is one of those games I was looking forward to. Then I played the demo. I can safely say this is a generic game with nothing fabulous enough to make me pay more than £10 for it. See you in the sales, Ubisoft... You've done it again. Failed, I mean. Just change your name to USuck already.

Can confirm. Wait for the bargain bin GOTY version.
 
Sometimes I feel like we are being trained for the urban apocalypse, where the 1% will live behind their walled, defended enclaves with access to all that life has to offer, while the rest of us fight it out in the streets for the necessities of life, like real coffee (thinking of Orwell's 1984 here).
 
Just making a point about the CPU usage. I understand this was to test FPS in the game, but I have an idea for lowering CPU usage. In Rainbow Six: Siege, if you turn on Vsync and thus limit the FPS to 60, it lowers the CPU usage considerably: I was running at well over 100fps with my CPU maxing out, and with Vsync enabled it drops to a maximum of about 60% CPU usage.

That's just my two cents; I'm not sure if it would actually help here.
 
Sometimes I feel like we are being trained for the urban apocalypse, where the 1% will live behind their walled, defended enclaves with access to all that life has to offer, while the rest of us fight it out in the streets for the necessities of life, like real coffee (thinking of Orwell's 1984 here).

Yes, but even then game mechanics will apply: pay-to-win, like those bunkers that already exist for the rich. And of course they will have weapons with fancy skins, while all we will have is a few pistols, Molotovs, and an electric knife from the 1980s, not to be confused with something you might find in Dying Light.
 
Too bad it looks like total crap with 390Xs in CrossFire. Plus the flashing will drive you crazy. Without CrossFire the game looks great!

Aww, poo! Was that with an AMD CrossFire profile for the game?

That's with the latest 16.3 drivers and with a profile. You can't customize the Crimson game profiles as much as you could before 15.12.

With a single card the game doesn't flicker like that and the graphics are great. In CrossFire the graphics look washed out and blurry.
 
Just making a point about the CPU usage. I understand this was to test FPS in the game, but I have an idea for lowering CPU usage. In Rainbow Six: Siege, if you turn on Vsync and thus limit the FPS to 60, it lowers the CPU usage considerably: I was running at well over 100fps with my CPU maxing out, and with Vsync enabled it drops to a maximum of about 60% CPU usage.

That's just my two cents; I'm not sure if it would actually help here.
Sure, that would help lower CPU and GPU utilization, with benefits for power usage, heat and noise when full power isn't needed for 60fps. But this is a benchmark; it's supposed to max things out if possible. AMD has also included an FPS limiter in newer drivers, so you can cap your FPS even when Vsync is disabled (or not working in game menus, where FPS can get crazy high).
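To illustrate why capping lowers CPU usage: a limiter just sleeps away the unused part of each frame's time budget instead of rendering ahead. A minimal sketch in Python (the render_frame callable is a stand-in for illustration, not anything from the game or driver):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60fps

def run_capped(render_frame):
    """Run a render loop capped at TARGET_FPS.

    Sleeping through the leftover budget is what frees the CPU:
    instead of spinning out 100+ frames per second, the core idles
    for whatever part of the 16.7 ms the frame didn't need.
    """
    while True:
        start = time.perf_counter()
        render_frame()  # stand-in for the game's per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # idle instead of burning cycles
```

Real limiters typically spin-wait for the last millisecond or so because sleep timing isn't precise, but the principle is the same.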
 
If my GTX 970 ends up looking as bad in a few years as the GTX 770 does now, I'll go red team for my next GPU. Really good performance review; I was waiting for this benchmark.
 
Ah yes, this review truly highlights why I keep coming to TechSpot. That's quite a diverse line-up of cards, and big thanks on the CPU benchmarking as always.

I must say I find it very interesting that the FX-8320E's minimum frame rate is only a measly 8fps behind the i7-6700K's, while its maximum is just 2fps less. If I end up getting this game (and my friend keeps telling me to), I should be able to run it no problem with a few graphics tweaks on my overclocked FX-8320 and R9 390.

It's also very interesting to see how NVIDIA and AMD cards have aged over the years: a 280X nearly taking on the old Titan, an R9 290 pulling past a GTX 780 Ti, and the 7970, 7970 GHz, 280, 285, 380 and 280X all now encroaching on or flat-out outclassing the vanilla GTX 780.
 
That's with the latest 16.3 drivers and with a profile. You can't customize the Crimson game profiles as much as you could before 15.12.

With a single card the game doesn't flicker like that and the graphics are great. In CrossFire the graphics look washed out and blurry.
Thanks for the info, JetFixxxer.
 
Interesting article, but I don't see the R9 380X on the charts; did you forget to test that one? I also keep seeing the four-year-old Tahiti architecture doing very well in the latest games (at least those cards hold up well with a few graphics tweaks), while the previous-generation Nvidia cards keep sliding down the charts. How do you explain an HD 7970 performing on par with a GTX 780? Back in the day the HD 7970's direct competitor was the GTX 680, yet now the GTX 680 is almost at the bottom of the list.
 
While the Radeons are showing well, overall the performance of all the top-class GPUs is pretty underwhelming. Good to see new game engines really taxing these cards.
 
Any chance of you guys throwing a few mobile cards into these benchmarks? I would be really interested to see how the 980M, 970M and a few AMD cards handle this at 1080p.
 
All three Fiji cards are within 11fps of each other with the High Quality preset and 6fps with the Ultra preset @ 1080p. Same story @ 1600p. Yikes!
 
Appreciate your effort, guys.
But please, when you cut things short by using the in-game benchmark, pick just 3-5 cards and run a real-life test to prove that the in-game results are at least in the ballpark of real-world performance.
I know it varies drastically, but these in-game benchmarks are worthless if you don't even try to compare them.
And I would love to see line charts of the frame rates, and/or a note on whether the reported minimum is the single lowest value ("minimum/minimum") or a calculated "average minimum" (there's a quick sketch of the difference after this post). That's what matters for the deepest FPS dip you will actually hit.

But please tell me why you run the CPU benchmark at the lowest resolution used in your test? I can't understand it.
Please use the highest resolution next time, or both. I can't imagine four times the GPU load at 4K resulting in lower CPU usage.
And we'd love to hear more about total CPU usage in benchmarks generally, so thanks for that.
Normally no real-world system runs as clean as a benchmark system should. That's fine and necessary, but everybody should budget up to 10% extra for stuff running beside the game. If you're already slightly CPU-bottlenecked in the game on a bench system, that could mean light stutter or lower FPS when the same game runs on a real-world system with a higher background load.
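On the "minimum/minimum" vs "average minimum" point above, here is a quick sketch of how the two definitions diverge, using made-up frame times (the numbers are invented purely for illustration, not from this review):

```python
# Two ways to report a "minimum" frame rate from a frame-time log.
frame_times_ms = [16.7, 16.9, 17.1, 33.4, 16.8, 16.6, 25.0, 16.7, 16.9, 17.0]

fps = [1000.0 / ft for ft in frame_times_ms]

# "minimum/minimum": the single worst frame.
absolute_min = min(fps)

# "average minimum": an average over the worst slice of frames
# (the worst 20% here; reviews often use the worst 1% of a long run).
n = max(1, len(fps) // 5)
avg_min = sum(sorted(fps)[:n]) / n

print(f"absolute minimum: {absolute_min:.1f} fps")  # ~29.9 fps
print(f"averaged low:     {avg_min:.1f} fps")       # ~35.0 fps
```

One stutter frame drags the absolute minimum way down while barely moving the averaged low, which is why the two metrics can tell different stories about the same run.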
 
Any chance of you guys throwing a few mobile cards into these benchmarks? I would be really interested to see how the 980M, 970M and a few AMD cards handle this at 1080p.

If you have a 970M, just look for benchmarks comparing the 970M with the regular 970; if it's 20% slower there, it will be about 20% slower in the benchmarks here too.
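For instance, a back-of-the-envelope version of that estimate in Python (both inputs are placeholders, not figures from this review):

```python
# Rough mobile estimate; both inputs are placeholders, not measured values.
desktop_970_fps = 60.0   # hypothetical desktop GTX 970 result from a chart
mobile_deficit = 0.20    # assumed 970M performance gap vs. the desktop 970

estimated_970m_fps = desktop_970_fps * (1.0 - mobile_deficit)
print(f"estimated 970M: ~{estimated_970m_fps:.0f} fps")  # ~48 fps
```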
 