Tomb Raider Tested, Benchmarked

It's so funny. Nvidia's drivers weren't ready! Stop the presses! Wait until they're ready before running the review! AMD's drivers are hardly ever ready; we'd never see a CPU/GPU test if that were the case.
I'm eagerly awaiting the response from hahahanoobs when the next game which favours Nvidia comes along...
 
hahahanoobs

I have never known someone to argue about something so lame. I mean seriously, there was nothing severely wrong with this review, at all...

wow...
 
"After testing each item, we found that changing DOF to normal while leaving everything else on ultra produced a 48% boost from 54fps to 80fps -- and that's with TressFX enabled as it is by default on ultra quality."

Ultra enables TressFX? I thought that's what Ultimate does? So does the 680 get 80fps avg with TressFX enabled and DOF set to normal (and resolution at 1920x1200)?
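For what it's worth, the quoted 48% checks out arithmetically -- a minimal sketch, using the fps figures straight from the quote above:

```python
# Frame rates quoted in the article excerpt above.
before_fps = 54.0  # everything on ultra, DOF on ultra
after_fps = 80.0   # everything on ultra, DOF dropped to normal

gain = (after_fps - before_fps) / before_fps
print(f"{gain:.1%}")  # -> 48.1%, matching the quoted 48% boost
```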
 
No, you are misunderstanding what I am saying there. Look at the screenshots of the quality settings on page 1. We are tweaking the quality settings there, NOT the quality preset.
 
As you can imagine, completing a performance review takes entire days and we usually try to move pretty fast to get you the facts as titles are released -- sometimes at the cost of missing a zero-day patch or driver update.

I totally understand that.

From your Update:
"It's somewhat safe to assume this will be addressed on an upcoming driver release."
^Something like that was what I thought was missing from the original story. To me, the end of that ("For now buy AMD") paragraph came off like: "We don't care about what nVIDIA may or may not be doing, so why should our readers? We won't even mention them." I wasn't expecting any specific dates. That would be insane. *looks at the guy that assumed that was what I wanted* Assumptions and/or thoughts from an experienced tech journalist (you guys) about what might be going on can be just as reassuring as actual proof or confirmation. Maybe someone was really looking to buy a GTX 670 for Tomb Raider, but after reading that you (Steve, representing TS) said AMD was the only option, they switch, then regret it when a driver and/or patch comes out a short while later to fix the issues.

In the particular case of Tomb Raider, we have performed a limited set of tests with the new patch and Nvidia graphics cards, and performance hasn't improved on the highest visual settings. It's somewhat safe to assume this will be addressed on an upcoming driver release.

^There was nothing wrong with your initial testing. I do not think your test was biased in any way, shape or form. I just didn't like how it was left as if AMD were the only option, with a blank stare at what the other guys (nVIDIA) were doing to fix their problems.

That was my issue with the review. If that makes me a bad guy... guilty.
 

Build 1.0.718.4 did nothing to improve performance of Nvidia cards in our test. Still waiting for another patch or updated Nvidia driver before we update our article.

I'm still gobsmacked that you think "For now" came off as, in your words, "We don't care about what nVIDIA may or may not be doing, so why should our readers? We won't even mention them."... Pretty amazing, but like I said, I'm not going to argue the point further. I have said all I need to, and you have circled your issue many times, so it's all there for others to read.
 
Just tested (benchmark only) the newest build with the 314.21 Beta drivers. DOF is no longer a problem; AA and TressFX are the new performance killers. I'm going to bed now, looking forward to playing a lot tomorrow. Performance-wise, I'm very pleased. I just hope the CTDs haven't returned.
 
Just tested (benchmark only) the newest build with the 314.21 Beta drivers. DOF is no longer a problem; AA and TressFX are the new performance killers. I'm going to bed now, looking forward to playing a lot tomorrow. Performance-wise, I'm very pleased. I just hope the CTDs haven't returned.
60% performance improvement in Tomb Raider. And yes, TressFX is a new technology that needs some polishing. It's like when tessellation made its official debut in DX11: it was taxing on systems until drivers matured and game developers made optimizations.
 
Re-run the bench
There are new beta drivers out from Nvidia and AMD. So far, with a GTX 670, the new GeForce 314.21 shows a significant performance gain -- quite a big one if you ask me.

Res 1920 x 1080 Dell SX2210
Core i7-3770 (non-K)
Corsair Vengeance 16 GB @ 1600MHz
EVGA GTX 670 FTW 2GB
Driver: GeForce 314.21
Tomb Raider v1.00.718.4

Ultra setting, no V-Sync, FXAA on
Min 67.9fps
Max 101.2fps
Avg 83.5fps

Ultimate setting, no V-Sync, FXAA on
Min 34.0fps
Max 62.3fps
Avg 48.2fps
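For context, the gap between those two presets works out as follows -- a minimal sketch using only the average figures posted above:

```python
# Average frame rates from the GTX 670 / GeForce 314.21 results above.
ultra_avg = 83.5     # Ultra setting, no V-Sync, FXAA on
ultimate_avg = 48.2  # Ultimate setting, no V-Sync, FXAA on

drop = (ultra_avg - ultimate_avg) / ultra_avg
print(f"Ultimate averages {drop:.0%} below Ultra")  # -> 42%
```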
 
How come in the CPU comparison they used an AMD FX-8350 vs the i7-3770K? Since when did the i7 become the go-to chip for gaming-build reviews when it doesn't provide much of a difference over the i5? Since the IB release, everyone knows that the i7 and its Hyper-Threading don't do much more for the extra $100 price tag. Why in the world isn't the i5-3570K on this list? It competes better with the 8350 and probably would have shown the same results. I really get sick of sites like this promoting the use of the i7 in gaming builds when it doesn't belong in a gaming build...
 
Don't worry - some of us are sick of guests coming here and saying "this is practically the same, why didn't I see benches for both?"
 
I'm sick of seeing reviews using the stupid i7-3770K when nobody in their right mind considers it a gaming chip...
 
I'm sick of seeing reviews using the stupid i7-3770K when nobody in their right mind considers it a gaming chip...

And why not? Please, tell us why the 3770K shouldn't be used as a gaming chip.
 
Because it doesn't provide a significant performance increase over the i5 and is $100 more expensive. Plain and simple, it's a workstation CPU, not primarily a gaming CPU. The performance increase in the games where it does provide one is very, very minor. MOST people who are building a gaming PC know this and put an i5-3570K in their machine instead of being mindless *****s who think the i7 gives a significant increase in performance...
 
How come in the CPU comparison they used an AMD FX-8350 vs the i7-3770K? Since when did the i7 become the go-to chip for gaming-build reviews when it doesn't provide much of a difference over the i5? Since the IB release, everyone knows that the i7 and its Hyper-Threading don't do much more for the extra $100 price tag.
HT has been beneficial for games for many years already (some more than others) and makes a noticeable difference in overall system snappiness as well. More and more games today take advantage of threads however they can get them; i7s rule the gaming charts, and that is not changing anytime soon -- especially when pushing multi-GPU setups at high single- and triple-monitor resolutions.
i5s are priced accordingly.
 
Because it doesn't provide a significant performance increase over the i5 and is $100 more expensive. Plain and simple, it's a workstation CPU, not primarily a gaming CPU. The performance increase in the games where it does provide one is very, very minor. MOST people who are building a gaming PC know this and put an i5-3570K in their machine instead of being mindless *****s who think the i7 gives a significant increase in performance...
I don't think anyone ever said the i7 provided a significant increase in gaming performance. But because it doesn't provide a great increase over the i5 (your suggestion as a 'gaming' CPU), it can't be considered a viable gaming processor? That doesn't make sense. It still provides more performance than the i5, does it not? It may not be a significant amount to justify the premium, but it is still a better processor all the same. I bet you think the GTX Titan shouldn't be considered a gaming GPU.
 
This guy thinks the i7 gives better performance with multi-GPU systems, lmao... ignorant much?
It's still pretty lame to leave out one of the most popular gaming CPUs, the i5-3570K, when most people on most forums don't compare the 8350 to the i7 anyway; on price point, it's the i5 that competes with it, especially in gaming... Wow, the logic of this person: "I bet you think the Titan isn't a gaming blah blah bull crap..." Come on now, you should know better than to spew nonsense when someone has a valid argument.
 
I agree -- the i7 isn't considered a "gaming CPU"; it's considered a workstation, high-volume chip, not a gaming chip. Does it perform well in games? Sure, but it's not a gaming chip. There are barely any games that benefit from Hyper-Threading; I don't know what this guy is smoking... Unless you plan on doing a lot of other things outside gaming, the i7 is not a good choice for a "gaming PC". It's a waste of money, a waste of $100, ESPECIALLY if you think it's going to provide more overall performance, or should I say a noticeable performance gain, over the i5. And I'd add, even in a multi-GPU/monitor setup... the i5 can handle any GPU configuration that fits on the 1155 socket as well as the i7; saying otherwise is just ignorance.
 
It's still pretty lame to leave out one of the most popular gaming CPUs, the i5-3570K, when most people on most forums don't compare the 8350 to the i7 anyway; on price point, it's the i5 that competes with it, especially in gaming... Wow, the logic of this person: "I bet you think the Titan isn't a gaming blah blah bull crap..." Come on now, you should know better than to spew nonsense when someone has a valid argument.

I made that claim because the only argument you were trying to make was "the i5 is better than the i7 because of the price". That would be true in terms of price/performance. But that isn't what we're talking about. We are talking about which chip is better in terms of performance only. The i7 is the better chip. Sure, the i5 can give just as good performance, but it isn't better. Do you understand? Last time I checked, there is no such thing as a CPU that is manufactured for 'gaming'. Yes, some are better than others, but that doesn't make those processors gaming chips. Sure, games aren't necessarily taking full advantage of HT, but that is irrelevant. Also, I brought up the Titan because, according to your logic, it shouldn't be considered a gaming GPU, since others can provide equal or better performance for a cheaper price.

This guy thinks the i7 gives better performance with multi GPU systems lmao.... ignorant much?
What are you even talking about? When did I even say this?
 
What are you even talking about? When did I even say this?
I think that was aimed at amstech's comment above yours.

@guest above: feel free to do your own benchmark and show TechSpot how it's done. I think you will find that no matter how you do your benchmark, there will be someone who thinks it should have been done differently. Read the benchmark, and if you are not happy with the results, you have the option to go somewhere else in search of happiness. By your own admission the results would be the same, so why not switch the CPUs in your head and avoid this crazy discussion of which CPUs you think should have been used?
 
I agree -- the i7 isn't considered a "gaming CPU"; it's considered a workstation, high-volume chip, not a gaming chip...

Could you talk any more crap? Intel have, you know, workstation chipsets and CPUs? "X79" springs to mind? Anyway, here's Intel's website:

http://www.intel.co.uk/content/www/uk/en/processors/core/core-i7-processor.html

As you can clearly see, they advertise the Core i7 as a high-performance, "top-of-the-line" gaming processor.
Sure, the i5 does nearly as good a job, but not quite as good as the i7. That usually comes down mainly to the extra L3 cache on tap, but we won't go into the technical details as it's all on the Intel website.

You'll also notice that, on the whole, in pretty much every i7 vs i5 review across the internet, the i7 beats the i5 by a frame or two in all games. But that's exactly it: it's only a frame or two. To some (such as yourself, I should imagine) that's a waste of $100; for someone like me, who wants the extra frame or two, I'm happy to pay the extra.

But as we were originally talking about pure performance, the i7 is more powerful than the i5 in almost every way.
If you would like to know what Intel workstation CPUs are and why they are different, here are a few links from TechSpot's own review of them and Intel's website:

https://www.techspot.com/review/465-intel-sandy-bridge-e-core-i7-3960x/
http://www.intel.co.uk/content/www/uk/en/processors/core/core-i7ee-processor.html

Even the workstation processors on the first page I found mention gaming. It seems people are prepared to pay another £500 for an even more powerful processor! Shock horror!
 
Great job agreeing with yourself, guest... I can see your IP, fella. FWIW, the last 5 guests are all the same guy, so I think we really need to stop feeding him. Nothing wrong with agreeing to disagree, in my opinion. I'm still a bit confused though... since the 3570K and the 3770K give such similar performance in most games (usually within 2-5%), why do you need additional benchmarks for the 3570K if you already have 3770K results? Also, even I, with a machine built specifically for gaming, will often do other things on that rig, such as audio/video encoding/transcoding, and those extra threads can really make a difference. For some people enough of this takes place to justify the $100 price difference, especially since it almost never hurts gaming performance.
 