TechSpot is dedicated to computer enthusiasts and power users. Ask a question and give support. Join the community here.

AMD plans new round of Radeon price cuts, Sleeping Dogs bundle

By Jos · 31 replies
Aug 21, 2012
  1. CMH

    CMH TechSpot Chancellor Posts: 2,051   +22

    Correct me if I'm wrong, but at 2560x1440 (or 2560x1600 on a 30") the bandwidth-starved 660 Ti loses some of its competitive edge against the 7950.

    Most reviews are done at 1920x1200 or 1080p, and I'll admit the numbers at those resolutions are in favor of the 660 Ti.

    I'm surprised no one has brought this up yet.
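    As a rough illustration of the point above (this is plain arithmetic, not figures from the article): the higher resolutions push far more pixels per frame, which is exactly where a narrower memory bus starts to bite.

```python
# Pixels rendered per frame at the review resolutions mentioned above
resolutions = {
    "1920x1080": 1920 * 1080,
    "1920x1200": 1920 * 1200,
    "2560x1440": 2560 * 1440,
    "2560x1600": 2560 * 1600,
}

base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    # Show raw pixel count and the multiple relative to 1080p
    print(f"{name}: {pixels:>9,} px  ({pixels / base:.2f}x vs 1080p)")
```

    At 2560x1440 a card is pushing about 1.78x the pixels of 1080p, and nearly 2x at 2560x1600, so bandwidth pressure grows accordingly.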
  2. dividebyzero

    dividebyzero trainee n00b Posts: 4,840   +1,268

    Up until the fairly recent phenomenon of $300 2560x1440 monitors, I doubt many people spending a sizeable amount of cash on a 2560x1440/1600 IPS screen would be shopping for a singular third-tier card. If the resolution becomes more widely used it could have some impact on the mainstream...although by the time that happens, we'll likely be on at least the third generation of 28nm GPUs.
    If you're looking for reviews indicative of real-world gaming, then 1920x1080 might fit the bill for most people, which is why it's the commonly used resolution - although a quick look at some reviews of the 660 Ti and their testing resolutions...
    1920 and below only (8 sites): Bjorn3D...Benchmark Rev...HiTech Legion..Cowcotland...LanOC...PureOC...OC3D...OCaholic...
    Includes 2560 (21 sites): Ninjalane...Guru3D...HT4U...Xbit...Hexus...Hardwareluxx...Legit Reviews...Hardware Secrets...Hot Hardware...TechSpot...Technic 3D...Hardware Canucks...Vortez...VR-Zone...Tech Report...Motherboards.org...Anandtech...Tweaktown...HardOCP...ComputerBase...Tom's Hardware
    Includes 5760 (10 sites): Neoseeker...RWL...Hardware Heaven...PurePC...eTeknix...Hardware.info...OCC...TPU...Bit-tech...PC Perspective...
    ...would indicate you probably need to update your bookmarks, especially as a large number of sites that put out multiple reviews tend to fall into the 2560 and 5760 categories.
    I did. But then, the GTX 660 Ti review thread seemed the appropriate place for it rather than a story about AMD price cuts.
  3. Blue Falcon

    Blue Falcon TS Addict Posts: 161   +51


    I totally agree with you regarding NV making $. From a business point of view, NV is the better-run company (balance sheet, cash flow, strategy -- the Tegra, Tesla and Quadro lines). However, strictly speaking on price/performance, I think AMD has the edge now.

    I see the HD 7870 has dipped to $230-240 on Newegg.

    Also, cards such as the MSI TF3 / Gigabyte 7950 are dropping to $310.

    Here is the question:
    - If I am not overclocking, why would I pay $60-70 extra for the 660 Ti?
    - If I am overclocking, why wouldn't I buy a 7950 and crank it to 1100-1150MHz, and while at it not even worry about that 1.5GB of VRAM issue (the third 512MB controller may hurt the 660 Ti's bandwidth, per AnandTech and other sites)?

    Obviously AMD was riding high prices since NV instead decided to focus on the mobile market (per your JDPeddie link, NV gained mobile discrete GPU market share). Now, though, AMD has responded, and from a consumer's (not shareholder's) point of view they are once again delivering better price/performance and more oomph for overclockers.

    Perhaps what makes the GTX 660 Ti sweet are those NV features (PhysX, CUDA, Adaptive VSync, TXAA), but overclocking-wise the 7950 should win. BTW, that HardOCP article you linked used an 800MHz 7950 against a 1300MHz+ GTX 660 Ti. Let's see what happens when he has a follow-up article.
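    For context, the bandwidth gap underlying this back-and-forth falls straight out of the two cards' published reference specs (bus width and effective memory clock); a quick sketch:

```python
# Peak theoretical memory bandwidth = effective memory clock x bus width in bytes
def bandwidth_gbps(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Reference specs: GTX 660 Ti = 6008MHz effective GDDR5 on a 192-bit bus,
#                  HD 7950    = 5000MHz effective GDDR5 on a 384-bit bus
gtx_660ti = bandwidth_gbps(6008, 192)   # ~144.2 GB/s
hd_7950   = bandwidth_gbps(5000, 384)   # 240.0 GB/s

print(f"GTX 660 Ti: {gtx_660ti:.1f} GB/s")
print(f"HD 7950:    {hd_7950:.1f} GB/s")
print(f"7950 advantage: {hd_7950 / gtx_660ti:.2f}x")
```

    On paper the 7950 carries roughly 1.66x the memory bandwidth, which is why the high-resolution comparisons keep coming up in this thread.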
  4. Blue Falcon

    Blue Falcon TS Addict Posts: 161   +51


    Also, Dirt Showdown is not a mysterious outlier. It's an AMD Gaming Evolved title. AMD is just working closer with developers on pushing a global lighting model and contact hardening shadows via DirectCompute shaders. Thus far 3 games on the market have been coded to take advantage of DirectCompute and some other GCN architectural enhancements, and they all perform faster on AMD cards.

    Dirt Showdown
    Sniper Elite V2
    Sleeping Dogs

    If AMD puts even more $ behind DirectCompute for games, Kepler will continue to suffer in all future titles that use DirectCompute for HDAO, global lighting, etc.
  5. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,516   +5,080

    That's only fair, since AMD cards suffer from a lack of CUDA support.
  6. dividebyzero

    dividebyzero trainee n00b Posts: 4,840   +1,268

    Yup. And that's what they should have been pushing since GPGPU became a factor in consumer graphics. Nvidia 2006...AMD 2012. Better late than never.
    Yup. IF. A big proviso when it comes to AMD. Some of us remember ATi's GiTG program, which also promised much...and promptly faded into obscurity. Hopefully AMD learns this time around...although it does bring up a couple of points:
    1. AMD is haemorrhaging red ink. It will be interesting to see what kind of triage system they use for project funding, and
    2. AMD's chief game dev point man recently departed for Intel, so your future world of AMD domination may not be so cut-and-dried
    BTW: All that's going to happen is that AMD releases AMD-centric titles, and Nvidia does likewise. Borderlands 2 is due for release soon, is it not...with TXAA support for Nvidia cards - 4x MSAA image quality for an FXAA/MLAA-level performance penalty...and as far as I'm aware, AMD cards still tank under extreme tessellation - you don't think that maybe Nvidia might exploit that weakness?
    (or the OCaholic review you linked to earlier)
    All this ends up meaning is that eventually your choice of IHV might be predicated upon the game sponsor and game engine - and if that's the case, then we as the consumers lose. I'd also bear in mind that:
    1. the image quality additions have to bring more to the game than the penalty for using them, and
    2. the IQ additions need to be something that can be used by cards down through the product stack if they do add substantially to gameplay
    I doubt it. For some obscure reason, people are thinking that Nvidia has abandoned compute because of the GK104, when the strategy was obviously to learn from the big-die mistake of Fermi (i.e. unnecessary double precision, 72-bit ECC memory support, large cache - none of which is required for gaming). AMD/ATi's success stemmed from getting the maximum number of usable GPUs per wafer. Nvidia is doing nothing more than branching its product line. If you think that a GK110 card won't leave Tahiti/Sea Islands in the shade in both compute and gaming, I think you'll be sorely disappointed in Q1 2013 (all production from September into the new year will be Teslas for HPC projects such as ORNL). Nvidia have been doing consumer compute since 2006 (G80) - I doubt that they have suddenly decided that they don't want 85% of the professional graphics market.
  7. CMH

    CMH TechSpot Chancellor Posts: 2,051   +22

    I got a bit sidetracked when howzz started talking performance :p
