AMD is gonna kick some nVidia *** in Q3 lol, at least in terms of value. Fight!
Like the 670, there will be 4GB VRAM versions for $50 or so more. 2GB is enough for 1440p/1600p and below (for the most part, anyway).
One thing from the conclusion of Tom's review caught my eye, though.
AMD is STILL having frame time issues in that many titles? Looks like that might include single GPUs as well?
What are you basing that on? Or are you attempting to start a flamefest?
AMD are on record as saying that Tahiti will remain the primary focus of their high-end graphics through the remainder of the year*, so whatever AMD's response is (and they seem to have shelved a Sea Islands successor), it likely boils down to either offering a larger or more varied bundle (which has a certain appeal, although a limited one when you can buy the game coupons on eBay for cents on the dollar) or further price cuts.
Now, bear in mind that AMD have instituted two price cuts since the HD 7000 launch (April and August) as well as adding game bundles, while Nvidia's pricing has held steady, with the customer lucky to get a single game coupon; yet AMD continues to lose discrete graphics market share to Nvidia. Are you expecting AMD to come up with some other tactic?
As far as I'm aware, AMD's primary focus is reclaiming its lost mobile graphics share (which has taken a nosedive of late). They seem to have shelved the Sea Islands high-end and to be moving (along with Nvidia's Maxwell µarch) to 20nm process technology, likely combined with GDDR6 (since it is a subset of DDR4) to increase bandwidth whilst keeping the die area devoted to memory controllers at manageable levels (256- or 384-bit).
Both TSMC's 20nm process and GDDR6 look to be on a Q1 2014 timeframe, so I hope you aren't expecting too much from AMD's design team. AMD have made reference to "special announcements" at Computex, but given the dearth of valid information or AMD-sanctioned "leaks" concerning Sea Islands' fast-disappearing Curacao GPU, I don't think a new µarch will be unveiled anytime soon, although I'm certain a slide deck will be circulated.
* From AMD's Devon Nekechuk and pals
The stuttering issues with BioShock Infinite and Far Cry 3 don't appear to be frame pacing issues in the usually applied sense, but rather memory-management issues within the GCN architecture that haven't been ironed out, as I understand it. I was under the impression that BioShock had gotten better with driver updates.
I am basing my information on a rumor that surfaced recently:
http://wccftech.com/amd-radeon-hd-8000-series-reportedly-launching-q3-2013-specifications-revealed/
Their information about the 780 and 770 was correct, and I have read their articles for a few years. They are pretty spot on.
Also, when I said "Fight!", I meant AMD and nVidia.
2300 shaders on 28nm equates to about 470-500mm² (390-420mm² for the core plus ~80mm² for the memory I/O) at the transistor densities in use. That's a pretty big die to get up and running in the space of a few months, and a Q3 launch basically means that the chip has already taped out and they're into risk production already.
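The 470-500mm² figure can be sanity-checked with a quick back-of-envelope calculation. This is a sketch only: the per-shader density range and the ~80mm² memory-I/O figure are assumptions roughly scaled from existing 28nm GCN dies, not published numbers.

```python
# Back-of-envelope die-size estimate for a hypothetical 2300-shader 28nm GPU.
# Both constants below are assumptions, not vendor data.

SHADERS = 2300
MM2_PER_SHADER_LOW, MM2_PER_SHADER_HIGH = 0.17, 0.18  # assumed core-logic density range
MEMORY_IO_MM2 = 80                                    # assumed memory PHY/controller area

low = SHADERS * MM2_PER_SHADER_LOW + MEMORY_IO_MM2
high = SHADERS * MM2_PER_SHADER_HIGH + MEMORY_IO_MM2
print(f"Estimated die size: {low:.0f}-{high:.0f} mm^2")  # roughly 471-494 mm^2
```

With those assumed densities the core alone lands at about 390-415mm², and adding the memory interface gives a total in the 470-500mm² ballpark quoted above.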
As a note, it might have been helpful to state the foundation for the statement (i.e. the WCCF piece) rather than just a graphic, which doesn't impart any actual information.
Their original GTX 780 (Titan LE) piece called for 2496 cores and a 5GB framebuffer. WCCF basically just repackages news and speculation from other sites. Their GTX 770 info came directly from 3DCentre's extrapolation.
The Hainan numbers also don't gel with what AMD's Dave Baumann said on B3D's forum a few days ago.
We have updated pricing info to reflect the final $400 price point for the GTX 770:
Lol, I knew you would point that out. The graphic was just a joke... I was saying that AMD may win the battle in a few months. I didn't mean to provide any actual information.
Yes, WCCF just repackages news, but I still read them.
They may well do, but I was thinking along the lines of the high-end implications since we're talking about the GTX 770. Curacao could indeed eventuate, but as WCCF pointed out, it will be expensive (at least $600) and will have no effect on the present price structure of either company. It may force Nvidia's hand and lower the price of the GTX 780 (unlikely), but there's still daylight between $600 and the $400 that the GTX 770/HD 7970GE sit at.
Hainan, at best, would still sit below the HD 7970GE/7950 in performance going by the published spec wish list, and it would have been a Pitcairn and HD 7970M successor, so again it likely falls outside the performance segment.
It would be great to see a regular turnover of new models, but at present both Nvidia and AMD seem content to stand pat until 20nm makes the next µarch feasible. The only difference is that Nvidia are reaping some benefit from pursuing a dual strategy (full compute chip and pared-down gaming GPU), bringing GK110 to the masses whilst wringing the last drop of performance out of GK104 at the same time.
Yeah, I hope AMD doesn't go with 6GB of GDDR5... that would easily drive the price up to the rumored $600. No one needs that much memory at 1080p, and even someone running at higher resolutions would need a more powerful GPU, not more RAM. I can't wait to see what the 8950 will bring to the table. The rumored specs put a pretty large gap between the 8970 and the 8950. Also, why would AMD move from a 384-bit bus on the 7950 to a 256-bit bus on the 8950? It doesn't add up.
Don't worry I thought it was funny lol.
If the 8-series specs are right, then the 8970 is gonna rock the market with 6GB and more power than the current generation. Still, I want to build a system with air-cooled cards just to get that cooler; I can't stop raving about how awesome that aftermarket heatsink is.
The Windforce? Yeah it is really badass. The 6GB is gonna be great with the rise in 4K content, but I think the GPU performance has to improve before adding more RAM. It would make more sense to include 6GB of VRAM with the 9000 series.
Agreed, the 6GB may be nice if you grab a pair of them, but yeah, we need more power; who knows until the day the card arrives. At the moment, 3-4GB seems to be the magic number for high-res setups in this time frame.
3GB is the sweet spot; 2GB is beginning to be considered the low end (for the highest in-game settings). Especially with next-generation engines like Frostbite 3, I think more than 2GB will be a must.
Yeah, my 6990s have yet to have any issues, even in my setup at ultra settings, but I have a feeling Battlefield 4 may make them show some weakness, in which case I'll probably grab some new cards. I'm waiting, though, because I want to see all of these next-gen cards and Battlefield 4 first; then I'll be grabbing cards with a minimum of 3GB or more, depending on what's out and whether I go with more dual-GPU cards or 2-3 single-GPU cards.
How about getting the thread back on topic?
You have the opportunity to start another thread, or to use the Conversation feature, rather than clutter up this thread; I'm getting alerts only to see JC713's OT posts. I'll even give you a topic starter if you want to speculate. Can't say fairer than that.
Why did you not test the overclocking potential of this beautiful graphics card?
We ran out of time before the deadline. We have since had time to check out the overclocking: the core maxed out at 1210MHz, which only allowed for 1-2fps more compared to the factory overclock.
Thank you for the additional information!
It's good to see that NV surprised us with a $399 price, but it is not for a 4GB version. The 770 2GB doesn't look like a slam dunk, as AMD's partners are already dropping prices on 7970 GHz Edition cards and 2GB of VRAM is not enough for next-gen games/mods. You get four free games and, for $380, a card with 3GB of VRAM that's barely slower than a GTX 770:
Here's hoping the GTX 760 Ti brings HD 7970-level performance for $299 to shake up the market even more.
JC713: AMD is gonna kick some nVidia *** in Q3
^You should edit the comment you made above, since AMD said months ago they had no plans for a new GPU this year, you admitted it was a joke, and your "source" was a rumor.
I -=LOVE=- the replaceable-fan idea! I can't count how many crappy little GPU fans have died on me over the years.