New AMD graphics chief David Wang says the company will release new GPUs annually

Something to look forward to: The GPU market has been a bit stagnant lately. Nvidia hasn't released a new series of video cards in some time, and AMD hasn't been any quicker with new releases. However, AMD's new graphics chief, David Wang, is shaking things up. At Computex, Wang said AMD would shift to an annual GPU release cycle to make the business "more fun" for consumers.

Although Nvidia has a commanding lead in the video card market, it's no secret that enthusiasts and hardcore PC gamers have a few issues with the way the company does business.

While the recent controversy surrounding the GeForce Partner Program is one example of this, Nvidia's unpredictable GPU release schedules are also the source of some frustration among gamers.

The last GPU generation the company released was their GTX 10-series cards, which launched in 2016. Gamers who have been eagerly awaiting the company's next GPU line-up recently got some unfortunate news when Nvidia CEO Jensen Huang said it would be a "long time" before their GTX 11 or 20-series cards hit the market.

For the time being, it seems the company is more interested in developing hardware that's geared towards AI and machine learning.

As unfortunate as that may be, PC gaming enthusiasts may have found their savior in AMD's new graphics chief, David Wang. At his first public appearance during Computex this week, Wang claimed AMD is fully committed to releasing a new GPU product every single year.

Wang admits that his company has lost a bit of momentum in the GPU market lately, adding that he wants to "make this business more fun" by returning to a more consumer-oriented GPU release cadence.

According to PCWorld, this annual release cycle could include entirely new GPU architectures, process changes, or "incremental architecture changes." Even if the changes aren't always significant, hardware enthusiasts will undoubtedly be pleased to see AMD make an effort to release more products, more regularly.

It's not clear when AMD plans to release their next GPU, but hopefully, we'll see it sooner rather than later.

Image credit: PCWorld


 
If a new card release means film-like graphics capability in games within the next year, then that's a good idea! Waiting years for consumer high-end cards like Volta is pointless.
 
I guess the next card will be released in 364 days.

Why are people so upset about no new Nvidia cards? Hell, is that GTX 1080 Ti not doing it for you anymore? Maybe a yearly refresh is fine if it results in tangible improvements, but you cannot expect large architectural improvements each year. If Nvidia releases next-gen cards in 2019, that's fine by me. Until we see truly next-gen games pushing the current cards to breaking point, why would they rush out a new card?
 
With the Xbox One, PS4, and PCs using AMD's chips, they could very well take the lead. Both Intel and Nvidia have become so comfortable and complacent holding their market segments that AMD has been able to work quietly and steadily to sneak in on them.
 
I guess the next card will be released in 364 days.

Why are people so upset about no new Nvidia cards? Hell, is that GTX 1080 Ti not doing it for you anymore? Maybe a yearly refresh is fine if it results in tangible improvements, but you cannot expect large architectural improvements each year. If Nvidia releases next-gen cards in 2019, that's fine by me. Until we see truly next-gen games pushing the current cards to breaking point, why would they rush out a new card?

That's not how it works. Powerful graphics cards come out first and then games take advantage of them. Not the other way around. Developers can't make games that require more power than what's available.

The lack of a next generation of cards simply means that games are being held back.
 
That's not how it works. Powerful graphics cards come out first and then games take advantage of them. Not the other way around. Developers can't make games that require more power than what's available.

The lack of a next generation of cards simply means that games are being held back.

I wonder about developers when they start out on their next game and form their vision of what that game is going to be and entail. They must have specific hardware in mind to develop with and for, but while they're in development, hardware moves forward too. Do they stop to consider what the new hardware will allow them to do with their project, or do they just stick with the parameters they started out with? It has to be maddening at times, causing recalculations, delays, and cost overruns to create the best experience possible.
 
That's not how it works. Powerful graphics cards come out first and then games take advantage of them. Not the other way around. Developers can't make games that require more power than what's available.

The lack of a next generation of cards simply means that games are being held back.

I wonder about developers when they start out on their next game and form their vision of what that game is going to be and entail. They must have specific hardware in mind to develop with and for, but while they're in development, hardware moves forward too. Do they stop to consider what the new hardware will allow them to do with their project, or do they just stick with the parameters they started out with? It has to be maddening at times, causing recalculations, delays, and cost overruns to create the best experience possible.
Their biggest limit is what current hardware allows them to do (for both development and testing). Even if you're working towards making the best game ever, you can't count on GPUs becoming good enough to play it somewhere during development. A simple delay or a botched GPU launch can spell disaster for you.
 
I would gladly buy an AMD card if it had a better performance/price ratio than GeForce products. It's just been ten or more years since AMD released anything that can compete with them; the Vega stuff is great in theory, but the actual store prices make no sense. As I don't want to pay Nvidia $1,000+ for a graphics card and $2,000+ for a G-Sync monitor to go with it, I could spend that money on an AMD card and a FreeSync monitor if they made anything compelling. I can run most things maxed at 1080p at 100+ fps, and I'm not getting a 4K monitor before better GPUs arrive, as even a 1080 Ti can't manage 4K at 100+ fps in most games.
 
I guess the next card will be released in 364 days.

Why are people so upset about no new Nvidia cards? Hell, is that GTX 1080 Ti not doing it for you anymore? Maybe a yearly refresh is fine if it results in tangible improvements, but you cannot expect large architectural improvements each year. If Nvidia releases next-gen cards in 2019, that's fine by me. Until we see truly next-gen games pushing the current cards to breaking point, why would they rush out a new card?
Nobody will develop next-gen games if there are no video cards to program for. Developers would never try to push a GTX 1080 Ti because that's maybe 0.01% of graphics card owners... but they might try to push the limits of a GTX 1070, and we've already seen games that can do that easily at 4K.
 
It's not like AMD hasn't already been doing that. There were only, what, four releases of Pitcairn XT? (7870 -> 8870 -> 270X -> 370X.) One release a year. And it wasn't "fun"; it was a boring, tedious slog through AMD's inability to compete or modernize their GPUs with new features while Nvidia ate their lunch.

We don't need yearly releases; we already have that. What we NEED is for AMD to release competitive architectures on time, not two years after Nvidia's release.

Actually competing would be exciting. More Rebrandeon is not.
 
"enthusiasts and hardcore PC gamers have a few issues with the way the company does business."

For me personally, I haven't had a stable AMD video card in a good 15 years. I wouldn't consider releasing crappy drivers "doing business." Every one I've bought has had at least one major issue, and drivers are the culprit. And every time, AMD either ignores me or couldn't care less and never releases a fix. Droves of customers post about it everywhere on the internet, but we are all ignored. Every Nvidia card has been flawless. I'm no fanboi, and I like competition, but I do care that I get something that works for my money. I doubt I will ever buy AMD again unless Nvidia goes south. Endless headaches and countless hours lost to troubleshooting.
 
"It's just been ten or more years when AMD hasn't released anything that can compete with them,.."

Really? Wasn't the R9 290X/290 the fastest gaming GPU when it was released in late 2013? What, over four years ago? That's not 10 years ago. And wasn't the Fury X trading blows with the 980 Ti when it was released in 2015? Yep, that was three years ago, not 10. It's really only Vega in 2017 that was truly late to market (the HBM2 gamble lost), and the only reason it didn't take the performance crown was that Nvidia took a slightly cut-down version of its $1,200 Titan XP and sold it as the 1080 Ti.
 
I would gladly buy an AMD card if it had a better performance/price ratio than GeForce products. It's just been ten or more years since AMD released anything that can compete with them; the Vega stuff is great in theory, but the actual store prices make no sense. As I don't want to pay Nvidia $1,000+ for a graphics card and $2,000+ for a G-Sync monitor to go with it, I could spend that money on an AMD card and a FreeSync monitor if they made anything compelling. I can run most things maxed at 1080p at 100+ fps, and I'm not getting a 4K monitor before better GPUs arrive, as even a 1080 Ti can't manage 4K at 100+ fps in most games.
Even my 1080 Ti struggles to get over 100 FPS at 1080p in some games. No clue why.
 
Really? Wasn't the R9 290X/290 the fastest gaming GPU when it was released in late 2013? What, over four years ago? That's not 10 years ago. And wasn't the Fury X trading blows with the 980 Ti when it was released in 2015? Yep, that was three years ago, not 10. It's really only Vega in 2017 that was truly late to market (the HBM2 gamble lost), and the only reason it didn't take the performance crown was that Nvidia took a slightly cut-down version of its $1,200 Titan XP and sold it as the 1080 Ti.
Yes, the 290/290X were not only performance-competitive, they were really price-competitive. They made Nvidia outright panic and rush out the 780 Ti.

The Fury X, OTOH, was a completely different can of worms, and first-gen HBM proved to be just as much of a screw-up as HBM2. It was more expensive than the 980 Ti, ran hotter, required mounting a radiator to run, and performance-wise only sometimes traded blows. Other times, the Fury X was closer to the 980 in performance, depending on the game, and it couldn't OC at all, while the 980 Ti was an OC champ with lots of extra performance on tap. The Fury X was the start of AMD's slide into non-competitiveness, along with the entire rebranded 300 line, finished off with the 3dfx-esque strategy of splitting meager resources on a Hail Mary.

And no, Vega didn't fail to take the performance crown because of just the 1080 Ti. It didn't take the crown because, at launch, it couldn't outperform 16-month-old GPUs while pulling more power and having much larger dies to boot, WHILE BEING MORE EXPENSIVE (remember, the 1080's MSRP is $450; the Vega 64's MSRP was $650, or $550 for a "promotional model" that was a glorified bait and switch). Vega was a disastrous launch, and it will slot in with the likes of the GeForce FX 5000 series as a failure of an architecture.
 
The last truly great AMD card I can remember was the 4870, which shaded the GTX 260 so much for its price that Nvidia had to revise it and add a new SKU with more shaders. The 5870 also launched with DX11 hardware six months before Nvidia responded properly with their initial Fermi parts. AMD's GPU division was on top of its game, even if those cards still carried ATI branding!

Since then it's been a fair amount of meh, and Nvidia haven't had a great deal to worry about. They took my money without too much debate required.
 
The last truly great AMD card I can remember was the 4870, which shaded the GTX 260 so much for its price that Nvidia had to revise it and add a new SKU with more shaders. The 5870 also launched with DX11 hardware six months before Nvidia responded properly with their initial Fermi parts. AMD were on top of their game.

Since then it's been a fair amount of meh, and Nvidia haven't had a great deal to worry about. They took my money without too much debate required.
Funny, I remember the 290X doing that exact same thing with the GTX 780.
 
I would gladly buy an AMD card if it had a better performance/price ratio than GeForce products. It's just been ten or more years since AMD released anything that can compete with them; the Vega stuff is great in theory, but the actual store prices make no sense.

Try 2013 with the 290X. That card ROFLstomped Nvidia's best (780 and 780 Ti). They had to release the first Titan to compete with it.

It wasn't until 2015, with the 980 Ti and Titan X, that Nvidia started to take the performance crown, and even then the delta was only 15-20%.

You guys make it seem like AMD has been lagging far behind Nvidia.
 
Funny, I remember the 290X doing that exact same thing with the GTX 780.

Nah, I think you're mistaken.

The 290X was enormous, ridiculously hot and noisy, as well as more expensive than the GTX 780 when it launched in October 2013. Sure, it massively undercut the price of the Titan at the time, but Nvidia already had the 780 Ti in the works, hence it being on shelves within just a few weeks.

At which point the 290X was fairly unattractive and comfortably outperformed all round. It didn't overclock well either, as it was pushed so close to its limits at stock, whereas you could get a good 780 close after tweaking.
 
Nah, I think you're mistaken.

The 290X was enormous, ridiculously hot and noisy, as well as more expensive than the GTX 780 when it launched in October 2013. Sure, it massively undercut the price of the Titan at the time, but Nvidia already had the 780 Ti in the works, hence it being on shelves within just a few weeks.

At which point the 290X was fairly unattractive and comfortably outperformed all round. It didn't overclock well either, as it was pushed so close to its limits at stock, whereas you could get a good 780 close after tweaking.

You have a nice selective memory...that is ultimately wrong.


https://www.techspot.com/review/727-radeon-r9-290x/

The GeForce GTX Titan blew us all away eight months ago with its mindblowingly fast GPU, cramming 7080 million transistors into a 561mm2 die to provide massive processing power and bandwidth. The catch, of course, was that Nvidia wanted (and still wants) $1,000 for it -- a sum that didn't necessarily seem to prevent cards from flying off shelves even though it's more than our entire entry-level rig.

Nvidia followed up three months later with the equally impressive GTX 780 for a more plausible $650, where it remains today. Neither of those cards had much of an impact on AMD's sales as the company's most expensive offering at the time was a $450 Radeon HD 7970 GHz Edition (the 7990 arrived a few months later).

Page 11, Conclusions: Radeon R9 290X vs. GeForce Titan vs. GTX 780
Some thought it was impossible, but there's no doubt the Radeon R9 290X is every bit as fast as the mighty GeForce GTX Titan.

What's particularly impressive about this comparison is the R9 290X's shockingly low price of $550. That's 15% cheaper than the GTX 780 despite being 10% faster, in addition to costing 45% less than the GTX Titan.

We're impressed with the R9 290X's performance and even more so with its price. AMD has finally given Nvidia a reason to discount its top tier products and that's enough to be grateful for. There's something to appreciate in watching healthy competition between two seasoned rivals. If you've been eyeing the Titan, it seems about time to pull the trigger, whether you take advantage of pending price cuts or settle for the R9 290X now.
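
For what it's worth, those percentages check out against the quoted prices. Here's a quick sanity check (a minimal sketch using only the launch prices from the quote; the "10% faster" figure is the review's own benchmark average, not something derivable from prices):

```python
# Sanity check of the price deltas quoted in the TechSpot review.
# Launch prices from the quote: R9 290X $550, GTX 780 $650, GTX Titan $1,000.
r9_290x, gtx_780, gtx_titan = 550, 650, 1000

print(f"vs. GTX 780:   {(gtx_780 - r9_290x) / gtx_780:.0%} cheaper")      # ~15%
print(f"vs. GTX Titan: {(gtx_titan - r9_290x) / gtx_titan:.0%} cheaper")  # 45%
```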
 
You have a nice selective memory...that is ultimately wrong.


I think not. That review is dated 24 October 2013.

Your problem is that you need to fast-forward less than three weeks to this review, from 12 November 2013:

https://www.techspot.com/review/738-gigabyte-geforce-gtx-780-ti-ghz/

There the 780 Ti soundly beats the R9 290X, and what is essentially an overclocked GTX 780 (achievable with most decent partner boards) has no trouble facing it down either, despite not costing more and displaying less frightening temperature characteristics!

The R9 290X was only OK for its price/performance just a month after it arrived. The 290 was by far the best choice from AMD's lineup at that time.

I had a GTX 780 knocking on 1,100 MHz, comfortably facing down R9 290X performance for less outlay and less shocking temperatures...
 
I think this is all anyone needs to read to see who knows what they are talking about:


As proven completely incorrect by the link to TechSpot's initial review of the 780 Ti in my post immediately above.

Thanks for playing, though; collect your cuddly toy consolation prize on the way out.

I got my Nvidia releases mixed up: Titan, then 780 Ti. The 290X still stomped the 780 and Titan, and was neck and neck with the 780 Ti.

But you called the 780 Ti an overclocked 780, lol. Don't let the door hit you on the way out.
 
That's not how it works. Powerful graphics cards come out first and then games take advantage of them. Not the other way around. Developers can't make games that require more power than what's available.

The lack of a next generation of cards simply means that games are being held back.

I wonder about developers when they start out on their next game and form their vision of what that game is going to be and entail. They must have specific hardware in mind to develop with and for, but while they're in development, hardware moves forward too. Do they stop to consider what the new hardware will allow them to do with their project, or do they just stick with the parameters they started out with? It has to be maddening at times, causing recalculations, delays, and cost overruns to create the best experience possible.
Their biggest limit is what current hardware allows them to do (for both development and testing). Even if you're working towards making the best game ever, you can't count on GPUs becoming good enough to play it somewhere during development. A simple delay or a botched GPU launch can spell disaster for you.
Speaking as a developer: you can only target current hardware. There are only so many tricks you can do. If developers have tricks they would like to use but can't because of hardware limitations, they're out of luck until the hardware makes those tricks possible.
 