Look at all of the Radeon VII benchmarks available now

I don't understand why people are so butt-hurt about this launch. It's competition, which is needed. Not the best competition, but at least it is some competition.

I have great admiration for little AMD. They have 1/10th the research budget of Intel. With that meager resource, they are holding their own against both Intel and Nvidia.

Ryzen successfully bearded the Intel lion in its own den. I wouldn't be surprised if Radeon VII does the same to Nvidia, especially if they drop prices as much as they dropped Ryzen prices. It's pretty remarkable.

I usually support the scrappy underdog. There were some pretty lean years when AMD had only Bulldozer, but the long wait has been worth it.
 
How much did Ivy Bridge improve on Sandy Bridge? How much did Broadwell improve on Haswell? A die shrink does not mean a huge performance and efficiency boost.

As for not competing against a $1300 graphics card, it just shows that AMD's CEO is much more level-headed than Nvidia's CEO, who seems to have more and more of an inferiority complex.

IT JUST WORKS!!
 
Radeon Seven has 16GB and 1TB/s of bandwidth, plus PCIe 4.0... it is a way better choice for future games than an RTX 2080.

Spot on. The 8GB in the 2080 will start seeing more and more frame inconsistencies. Average fps will appear similar, which is the only thing the lower-tier review sites note, but the experience will be much better on the Radeon VII. To me, that is MUCH more important than any ray tracing or DLSS implementation.

For certain games, the Radeon VII will allow gamers to get away with 16GB of system RAM, whereas the 2080 will start to need more.
 
Lol, by the time 16GB makes any gameplay difference, the GPU itself will already be too slow anyway. Radeon VII is probably best suited for 1440p gaming, where 8GB will be fine for a long time.
 
...if by chance the AMD competitor wins, I'd say it's time to drop the RTX TITAN XR on em. Blow em away.
Yeah, that's really smart to hope for. Maybe AMD will even go out of business! Having competition means the consumer has to read a lot more and choose things and like dude that's so boooring. Kill 'em off!

FFS.
 
I don't understand why people are so butt hurt about this launch. It's competition that is needed.
They aren't butt-hurt; they're entitled, and primarily interested in exhibiting their technical discernment. So they'll crow about Nvidia's or Intel's superiority over AMD in one aspect or another of the high-tech battle. They don't realize that gratuitously bad-mouthing everything the underdog accomplishes is, just maybe, counterproductive in the long term. What their next PC would be *without* AMD around simply never occurs to them. All will be provided.
 
I wonder what Nvidia could do at 1.8GHz with 16GB of memory on a 7nm process?

Probably 60% better framerates than AMD at half the power!
To me, that is kind of a moot point, since Nvidia chose not to do that with the 20-series when they obviously could have.

In your opinion, should AMD just shut down because the competition is ahead of them? I would hate to see the prices on nVidia's products if AMD did.

You can bet AMD is working on architecture improvements. For a company as small as they are, this looks like a smart business move, IMO. Shrinking the die as a short-term solution does several things:
  1. It is a proving ground for the new node.
  2. It is significantly less expensive than a ground-up redesign of the architecture.
  3. It allows them to remain in a reasonably competitive position so that they can continue to sell cards and make a profit to fund further improvements.
As I see it, this move is not all that unlike AMD targeting enterprise workloads with Zen, because enterprise, not gamers or enthusiasts, is where the money is in the CPU market. They stayed in business (in fact, they were one of the top-performing tech companies in 2018 as far as their stock goes), they are doing well in the enterprise CPU market, and now we see a glimpse of gaming improvements in the Radeon VII.
 
Lol, by the time 16GB makes any gameplay difference, the GPU itself will already be too slow anyway. Radeon VII is probably best suited for 1440p gaming, where 8GB will be fine for a long time.

At the very least, it will allow people to use just 16GB of system RAM instead of 24GB or 32GB. It's like comparing an 8GB card vs a 4GB card when using only 8GB of system RAM. Extrapolate that further, for higher-performance cards with even newer games, and you will see that having a 16GB card could save you a LOT on system RAM.

It may be a while, but there are already signs where the 11GB of the 1080 Ti is doing better than the 8GB of the 2080:
https://m.hexus.net/tech/reviews/graphics/122270-nvidia-geforce-rtx-2080-ti-rtx-2080/?page=9
Check out the min frames.
Also check out the VRAM usage when using DXR over at HardOCP running BFV. This was not just superficial, as the RTX 2080 was only marginally better than the RTX 2070 in some scenarios due to the lack of VRAM for DXR. DXR is not that big of a VRAM increase over DX12.
 
Well Jeff, maybe AMD fanboys are butt-hurt about this launch because it doesn't bring any competition!?
The butt-hurt ones aren't the AMD boosters; they're the trash-talkers. The same folks who reacted with mindless glee when Ryzen 1 didn't beat Intel on every single spec. Fortunately for everyone, that product (I bought an 1800X on launch) helped AMD survive to now produce Zen 2. So the trash-talkers have much less to gloat over today - but there are still GPUs!

And this V7 does bring competition, in the data-science market, and also for those wanting more than 8GB of VRAM. More importantly, it allows AMD to sell off their excess MI50 chips, which will help stabilize their finances. All good for competition to come.

Price is a big part of immediate competition. AMD has no doubt run a lot of data to optimize the combination of MSRP, units sold vs time, production cost, features, etc. This, then, is the "competition" that makes most sense for them. Maybe your guess is better than theirs, but they seem to have been making pretty good financial decisions for the past several years.

TL;DR: If AMD can keep putting out products even close to competitive in the high end, we should be goddamn glad, give them the benefit of the doubt, and not look for things to ***** about.
 
I hope the side by side comparisons show a clear win for the 2080Ti, and if by chance the AMD competitor wins, I'd say it's time to drop the RTX TITAN XR on em.

Blow em away.

Only if you like playing space invaders.

Nvidia: screwing over their stockholders, selling $1300 GPUs that are unreliable, overhyping RTX, and being headed by the most arrogant CEO I have seen in a while.

Call me a fanboy, but more and more people are sick of Nvidia's crap.
So you are excited about what's basically a GTX 1080 Ti, 11 months later, selling at the GTX 1080 Ti launch MSRP?

I'm glad there is some competition now, but that's it. Couldn't get excited over this at all. RTX 2060 suddenly seems like a much better launch now. Especially when it will be in laptops.
 
RTX 2060 suddenly seems like a much better launch now. Especially when it will be in laptops.

I assume you're aware that the laptop "2060" will have a much lower boost clock than the desktop card. I'm guessing 60%, or even less, because it's impossible to move 160W out of a tiny chassis like that with any rational hardware or internal temps. Not knocking the technology; just sayin'... Check what Nvidia really means before you buy a machine based on their big shiny number. And see what it does after a half hour of gaming, not just a three-minute benchmark.
 
Hmm, I wonder what little old Intel will do with graphics. They have half of AMD working there, and let's face it, they have superior processes and budget. Maybe they will come out one day and beat both AMD and Nvidia.

I would like to see Nvidia get their butt kicked a bit. I'm an Nvidia card user, but they are getting spoiled, marketing and pricing their products like Intel always did with CPUs: too expensive, with no real performance gains. AMD, however (I do miss Rage), is a scrapper in the GPU market and continually stays in the game, which is good for us consumers.
 
Hmm, I wonder what little old Intel will do with graphics. They have half of AMD working there, and let's face it, they have superior processes and budget. Maybe they will come out one day and beat both AMD and Nvidia.
what superior process?
 
Just to remind everyone: if this thing performs the same as a 2080, it is definitely a worse buy. You could even say it's not even competitive, as you don't get ray tracing or DLSS. They may not work in many games, but it's still something that separates the two cards.

More games can ray trace right now than can benefit from 16GB of VRAM. It will be years before that 16GB becomes relevant, and by the time it is, you'll be able to get a 16GB card for much less than you'd pay now.

Also, if 7nm Vega is this expensive, expect a faster Navi card to cost more than $700. And over the last few years, every single time either manufacturer has released a new flagship, the price has gone up. The $1000+ flagships won't be going anywhere, and that's exactly what I expect Navi to cost if it's faster than a 2080.
 
Hot off the press:
https://www.techpowerup.com/251464/...res-system-requirements-outed-with-radeon-vii

I guess that 16GB of VRAM will pay off faster than expected.

You just have to shake your head at those who bought an RTX 2080 for the same price as, or more than, a GTX 1080 Ti.

Yeah, welcome to the 1.42% of gamers (Steam hardware survey) who play games at 4K 60fps, lol. And you would be insane to play a competitive shooter at 4K 60fps (probably higher if you lower the details, but then you might as well play at 1440p). The promise that AMD cards age better ended with the R9 290X. Now just get whichever card gives you higher fps in the games you play, at the resolution you play at.
 
Just to remind everyone, if this thing performs the same as a 2080 it is definitely a worse buy. You could even say it's not even competitive as you don't get ray tracing or DLSS.

You do realize that ANY card with async compute will be able to do real-time ray tracing under Microsoft's DXR in Windows 10.

Nobody buying an RTX card cares about DLSS or ray tracing, because they didn't care about those things a year ago either. They want a faster geometry engine and more ROPs, memory, and bandwidth.

Find a gamer who cares about 4K DLSS or RTX.
 
You do realize that ANY card with async compute will be able to do real-time ray tracing under Microsoft's DXR in Windows 10.
It's still a feature that the Radeon VII doesn't have, and one that is actually being implemented in a small number of titles over the next year. If the price and performance of the two cards are the same, then these things set them apart. And can you show me any titles that the Radeon VII will ray trace in over the next year? Right now the idea of the Radeon VII doing so is nothing but a possibility, while RTX is actually a thing, confirmed for some games. In other words, right now and for the foreseeable future, Nvidia cards can ray trace and AMD cards can't.

And actually, you are quite wrong: many of us enthusiasts in the tech community love seeing things like real-time ray tracing come to 3D gaming. If both cards are otherwise the same, but one card gives you actual ray tracing in games, then that one is definitely the better buy. Only fanboys would claim otherwise.
 
Yup, who wouldn't have wet dreams about playing Cyberpunk 2077 with RTX anyway? Probably AMD fanbois.
 
Yup, who wouldn't have wet dreams about playing Cyberpunk 2077 with RTX anyway? Probably AMD fanbois.
Ray tracing is awesome, and it's not going away. In five years' time it will be normal for games to have this sort of technology, and that is real progress. It is absolutely nothing like HairWorks or anything like that; it's the next step in 3D rendering, so much so that movies have been using it for years. It seems that people have decided to hate it or ignore it because Nvidia was the first company to sell hardware capable of performing it. It's pathetic.
 
Yeah, welcome to the 1.42% of gamers (Steam hardware survey) who play games at 4K 60fps, lol. And you would be insane to play a competitive shooter at 4K 60fps (probably higher if you lower the details, but then you might as well play at 1440p). The promise that AMD cards age better ended with the R9 290X. Now just get whichever card gives you higher fps in the games you play, at the resolution you play at.

I would be a hypocrite to say that more than 8GB is needed, as I defended the 4GB in the Fury X for a long time. Still, there has to be a better metric than Steam, as I do not care about all of the people gaming on laptop iGPUs in developing nations. There have to be a lot of non-competitive gamers who just want to play on their department-store 4K TV at the highest settings.
 
Yup, who wouldn't have wet dreams about playing Cyberpunk 2077 with RTX anyway? Probably AMD fanbois.

We are not saying that RTX will never be great, but it is far from ready.

You talk about getting excited for RTX in a yet-to-be-released game that will require a $1300 GPU to run it at 1080p, right after you mention competitive gamers and what they want.

What competitive gamer will sacrifice half their fps to see neat reflections in the water??

Again, 5-7 years from now, it will be great. Right now, it is just a gimmick. Gamers would much rather have the higher bandwidth right now.
 
What competitive gamer will sacrifice half their fps to see neat reflections in the water??

Lol, you can't distinguish between an online multiplayer PvP shooter being "competitive" and a single-player RPG shooter being "non-competitive," dude. For games like BF5 and PUBG I would like my fps to be >120 at all times, while in Cyberpunk 2077 I'm perfectly happy with 80fps and all the eye candy I can get. Nvidia got a lot of flak for enabling RTX in BF5, a competitive game where it made no sense; just bad timing, I presume, as BF5 was the only AAA game released in this window (though BF5 being an SJW joke sure drove a lot of gamers away). By the way, I already own a water-cooled 2080 Ti, and I wouldn't want to play at 4K even on the 2080 Ti, much less on any card 30% slower.
 