AMD roadmaps reveal Zen 4 and RDNA 3 details, promise significant performance increases

Oh yeah, I'm all for it. Let's do this. :)

I just hear a lot of people saying that AMD and Intel are both guilty of claiming bigger performance improvements than they can deliver. AMD seems to have entered a magical zone lately, though, actually backing up most of its claims. Good times!

that's how it's been the last few years
AMD: "we achieved #% more performance"
People: meh, PR numbers
AMD launches
People: let's test if true
People: damn it's true

repeat for next launch
 
The question in my mind is: will AMD increase ray-tracing performance? When the RTX 2000 series launched, RT was a gimmick that hardly any game used and that tanked performance to a ridiculous degree. Now the consoles have it, it's part of DirectX, and games are playable with the effect enabled. With RT becoming less of a gimmick, I want to see some commitment to the technology from AMD.

Their raster performance is competitive, but RT is leagues behind, and that is starting to matter.
It's still a gimmick because in most games, while you're busy admiring your reflection in the water, some enemy NPC wants to put a bullet through your skull. I don't get how it makes a difference to people where certain shadows lie when 99.9999% of them couldn't tell you where the non-RT shadows were drawn in the first place.

Ray-tracing is an over-hyped gimmick that may or may not have a significant graphical impact years from now. I remember when a REAL game-changer in graphics arrived, back in 2008: tessellation completely and dramatically changed the look of games, and it wasn't just about where the shadows lie or some fuzzy reflections, it was all game assets.

So here's the difference between RT and non-RT from the game that, to date, has employed it the most, Control:
[Screenshot: Control, RT on vs. RT off comparison]

Yeah, I can see the reflection of the pipes in the puddle on the floor but it's not something that would make or break a game for me (or anyone else for that matter). The difference is there, but it's insignificant and I doubt that I'd really notice it during gameplay because it's the last thing that I'd be looking for.

Tessellation isn't subtle and its effects on graphics are profound. To give you an idea of tessellation's effects, I'll use the dragon statue from Unigine Heaven.

Here's the dragon statue with tessellation disabled:
[Screenshot: Unigine Heaven dragon statue, tessellation disabled]

It's very nicely modelled and rendered, as you can see. However, when you turn tessellation on, the difference is immediate and staggering:
[Screenshot: Unigine Heaven dragon statue, tessellation enabled]


Tessellation, like ray-tracing, was absolutely crippling to the GPUs of the day. The difference was that entire games looked different in pretty much every way. It wasn't just about a few shadows and reflections, it was about EVERYTHING. Don't just look at the dragon; look at the roof shingles, the bricks, the post, the stairs and the ground. Everything just pops into more of a 3D image. THIS is what a game-changer looks like, and RT is a joke compared to tessellation.

If you want to be truly worried about RT performance, wait at least five more years because the devs have to figure out what else to do with it. As it is now, it's just not a big enough deal to suffer with the performance hit that it carries.
 
that's how it's been the last few years
AMD: "we achieved #% more performance"
People: meh, PR numbers
AMD launches
People: let's test if true
People: damn it's true

repeat for next launch
Yep. AMD claimed a 100% performance-per-watt uplift between RDNA1 and RDNA2. It was (understandably) met with skepticism by the tech press. Then the guys at ATi said "Hold my Ex" and exceeded the 100% performance-per-watt uplift, blowing everyone's minds in the process.

I'd never seen anything like it before (that I can remember) and neither had anyone else because an uplift like that was unprecedented. It was ATi's "Pascal Moment" and it was clear that AMD had learnt to under-promise and over-deliver from their experience with marketing Zen CPUs.
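For anyone fuzzy on the metric, performance-per-watt is just frame rate divided by board power, and the claimed uplift is the relative change between generations. A minimal sketch with made-up numbers (the FPS and wattage figures below are hypothetical, not AMD's measured RDNA1/RDNA2 results):

```python
# Hypothetical sketch of how a performance-per-watt uplift is computed.
# The frame rates and board powers are illustrative values only.

def perf_per_watt(fps: float, watts: float) -> float:
    """Performance per watt: higher is better."""
    return fps / watts

def uplift_percent(old: float, new: float) -> float:
    """Relative improvement of `new` over `old`, in percent."""
    return (new / old - 1.0) * 100.0

old_ppw = perf_per_watt(fps=60.0, watts=225.0)   # hypothetical RDNA1-class card
new_ppw = perf_per_watt(fps=126.0, watts=230.0)  # hypothetical RDNA2-class card

print(f"{uplift_percent(old_ppw, new_ppw):.0f}% perf/W uplift")
# prints: 105% perf/W uplift
```

Note that nearly doubling the frame rate at roughly the same board power is what pushes the hypothetical figure past 100%, which is the shape of the claim AMD made.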
 
I know this will sound corny, but damn!
My first PC had a little over 1/2 MB of RAM (640k).
I remember those days. If you wanted to use more than 640kb of RAM, you needed a 32-bit CPU and you needed to load EMM386 from your CONFIG.SYS file. The first PC I had was an original IBM PC (#5150). It had 256kb of RAM and two FULL HEIGHT 5¼" floppy drives. I remember the 200mm 8-bit memory expansion card that we put into it to double the RAM to 512kb. Crazy, eh? :laughing:
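For anyone who never suffered through DOS memory management, loading EMM386 meant device lines in CONFIG.SYS, something like the sketch below (paths and switches are from memory and varied by DOS version, so treat them as illustrative):

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
```

HIMEM.SYS had to load first, EMM386.EXE then provided expanded-memory emulation and upper memory blocks, and DOS=HIGH,UMB moved DOS itself out of conventional memory so your games had room to run.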
 
I remember those days. If you wanted to use more than 640kb of RAM, you needed a 32-bit CPU and you needed to load EMM386 from your CONFIG.SYS file. The first PC I had was an original IBM PC (#5150). It had 256kb of RAM and two FULL HEIGHT 5¼" floppy drives. I remember the 200mm 8-bit memory expansion card that we put into it to double the RAM to 512kb. Crazy, eh? :laughing:
You whippersnappers....

*cries* with my beloved TRS-80 Model 1 with 4k of ram....
 
You whippersnappers....

*cries* with my beloved TRS-80 Model 1 with 4k of ram....
I said my first PC, not first computer. My first computer was a TRS-80 Color Computer with 32k RAM. I'm not the whippersnapper that you think I am, Grandpa! :laughing:
Ok fine. But could they play Zork?! ;)

Seriously? 32k of ram? 4k of ram!?
Sheesh.
 
Yep, and all it had was BASIC and a manual to learn BASIC.

Cassette tapes for storage and glorious 4k of ram.

Amazing little thing for its time.

For the youngins, here is the Wikipedia link:


Now get off my lawn 🙃
Oh I know the TRS-80 to which you refer because my stepfather managed a Radio Shack in Abbotsford, BC when I was like seven years old. He used to let me read the catalogues at his work.
 
they'll probably go that route also.

At this point, devs should honestly just optimize their applications and games better. We know they can do it because the Switch, Deck and mobile platforms exist (also look at what the PS4 and Xbox are still pulling off), and they'll magically make a game scale down and work on that lowbrow tech quite nicely.

BUT, there's no point in doing that when the hardware can just brute-force past any jankiness, I guess.
What makes you believe that Radeons will stop caring about power consumption when AMD specifically uses performance-per-watt as its major metric? To me, this indicates that AMD is very aware of the need for efficiency.
 