AMD FSR 3 debuts in Forspoken and Immortals of Aveum, also heading to consoles

The "doesn't work with VRR" thing is FUD being spread by people who are bad at reading comprehension. There's literally nothing behind this and the complaints people have are just due to bad frame pacing (which can be fixed) and tearing experienced when running over the monitor's native refresh rate without enhanced sync (because frame generation *is* incompatible with that.)
 
I think you're confused...

"Mantle - lives on Vulkan and DX12"

Mantle does NOT live on DirectX. That's a proprietary API developed by Microsoft.

"ROCm - gaining ground and compatibility, now entering (finally) consumer GPUs."

ROCm is an open software platform allowing researchers to tap the power of AMD Instinct™ accelerators to drive scientific discoveries. The ROCm platform is built on the foundation of open portability, supporting environments across multiple accelerator vendors and architectures.
I mean that DX12 shows its influence: several of the same design decisions from Mantle were applied there.

I know what ROCm is. Maybe I should have specified "now entering (finally) consumer GPUs ***on Windows***", because on Linux it already "worked" on consumer GPUs, although cards lost support very quickly and you had to keep using previous versions (or recompile the new ones to try to maintain support).
 
I mean that DX12 shows its influence: several of the same design decisions from Mantle were applied there.

I know what ROCm is. Maybe I should have specified "now entering (finally) consumer GPUs ***on Windows***", because on Linux it already "worked" on consumer GPUs, although cards lost support very quickly and you had to keep using previous versions (or recompile the new ones to try to maintain support).
Stop it.
 
The 980 Ti with GDDR5 was the better card vs. the Fury X with HBM.
You can brag about AMD innovations like the rest as much as you want, but what AMD starts, its competition does better.

-Intel has had the better IMC since day one
-NVIDIA is selling a lot of HBM2e and HBM3, and soon HBM3 Gen 2 from Micron, while AMD is only sampling MI300 right now and is still on HBM2e with MI250. CoWoS packaging at TSMC is tight and AMD is clawing at scraps
-Intel is arguably doing chiplets/tiles better (up to 47 tiles on package with Ponte Vecchio and 4 on MTL + Foveros + EMIB packaging + PowerVia coming soon)
-NVIDIA is doing upscaling better, power efficiency better, performance better, consistent launches better
-AMD is trash with software and is losing with their open-source solutions
-Intel is putting AMX (AI inferencing) on every Xeon core going forward. MCR-DIMM. A possible 288-core SRF
-AMD is the budget brand on the CPU and GPU side (expect price drops; don't pay full price)
-AMD consoles can't guarantee 1080p/60fps in 2023, while a 2070S, the GPU the consoles were first compared to, still can
-Intel will have GAA before TSMC. Intel has had chip supply wins for years vs EPYC CPUs (last time I checked, Intel was outshipping EPYC 7:1 per quarter)
-NVIDIA sells complete platforms, while AMD sells you components

Shall I keep going? I have more.
How would Nvidia be able to sell me a complete platform when they do not have a CPU I would care to use in my everyday gaming/home PC? AMD, meanwhile, has a complete solution I can buy for a platform that does pretty much whatever I wish to do. I guess on the Nvidia side I could go with an Intel CPU, but then Intel also has a full solution, just like AMD, for building an awesome PC. If I tried to do a full Nvidia build I would be stuck with their now-old ARM-based CPU, probably unable to install that shiny new RTX 4090, and even if I could, it would be so underused for lack of CPU power that I would probably throw the whole PC off the deck into my yard and start over with either an AMD or an Intel solution.
 
Again...
There are many out there who are so salty that AMD gives you what NVidia tries to charge a premium for... that they spread misinfo/lies (*see below) or talk about Nvidia's past glories instead of the current situation.

The RTX 40 series is a flop solely due to NVidia's marketing and pricing, and the assumption that gamers would pay extra for gimmicks. NVidia themselves admitted their RTX 40 marketing was a flop and that RTX 20 & 30 owners didn't actually need to upgrade to RTX 40 to get frame generation... it was essentially just a marketing hoax!

This all became apparent right after AMD announced FSR3 (which covers even GTX/RTX owners)... within six months NVidia hurried up and rushed DLSS 3.5 out, completely ignoring the enormous amount of egg on their face from being caught lying to their customer base... Gamers see through these gimmicks and it has hurt NVidia's reputation horribly.

Especially when Radeon is dominating the gaming scene in price/performance.


* <--- FSR3 is an upgrade to FSR2. FSR3 has frame-gen even for RTX/GTX cards (without Anti-Lag+, which is specific to 7000-series cards).
 
The "doesn't work with VRR" thing is FUD being spread by people who are bad at reading comprehension. There's literally nothing behind this and the complaints people have are just due to bad frame pacing (which can be fixed) and tearing experienced when running over the monitor's native refresh rate without enhanced sync (because frame generation *is* incompatible with that.)
See chapter one: the monitor switches to 120 Hz as soon as he clicks FSR3 FG on.
It literally says "VRR not supported", and that's "confirmed online".
 
How would Nvidia be able to sell me a complete platform when they do not have a CPU I would care to use in my everyday gaming/home PC? AMD, meanwhile, has a complete solution I can buy for a platform that does pretty much whatever I wish to do. I guess on the Nvidia side I could go with an Intel CPU, but then Intel also has a full solution, just like AMD, for building an awesome PC. If I tried to do a full Nvidia build I would be stuck with their now-old ARM-based CPU, probably unable to install that shiny new RTX 4090, and even if I could, it would be so underused for lack of CPU power that I would probably throw the whole PC off the deck into my yard and start over with either an AMD or an Intel solution.
Grace.
 
They started with some of the most forgettable games.
I would at least have gone for something popular like Fortnite or CoD, where people would notice and appreciate the feature.
 
AMD always falls into Nvidia's traps.
Yes, based on public opinion so far it is the same as FSR 2 was to DLSS 2.
Meaning a worse version of DLSS 3.

But that is not the main issue.
Now, what will happen?
AMD has 15% GPU market share.
People won't upgrade to newer AMD GPUs, as they can just use FSR3.

This is not how you do business.
AMD needs much better leadership with a proper vision of the future.
From a gamer's perspective this is good news, thank you. You sound like someone heavily invested in AMD shares, though. :)
 
See chapter one: the monitor switches to 120 Hz as soon as he clicks FSR3 FG on.
It literally says "VRR not supported", and that's "confirmed online".
Setup issue, because that doesn't happen on mine. 👍

(Just watched the video and he *definitely* doesn't understand what's happening. VRR doesn't smooth out frametimes and make the game appear smoother; it helps with tearing. The smoothness issues are due to problems with frame pacing, which AMD does need to address. On top of this, he clearly didn't comprehend what the GPUOpen article on the implementation was saying, based on his comments about the frametime graph.)
 
Tried it in Forspoken, and colour me unimpressed. I had high hopes, and to some extent still do, but this first drop has too many issues and caveats, enough to totally miss the mark. Horrendous frame pacing, broken performance after time, VRR issues, needing to be used with FSR, quite a shame really. I do hope they can improve this as I think the concept has strong merits when implemented well, having tried excellent implementations of frame generation. AMD needs to critically re-evaluate product and features launches, their marketing team are just woefully ineffectual, and have been for a long time.
 
Setup issue, because that doesn't happen on mine. 👍

(Just watched the video and he *definitely* doesn't understand what's happening. VRR doesn't smooth out frametimes and make the game appear smoother; it helps with tearing. The smoothness issues are due to problems with frame pacing, which AMD does need to address. On top of this, he clearly didn't comprehend what the GPUOpen article on the implementation was saying, based on his comments about the frametime graph.)
VRR removes the motion jitter that comes from mismatched fps and Hz values. That's why it's smoother; it has nothing to do with frametimes, only with how frames are presented on screen. Seems like it's you who doesn't understand what VRR helps with.
 
VRR removes the motion jitter that comes from mismatched fps and Hz values. That's why it's smoother; it has nothing to do with frametimes, only with how frames are presented on screen. Seems like it's you who doesn't understand what VRR helps with.
DUUUUUDE. It's not motion jitter. It's screen tearing. Do you even technology, bro?
 
DUUUUUDE. It's not motion jitter. It's screen tearing. Do you even technology, bro?
There's both when your fps is unsynced. The image tears and motion is visibly less smooth, and I'd say the jitter is more off-putting; at 165 Hz the tearing is not that distracting.
How someone can fail to notice the jitter when VRR is off and still call themselves an expert is beyond me.
I'm deliberately not calling it stutter, since stutter is visible in frametime graphs. Unsynced fps/Hz jitter is only visible in how it's presented on the monitor.
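If it helps, here's a quick toy calculation of the cadence mismatch I'm talking about (numbers are purely illustrative, and I'm using vsync-style frame pickup to keep the arithmetic simple; unsynced you trade the wait for a tear, but the unevenness is the same idea):

```python
from fractions import Fraction
import math

# Toy case: 60 fps content on a fixed 90 Hz panel.
fps, hz = 60, 90
frame_interval = Fraction(1000, fps)     # ~16.7 ms between new frames
refresh_interval = Fraction(1000, hz)    # ~11.1 ms between panel refreshes

on_screen_ms = []
for i in range(6):
    arrives = i * frame_interval
    next_arrives = (i + 1) * frame_interval
    # a frame becomes visible on the first refresh after it arrives and stays
    # until the refresh that picks up the following frame
    shown_from = math.ceil(arrives / refresh_interval) * refresh_interval
    shown_until = math.ceil(next_arrives / refresh_interval) * refresh_interval
    on_screen_ms.append(round(float(shown_until - shown_from), 1))

print(on_screen_ms)  # alternates ~22.2 / 11.1 ms -> visible judder despite perfectly even frametimes
# With VRR the panel refreshes when each frame arrives, so every frame sits
# on screen for ~16.7 ms and the motion cadence is even.
```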
 
There's both when your fps is unsynced. The image tears and motion is visibly less smooth, and I'd say the jitter is more off-putting; at 165 Hz the tearing is not that distracting.
How someone can fail to notice the jitter when VRR is off and still call themselves an expert is beyond me.
I'm deliberately not calling it stutter, since stutter is visible in frametime graphs. Unsynced fps/Hz jitter is only visible in how it's presented on the monitor.

When unsynced you do not get jitter, you get tearing... which is the display updating your screen mid-frame.

Jitter is when your screen WAITS for your GPU to update, because they are synced...
 
EDIT: In order for motion to be "fluid" it needs Vsync to be ON and FreeSync to be OFF. It's actually decent.
Yeah, I was wondering why it worked like crap when I tried it on an RX 6800 today. The game was running at 55-65 fps, and AFMF "only" took it to 100, which actually felt worse than that 60 in terms of smoothness: awful frame pacing issues, more jitter, higher latency and tearing. It's probably good advice to set your monitor to 85/90/100 Hz for games that run at 60 fps, or else you'll feel that too. On mine, I can only go from 60 Hz to 120 Hz, so AFMF is kinda useless. They need to fix the VRR compatibility, or else the technology is one step forward and two steps backward.
It only felt fine when it could hit the monitor's 165 fps cap: the frame pacing issues were gone, and so was the jitter, but it still felt a bit more laggy than native 90 and created artifacts. Not bad though, I'd have no problems playing the entire game like that. But 90 fps native was better.
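For what it's worth, here's the rough arithmetic behind that advice (illustrative numbers, assuming frame generation roughly doubles the presented rate, which it won't do exactly in practice): without working VRR you want a fixed refresh mode sitting near the doubled output, and a monitor that only offers 60 and 120 Hz may not have one.

```python
# Back-of-the-envelope sketch of "match your refresh to the generated fps".
def least_bad_refresh(base_fps_range, refresh_modes):
    lo, hi = base_fps_range
    generated = (lo * 2, hi * 2)                  # rough output after frame generation
    midpoint = (generated[0] + generated[1]) / 2
    best = min(refresh_modes, key=lambda hz: abs(hz - midpoint))
    return generated, best

# Game holding ~85-100 fps base: the doubled output lines up with a 165 Hz cap.
print(least_bad_refresh((85, 100), [60, 120, 165]))   # ((170, 200), 165)

# A case like the one above: generated output landing near ~100 fps,
# but the monitor only offers 60 and 120 Hz fixed modes -> ~20 Hz mismatch either way.
print(least_bad_refresh((45, 55), [60, 120]))         # ((90, 110), 120)
```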
 
Yeah, I was wondering why it worked like crap when I tried it on an RX 6800 today. The game was running at 55-65 fps, and AFMF "only" took it to 100, which actually felt worse than that 60 in terms of smoothness: awful frame pacing issues, more jitter, higher latency and tearing. It's probably good advice to set your monitor to 85/90/100 Hz for games that run at 60 fps, or else you'll feel that too. On mine, I can only go from 60 Hz to 120 Hz, so AFMF is kinda useless. They need to fix the VRR compatibility, or else the technology is one step forward and two steps backward.
It only felt fine when it could hit the monitor's 165 fps cap: the frame pacing issues were gone, and so was the jitter, but it still felt a bit more laggy than native 90 and created artifacts. Not bad though, I'd have no problems playing the entire game like that. But 90 fps native was better.

Just so you know, the more money you spend on a dGPU, the less you have to rely on gimmicks. Pro players do not use gimmicks like G-Sync or (obviously) upscaling, etc., because they want the most responsive frames.


If you have a 165Hz monitor, play without syncing and forget about using G-Sync/FreeSync/VRR.
 
gimmicks like G-Sync
If you have a 165Hz monitor, play without syncing and forget about using G-Sync/FreeSync/VRR.
Dude, just stop embarrassing yourself.
If you're gonna backpedal so hard, get a rearview mirror.
Suddenly VRR is a gimmick because FSR3 can't support it, lol.
You know why it doesn't? Because I tried it today on my 6800 and frame pacing is totally broken with AMD's frame generation unless you're synced to the max refresh rate.
I have zero interest in what "pro players" do; I play single-player games only, like most PC gamers. Just look at HUB's reviews: people are interested in triple-A game performance at Ultra/High settings. You can run the things "pro players" play on cards like a 4060, so your "the more money you spend" excuse is just an absolute miss.

[Attached chart: 2560×1440 performance results]
 
Dude, just stop embarrassing yourself.
If you're gonna backpedal so hard, get a rearview mirror.
Suddenly VRR is a gimmick because FSR3 can't support it, lol.
You know why it doesn't? Because I tried it today on my 6800 and frame pacing is totally broken with AMD's frame generation unless you're synced to the max refresh rate.
I have zero interest in what "pro players" do; I play single-player games only, like most PC gamers. Just look at HUB's reviews: people are interested in triple-A game performance at Ultra/High settings. You can run the things "pro players" play on cards like a 4060, so your "the more money you spend" excuse is just an absolute miss.

Who is backpedaling...?
G-Sync/FreeSync has ALWAYS been a gimmick. You do not need to sync your monitor; the only reason you do, or use these gimmicks, is because your GPU and monitor are mismatched.

Consequently, if you spend MoAr on a GPU you get less screen tearing and better frames... thus no reason for added gimmicks.

You still have not explained why anyone needs AFMF on top of upscaling, if they are getting 2x the frames as before...
 