VcoreLeone
Posts: 289 +146
Stop buying them. There's nothing to do about it.
"I mean that in DX12 it has its influence and they applied several of the same design decisions from Mantle."
I think you're confused...
"Mantle - lives on Vulkan and DX12"
Mantle does NOT live on in DirectX. DirectX is a proprietary API developed by Microsoft.
"ROCm - gaining ground and compatibility, now entering (finally) consumer GPUs."
ROCm is an open software platform allowing researchers to tap the power of AMD Instinct™ accelerators to drive scientific discoveries. The ROCm platform is built on the foundation of open portability, supporting environments across multiple accelerator vendors and architectures.
Both Vulkan and DX12 were created under AMD's influence, and both carry some of Mantle's key features.
"Stop it."
I mean that in DX12 it has its influence and they applied several of the same design decisions from Mantle.
I know what ROCm is. Maybe I should have specified "now entering (finally) consumer GPUs ***on Windows***", because on Linux it already "worked" on consumer GPUs, although they dropped support very quickly and you had to keep using previous versions (or recompile the new ones, trying to maintain support).
"How would Nvidia be able to sell me a complete platform when they do not have a CPU I would care to use in my everyday gaming/home PC? AMD has a complete solution for me to buy, a platform that can do pretty much whatever I wish. On the Nvidia side I could go with an Intel CPU, but wait, even Intel has a full solution, just like AMD, for building an awesome PC. A full Nvidia build would leave me stuck with their now-old ARM-based CPU, probably unable to install that shiny new RTX 4090, and if I could, it would be so underused for lack of CPU power that I would throw the whole PC off the deck into my yard and start over with an AMD or Intel solution."
The 980 Ti with GDDR5 was the better card vs. the Fury X with HBM.
You can brag about AMD's innovations as much as you want, but what AMD starts, its competition does better.
-Intel has had the better IMC since day one
-NVIDIA is selling a lot of HBM2e and HBM3, and soon HBM3 Gen 2 from Micron, while AMD is only sampling MI300 right now and is still on HBM2e with MI250. CoWoS packaging capacity at TSMC is tight and AMD is clawing at scraps
-Intel is arguably doing chiplets/tiles better (up to 47 tiles on a package with Ponte Vecchio, 4 on MTL, plus Foveros and EMIB packaging, with PowerVia coming soon)
-NVIDIA is doing upscaling better, power efficiency better, performance better, consistent launches better
-AMD is trash at software and is losing with its open-source solutions
-Intel is putting AMX (AI inferencing) on every Xeon core going forward, plus MCR-DIMMs and a possible 288-core SRF
-AMD is the budget brand on both the CPU and GPU side (expect the price drops; don't pay full price)
-AMD consoles can't guarantee 1080p/60fps in 2023, while the 2070S, the GPU the consoles were first compared to, still can
-Intel will have GAA before TSMC. Intel has had chip-supply wins for years vs. Epyc CPUs (last time I checked, Intel was outshipping Epyc 7:1 per quarter)
-NVIDIA sells complete platforms, while AMD sells you components
Shall I keep going? I have more.
"see chapter one. monitor switches to 120hz as soon as he clicks fsr3 fg on."
The "doesn't work with VRR" thing is FUD spread by people who are bad at reading comprehension. There's literally nothing behind it; the complaints come down to bad frame pacing (which can be fixed) and tearing when running over the monitor's native refresh rate without Enhanced Sync (because frame generation *is* incompatible with that).
"How would Nvidia be able to sell me a complete platform? When they do not have a CPU I would care to use in my everyday gaming PC/home PC..."
Grace.
"from a gamer's perspective, this is good news thank you. You sound like someone heavily invested in AMD shares though."
AMD always falls into Nvidia's traps.
Yes, based on public opinion so far, it is the same as FSR 2 was to DLSS 2.
Meaning: a worse version of DLSS 3.
But that is not the main issue.
Now, what will happen?
AMD has 15% GPU market share.
People won't upgrade to a newer AMD GPU, since they can just use FSR 3.
This is not how you do business.
AMD needs much better leadership, with a proper vision of the future.
"Setup issue, because that doesn't happen on mine."
See chapter one. The monitor switches to 120Hz as soon as he clicks FSR3 frame generation on. It literally says "VRR not supported", and it's "confirmed online".
VRR removes the motion jitter that comes from having mismatched fps and Hz values. That's why it's smoother; it has nothing to do with frametimes, only with how frames are presented on screen. Seems like it's you who doesn't understand what VRR helps with.
(Just watched the video and he *definitely* doesn't understand what's happening. VRR doesn't smooth out frametimes and make the game appear smoother; it helps with tearing. The smoothness issues are due to problems with frame pacing, which AMD does need to address. On top of that, based on his comments about the frametime graph, he clearly didn't comprehend what the GPUOpen article on the implementation was saying.)
DUUUUUDE. It's not motion jitter, it's screen tearing. Do you even technology, bro?
There's both when your fps is unsynced. The image tears and motion is visibly less smooth; I'd say the jitter is the more off-putting of the two, since at 165Hz the tearing is not that distracting.
How someone can fail to notice the jitter when VRR is off and still call themselves an expert is beyond me.
I'm deliberately not calling it stutter, since stutter is visible in frametime graphs. Unsynced fps/Hz jitter is only visible in how frames are presented on the monitor.
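To make that last point concrete, here's a toy sketch (my own numbers, not from the thread; it models vsync-style presentation, so it shows the jitter/judder side of the argument rather than tearing): a fixed-refresh panel can only show a finished frame at its next refresh tick, while a VRR panel refreshes the moment the frame is ready.

```python
import math

def presentation_intervals(frame_times_ms, refresh_hz=None):
    """On-screen intervals between successively presented frames.

    With refresh_hz set, each frame waits for the panel's next refresh
    tick (fixed refresh, vsync-style); with refresh_hz=None the panel
    presents immediately (VRR).
    """
    if refresh_hz is None:
        shown = frame_times_ms                      # VRR: present at once
    else:
        period = 1000.0 / refresh_hz
        shown = [math.ceil(t / period) * period     # wait for next tick
                 for t in frame_times_ms]
    return [b - a for a, b in zip(shown, shown[1:])]

# GPU delivers a perfectly even 90 fps stream (11.1 ms per frame)...
frames = [i * (1000.0 / 90.0) for i in range(10)]

fixed = presentation_intervals(frames, refresh_hz=60)   # 60 Hz, no VRR
vrr = presentation_intervals(frames)                    # VRR panel

print([round(x, 1) for x in fixed])  # mix of full 16.7 ms ticks and 0 ms (two frames land on one tick)
print([round(x, 1) for x in vrr])    # steady 11.1 ms cadence
```

Even though the GPU's frametimes are perfectly even, the fixed-refresh schedule alternates between held and doubled-up frames, which is exactly the presentation jitter being described; VRR collapses it to a steady cadence without any change in the frametime graph.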
"When unsynced u do not get jitter, u get tearing"
You do get jitter, lol, get some glasses. Maybe you don't notice it, but for me it's super annoying.
Yeah, I was wondering why it worked like crap when I tried it on an RX 6800 today. The game was running at 55-65 fps, and AFMF "only" took it to 100, which actually felt worse than 60 in terms of smoothness: awful frame-pacing issues, more jitter, higher latency, and tearing. It's probably good advice to set your monitor to 85/90/100Hz for games that run at 60fps, or you'll feel it too. On mine, I can only go from 60Hz to 120Hz, so AFMF is kinda useless. They need to fix the VRR compatibility, or the technology is one step forward and two steps back. EDIT: In order for motion to be "fluid", Vsync needs to be ON and FreeSync OFF. It's actually decent.
It only felt fine when it could hit the monitor's 165fps cap: the frame-pacing issues were gone, and so was the jitter, but it still felt a bit laggier than native 90 and created artifacts. Not bad though; I'd have no problems playing the entire game like that. But 90fps native was better.
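The 55-65 fps experience above can be sketched with a toy model (my own assumption of midpoint interpolation with invented frametimes, not AMD's actual AFMF algorithm): frame generation shows an interpolated frame halfway between two real frames, so any fluctuation in the real frametimes carries straight into the doubled stream, and on a fixed-refresh panel without VRR that uneven cadence is felt as pacing problems.

```python
# Hypothetical frametimes fluctuating around 55-65 fps (15-18 ms per frame).
real_frametimes_ms = [18.2, 15.4, 16.7, 17.8, 15.6, 16.1]

# Presentation schedule: each real frame, plus a generated frame at the
# midpoint of the interval to the next real frame (midpoint interpolation).
t, schedule = 0.0, []
for ft in real_frametimes_ms:
    schedule.append(t)            # real frame
    schedule.append(t + ft / 2)   # interpolated frame at the midpoint
    t += ft

deltas = [round(b - a, 2) for a, b in zip(schedule, schedule[1:])]
print(deltas)   # uneven ~7.7-9.1 ms steps, not a clean 8.33 ms (120 fps)
```

The doubled stream averages out to roughly 120 fps, but the step sizes wobble with the source frametimes, which matches the "felt worse than 60" complaint; hitting the refresh cap (or getting working VRR) is what evens the steps back out.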
"gimmicks like g-sync"
"If you have a 165Hz monitor, play without syncing and forget about using G-Sync/FreeSync/VRR."
Dude, just stop embarrassing yourself.
If you're gonna backpedal so hard, get a rearview mirror.
Suddenly VRR is a gimmick because FSR3 can't support it, lol.
You know why it doesn't? Because I tried it today on my 6800, and frame pacing is totally broken with AMD's frame generation unless you're synced to the max refresh rate.
I have zero interest in what "pro players" do; I play single-player games only, like most PC gamers. Just look at HUB's reviews: people are interested in triple-A game performance at Ultra/High settings. You can run the things "pro players" play on cards like a 4060, so your "the more money you spend" excuse is just an absolute miss.