I know this article is about half a year old, and by now Vega's out & a known quantity, but I just couldn't stop myself from replying to point out how much absolutely ridiculous, BS-stuffed FUD is in this post (even before Vega was available). To the poster: nearly everything you said is pure and utter bull****.
First off, unless we're talking maxed-out AAA gaming at 4K/60Hz (which not even a GTX 1080 Ti can do), AMD already had plenty of cards fast enough PRIOR TO VEGA to properly take advantage of a 1440p/144Hz or 4K/60Hz monitor with FreeSync. The R9 Fury & Fury X (AMD's Fiji-based GPUs) are both killer 1440p cards that'll push FAR above 40fps in most titles, with the Fury X generally landing right behind the "fast enough for adaptive sync because it says 'Nvidia' on it" GTX 1070, and the regular Fury not far behind that (and both were DRAMATICALLY cheaper, nearly half the price of the 1070 in 2016 & pre-crypto-boom 2017). Hell, even the comparatively ancient (launched 2013) "Hawaii" based cards (290/X, 390/X) kick surprising amounts of arse at 1440p (and absolutely burn any Kepler [GTX 600/700] cards to the freaking ground), and despite being 4+ years old will, more often than not, exceed your "40fps" mark in the latest titles.
And if you crank the resolution up even further to 4K, the Fury X's HBM advantage erases that small gap, bringing it up to a near dead heat with the GTX 1070; in games where it isn't crazy VRAM-limited, it's often even a touch faster (with HBM's 512GB/s of memory bandwidth it holds up far better at 4K than any regular 4GB card, though it still can't compete with a full 8GB in those few VRAM-killing corner cases).
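For anyone wondering where that 512GB/s figure comes from, the back-of-the-envelope math is simple. Bus widths and per-pin data rates below are the published specs as I remember them, so double-check the official spec sheets if precision matters to you:

```python
# Rough peak-bandwidth math: bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte.
# Specs quoted from memory; verify against the official spec sheets before relying on them.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    return bus_width_bits * data_rate_gbps_per_pin / 8

# R9 Fury X: 4 stacks of HBM1 = 4096-bit bus, 500 MHz double-pumped = 1 Gbps per pin
print(peak_bandwidth_gb_s(4096, 1.0))   # 512.0 GB/s

# GTX 1070: 256-bit GDDR5 at 8 Gbps per pin
print(peak_bandwidth_gb_s(256, 8.0))    # 256.0 GB/s
```

Twice the raw bandwidth of the 1070, which is a big part of why the Fury X scales so well as the resolution climbs.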
Lastly, adaptive sync is most beneficial at the LOW end of the GPU stack, not the high end, which is plainly obvious if you've ever tested both or understand how the tech works. Adaptive sync lets a low/mid-range card that isn't quite fast enough for a locked 60fps deliver an equally smooth experience at 40-60fps, without having to start knocking settings down like you would otherwise (there's a quick frame-pacing sketch in the P.S. below if you want the mechanics). Pretty sure this is even discussed in the article. (And it's a MAJOR benefit for AMD/FreeSync, since it comes as a near-free "value add" in today's monitors rather than another $200 expense on top of the ≈$200 people already paid for the card.) Next time, try doing some actual research before you just start pulling random crap outta your arse.
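P.S. If anyone wants to see why that 40-60fps range is exactly where adaptive sync shines, here's a toy frame-pacing sketch. The 45fps / 60Hz numbers are hypothetical, picked purely for illustration, not measurements from any particular card or panel:

```python
# Toy frame-pacing comparison: fixed 60 Hz + vsync vs. adaptive sync (FreeSync/G-Sync).
# Hypothetical steady 45 fps GPU (~22.2 ms per frame); numbers are illustrative only.
import math

GPU_FRAME_MS = 1000 / 45   # ~22.2 ms to render each frame
REFRESH_MS   = 1000 / 60   # fixed 60 Hz panel refresh interval
N_FRAMES     = 8

# Time each frame finishes rendering
done = [GPU_FRAME_MS * (i + 1) for i in range(N_FRAMES)]

# Fixed refresh + vsync: a frame can only hit the screen on the next refresh boundary
def next_refresh(t, eps=1e-9):
    return math.ceil(t / REFRESH_MS - eps) * REFRESH_MS

vsync_shown     = [next_refresh(t) for t in done]
vsync_durations = [b - a for a, b in zip(vsync_shown, vsync_shown[1:])]

# Adaptive sync: the panel refreshes when the frame is ready (inside its VRR window),
# so every frame stays on screen for as long as the GPU took to render the next one
vrr_durations = [GPU_FRAME_MS] * (N_FRAMES - 1)

print("60Hz + vsync on-screen times (ms): ", [round(d, 1) for d in vsync_durations])
print("adaptive sync on-screen times (ms):", [round(d, 1) for d in vrr_durations])
# vsync stutters between ~16.7 and ~33.3 ms; adaptive sync holds a steady ~22.2 ms
```

Same average framerate, completely different smoothness, and that gap only shows up when the card can't hold a locked 60fps in the first place.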