Forspoken requires 150GB of storage and needs an RTX 3070 for 1440p @ 30fps

So did you swear off PC gaming when crysis came out? Just wondering.

Crysis came out in 2007, when the difference between high-end PC gaming and the 360/PS3 was far more noticeable. That's not really the case with the Series X/PS5; it's a case of diminishing returns with an extremely high cost of entry on the PC side.

I have an aging 2600K/OG Titan machine in the kitchen, if I wanted to build a machine in the same ‘tier’ today it’d probably cost £3000-£3500 all in - to play half a dozen games that look marginally better than their console equivalents.
 
Have they not heard of texture compression?
On principle, they don't do texture compression. From their point of view, the "compression" falls on the consumer's "shoulders": buy a new, more powerful video card. In fact, aside from the reduced work on the devs' side (and thus a smaller budget), one might reasonably assume that publishers are hand-in-hand with video card manufacturers, helping them sell more cards for a slice of the "pie".

Savage capitalism for us all.
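For what it's worth, block-compressed GPU texture formats are standard fare, and the savings are easy to put numbers on. A quick back-of-the-envelope sketch in Python (using the fixed BCn block sizes: BC1 packs each 4x4 pixel block into 8 bytes, BC7 into 16 bytes):

```python
# Rough memory/storage footprint of a single 4096x4096 texture
# in common GPU formats. BCn rates are fixed per 4x4 block:
# BC1 = 8 bytes/block (0.5 B/px), BC7 = 16 bytes/block (1 B/px).

def texture_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

MIB = 1024 * 1024
w = h = 4096

uncompressed_rgba8 = texture_bytes(w, h, 4)    # 64 MiB
bc7 = texture_bytes(w, h, 1)                   # 16 MiB (4:1 vs RGBA8)
bc1 = texture_bytes(w, h, 0.5)                 # 8 MiB  (8:1 vs RGBA8)
```

Mipmaps add roughly another third on top of each figure, but the ratios between formats stay the same, so skipping compression multiplies the whole texture budget by 4-8x.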
 
Just shows developers are becoming lazier and leaving optimization out of development. All the 4K textures and lighting hogging that precious space mean nothing if there's no worthy gameplay.
And you don't need 4K to produce a good game. We had great, immensely replayable games at 640x480 or even 320x240 back in the day. And we still do today.
 
RTX 4080 or Radeon 6800 XT for 4k60.

That doesn't make any sense. The 4080 is ~45% faster at 4K than the 6800XT; the Nvidia equivalent to the 6800XT is the 3080. And even if the game requires 16GB of VRAM for Ultra textures, there's no way a 6800XT can push 60fps at 4K with the other settings at Ultra.
 
The specs actually make sense when one notes that the game is a PS5 exclusive for two years. Yes, it's also a Windows release but note that all the hardware requirements are issued AMD first - that's not a simple alphabetical listing, it's due to the partnership between Square Enix and AMD for this particular title.

The hardware specs for the PS5 are reasonably similar to the Recommended PC requirements:

PS5 (Claimed 4K@30fps, 1440p@60fps, RT mode@?fps)
CPU = 8C/16T, Zen 2, 3.5 GHz
GPU = 36CU, RDNA 2, 2.23 GHz
RAM = 16GB GDDR6 shared

PC Recommended (1440p@30 fps)
CPU = 6C/12T, Zen 2, 4.2 GHz boost Ryzen 5 3600
GPU = 40CU, RDNA 2, 2.4 GHz Radeon RX 6700 XT
RAM = 24GB

The developers of Forspoken also made Final Fantasy XV, which was fairly well optimized on the PC on release -- up to a point. The 4K performance was grim, but that was about normal for the time. I get the feeling that the team has just thrown all of the bells and whistles at the PC port but lack the resources, or managerial direction, to really streamline it properly.
 
Ah, it's definitely here again: "PC gaming is dying", "consoles rule!", just like back in the 2000s. This is one of the ways it starts, along with ridiculous hardware prices. Poor kids.
 
So did you swear off PC gaming when crysis came out? Just wondering.
Even with Crysis, the 8800GTX couldn't pull 60fps on 1280x1024 resolution (that was the monitor resolution I was using back then):
[attached benchmark chart: crysis_1280_1024.gif]


No one was playing Crysis at 60fps outside of maybe some SLI setups.

Unless I'm remembering wrong, there were no GPUs that cost $1200 back then. Sure, if you really wanted to get close you could have priced out an 8800 GTX or Ultra. I think the GTX came in around $650 and the Ultra was around $850. Honestly, the only real way to hit $1200+ for a GPU back then was to actually buy 2 and run them in SLI.

Most folks had a complete gaming system that was quite capable for a total cost of around $800-1000.

The fact that today a high mid-tier GPU costs $800+ and a high-end card costs $1200+ is disgusting, if you ask me. You're now paying as much for just the GPU as you did for a complete build 15-16 years ago.
 
6700XT and 3070 for 1440p 30fps

6800XT and 4080 for 4K 60fps ultra setting ?!


It doesn't make any sense. How can a 6800XT run twice the fps at a much higher resolution and settings than a 6700XT? I thought the 6800XT averages ~30% higher fps than the 6700XT.

The 12GB of VRAM on the 6700XT can't be the bottleneck, because they're recommending an 8GB Nvidia card at 1440p. If a 6800XT can do 60fps at 4K, then a 6700XT will easily do over 60fps at 1440p (a 6700XT performs better at 1440p than a 6800XT does at 4K).

I had the same question: is this an AMD-sponsored title?

Also, FSR but no DLSS.
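The arithmetic behind that objection is easy to check. A rough pixel-throughput sketch (taking the ~30% gap between the two cards as an assumed figure, per the posts above, and ignoring CPU limits and upscaling):

```python
# Sanity check: what speedup do the quoted requirements imply?
px_1440p = 2560 * 1440                  # 3,686,400 pixels per frame
px_4k    = 3840 * 2160                  # 8,294,400 pixels per frame

res_ratio = px_4k / px_1440p            # 2.25x the pixels at 4K
fps_ratio = 60 / 30                     # 2x the target frame rate
required_speedup = res_ratio * fps_ratio    # 4.5x the pixel throughput

assumed_6800xt_advantage = 1.3          # ~30% faster than a 6700XT
```

So the requirements imply the 6800XT delivers ~4.5x the throughput of the 6700XT, when the real gap is closer to 1.3x, which is exactly why the spec sheet looks inconsistent.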
 
I can't quite see who they're writing games like this for. Only a tiny percentage of folks have these specs. Even if I had a PC like this, I still didn't see anything in the video that would make me want to try the game out. I think the real problem is giving developers hardware that is light years ahead of what gamers actually own; that made sense when everybody was regularly upgrading, but the outrageous cost of modern GPUs means most folks just aren't upgrading anymore.
 
I'm gonna play it because it came free with a 6800xt I bought that ended up broken.

It's rare for me to shout "unoptimized", but that seems to be the case here, big time. My rig checks all the boxes (11900, 3070 Ti, 32GB of RAM, Gen 3 & 4 NVMe drives), and the game's graphics just don't seem to need that much power.

I'm curious, though, whether DirectStorage will play a big part in its performance. I doubt it, because what would it need to constantly load at such high speeds?

Also, if this wonky game murders one of my drives, I'm gonna be p*ssed.
 
I'd wager High settings will still look very good and run 30-50% faster. Ultra vs. High is pretty hard to pick out in screenshots, except for things like draw distance, etc., but shadow res, particle density, and 4K vs. 2K textures all look pretty similar, imo.
 
Can we please get the PS5 GPU on the market? At least then I wouldn't have to upgrade every year; it should last me about 6-10 years, right?
 
Introducing the least optimized PC game of the year: Forspoken!!

Can't say watching that trailer made me think this was a new game. It looks like an old game with some modern eye candy trying to dress it up, but it fails because the underlying product just looks dated.

But please tell me it needs an RTX 4080, a 12th-gen i7, 32GB of RAM, and 150GB of NVMe storage... Are they trying to sell PC hardware or a game here? Not so sure.
 
I can remember way back in the infant PC gaming years (the 286/386/486 days), when a brand new game might just require you to get a new rig, much like a new console generation. Wing Commander II comes to mind as the one that got me to ditch the 286 and move to a 386. In more modern times, the old infamous "Can it run Crysis?" joke/question exists for a reason: the game was future-proofed almost to the point that new average machines had trouble with it. If a new AAA game requires top-end specs to see its full PC-version glory, I have no problem with it. If it's a big, buggy dumpster-fire mess, that invalidates everything I just said. But the most recent PlayStation PC ports have been very solid and make very good use of the specs available on a high-end PC. If it's the same coders who did those ports, and the game is gorgeous, fluid, and bug-free, then I'm in. And if so, this would be another one of those games that would require me to upgrade, if I were going purely by recommended specs (barely: 3950X w/ 3080 Ti). Is this game worth that upgrade? Doubtful. But I do like that some PC gaming devs are keeping the envelope pushed; what's the point in ****ing your money away if you aren't going to have anything that will properly make use of it?
 
If not, the PS5 GPU is a beefed-up RX 6700, worse than the RX 6700 XT.
Ultimately it's not quite as good as either of them, as the peak clock rate of the PS5 GPU is capped at 2.23 GHz (6700 is 2.45 and the XT version is 2.58) and there's no L3 cache either -- the extra cache level goes a fair way to improve the efficiency of the CUs in the discrete cards.

It does have a lot more global memory bandwidth than the two cards, but that GDDR6 is system-wide, so the DRAM modules have to contend with read/write demands from the two CCXs in the SoC, and the PCH die.
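Assuming the usual RDNA 2 figures (64 shaders per CU, 2 FP32 ops per clock per shader), the peak-compute gap from those clocks is easy to put numbers on:

```python
# Peak FP32 throughput: CUs x 64 shaders x 2 ops/clock x clock (GHz) / 1000.
def tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000.0

ps5      = tflops(36, 2.23)   # ~10.3 TFLOPS, clock capped, no L3
rx6700   = tflops(36, 2.45)   # ~11.3 TFLOPS, plus Infinity Cache (L3)
rx6700xt = tflops(40, 2.58)   # ~13.2 TFLOPS, plus Infinity Cache (L3)
```

The ~10.28 TFLOPS result for the PS5 matches Sony's own quoted peak figure, and on paper both discrete cards come out ahead even before counting the L3 cache advantage.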
 
Ultimately it's not quite as good as either of them, as the peak clock rate of the PS5 GPU is capped at 2.23 GHz (6700 is 2.45 and the XT version is 2.58) and there's no L3 cache either -- the extra cache level goes a fair way to improve the efficiency of the CUs in the discrete cards.

It does have a lot more global memory bandwidth than the two cards, but that GDDR6 is system-wide, so the DRAM modules have to contend with read/write demands from the two CCXs in the SoC, and the PCH die.
Thanks for the clarifications.

People get overwhelmed by the Forspoken or Returnal requirements for Ultra settings and forget that the consoles have modest hardware, and that you can get similar performance and good-looking graphics with similar PC hardware that doesn't cost an arm and a leg.
 
Can we please get the PS5 GPU on the market? At least then I wouldn't have to upgrade every year; it should last me about 6-10 years, right?
The PS5 has a 5700XT/6600XT-class GPU, RDNA 1st gen (Google it).

And you can surely run this game well on an older higher-tier GPU, too. I've had my 5700XT from the start and I'm not planning on a new GPU yet; it still meets my demands.
 