Game streaming just won't happen (at least in the States) until datacaps are removed from ISPs.
TPU did a review of Google Stadia when it was in beta and getting close to launch. From what I gathered from the info provided, the data use was too high for any avid gamer who has a datacap.
You just can't afford to blow through all your data within a week or two. Then you're left paying more money to your ISP to cover the stupid overage charges they'll tack on. Or you simply avoid doing anything online until your billing month starts over and your datacap resets.
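To put the datacap problem in rough numbers (the GB/hour rate below is my own ballpark assumption for illustration, not a figure from TPU's review):

```python
# How long a monthly datacap lasts at a given streaming data rate.
# The 15 GB/hour figure is an assumed ballpark, NOT from TPU's Stadia review.
def hours_until_cap(cap_gb: float, gb_per_hour: float) -> float:
    """Hours of game streaming that fit inside a monthly datacap."""
    return cap_gb / gb_per_hour

# A common 1 TB (1000 GB) monthly cap at an assumed 15 GB/hour:
print(round(hours_until_cap(1000, 15), 1))
```

At that assumed rate, a bit over two hours a day of streaming empties the cap, before counting any other household traffic.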
If you dabble in gaming, a handful of hours a week, a streaming service might really suit you well if you don't want the hassle of spending money on a console or gaming PC.
But not if you compare it to buying a new GPU every 2-4 years. At $240/year, you could stream for 4 years for under $1,000, whereas a new GPU like a 7900 or 4080 is going to cost you $1,000 in hardware alone, not to mention the utility bills.
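The comparison above is just multiplication, but spelled out as a quick sketch (using the $240/year and $1,000 GPU figures from the comparison, with utility costs left out):

```python
# Sketch of the streaming-vs-GPU cost comparison: $240/year subscription
# versus a $1,000 card up front (utility bills excluded on both sides).
def streaming_cost(years: int, per_year: float = 240.0) -> float:
    return years * per_year

gpu_price = 1000.0
print(streaming_cost(4), gpu_price)  # 4 years of streaming vs the card alone
```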
The problem some people run into is they absolutely believe they need to upgrade because, for some reason, they think 4K is the bee's knees. If they can't have that 4090 or whatever other crap card to push a game with RT on, plus software to downscale and then upscale for playable frames, then the world is melting and their lives are ruined!
4K - it's niche, those people can freakishly spend their money and constantly panic about fps
1440p - it's a nice spot to be, and most cards, even from the 1080/1080Ti era, can still run games at acceptable levels (sure, some settings may need tweaking, and clearly you won't be pulling 500+ fps! Oh noes!).
1080p - still a great way to play games. Lots of older cards will still crush 1080p.
I ran my 980Ti for 6 years. I used it at 1920x1080 and 5760x1080 (as long as the game supported that wide resolution well). At 1920x1080, even towards the end of its life with me, it ran games exceedingly well. At 5760x1080, as time went on I had to tweak settings to stay around that 60fps mark, but it still played games well.
With that aside, the cost to run a computer really isn't that much, but that does depend on where you live.
My city charges $0.119 per kWh.
If my gaming system draws 600W and I gamed 8 hours a day, that's 4,800Wh, or 4.8kWh, used per day.
4.8 * 7 = 33.6kWh per week
33.6 * .119 = $4 a week
That's just shy of $208 a year, or $0.57 a day if you'd rather look at the per-day cost.
($4 a week is less than one of those stupid fancy coffees folks like to buy nearly every day. It's about the price of an energy drink.)
400W, 8 hours a day, 52 weeks = $138.61 a year ($0.38 a day)
750W, 8 hours a day, 52 weeks = $260 a year ($0.71 a day)
1000W, 8 hours a day, 52 weeks = $346.53 a year ($0.95 a day)
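If you want to plug in your own wattage or electricity rate, all the figures above come from one formula. A small sketch, using the same inputs as my numbers (8 hours a day, 52 weeks, $0.119 per kWh):

```python
# Sketch of the electricity math above: system draw in watts -> yearly cost.
# Defaults mirror the post: 8 hours/day, 52 weeks, $0.119 per kWh.
def yearly_cost(watts: float, hours_per_day: float = 8,
                rate_per_kwh: float = 0.119, weeks: int = 52) -> float:
    kwh_per_day = watts / 1000 * hours_per_day      # e.g. 600W * 8h = 4.8 kWh
    return kwh_per_day * 7 * weeks * rate_per_kwh   # days -> weeks -> dollars

for w in (400, 600, 750, 1000):
    print(f"{w}W: ${yearly_cost(w):.2f}/year, ${yearly_cost(w) / (7 * 52):.2f}/day")
```

Swap in your own local rate per kWh and actual draw at the wall and the picture doesn't change much: it's pocket change per day.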
If an extra quarter or more a day is too much to pay for your gaming needs on a high end system, then clearly the high end gaming system isn't geared towards someone like you.