Windows 10 game bar gains a frame rate counter and achievement overlay

onetheycallEric

In brief: While gaming overlays are nothing new, Microsoft insists that the Windows 10 game bar -- or Xbox game bar, if you'd rather -- is the best overlay for gaming on Windows, as it's baked right into the OS. To its credit, Microsoft has added some compelling reasons to give it a shot, with a widget-based layout and integrations that allow for a highly customizable experience. With a new FPS counter and achievement tracking overlay, Microsoft is making a stronger case for the game bar.

Microsoft continues to polish the Windows 10 gaming experience. Since its complete overhaul with the May 2019 update, Microsoft has been working to make the Windows 10 game bar something of a centerpiece for gaming on Windows, revamping it with integrations for Spotify and Steam, as well as widgets for performance monitoring.

Now, the Windows 10 game bar is getting two new features: an FPS counter and an overlay for achievement tracking. Mike Ybarra, Corporate Vice President of Xbox Program Management, teased the news via Twitter.

The FPS counter is self-explanatory; it displays the frame rate in real time for the game being played. The achievement overlay seems to be more of a quality-of-life feature, allowing users to track their achievement progress at a cursory glance.
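Counters like this typically work by timestamping each rendered frame and averaging over a short rolling window, rather than reporting the instantaneous time of a single frame. A minimal sketch of that idea (the class, window size, and names are illustrative assumptions, not Microsoft's implementation):

```python
import time
from collections import deque

class FpsCounter:
    """Rolling-window FPS estimate over the last `window` frames."""
    def __init__(self, window=60):
        self.timestamps = deque(maxlen=window)

    def tick(self, now=None):
        """Call once per rendered frame; returns the current FPS estimate."""
        self.timestamps.append(time.perf_counter() if now is None else now)
        if len(self.timestamps) < 2:
            return 0.0
        elapsed = self.timestamps[-1] - self.timestamps[0]
        # (frames - 1) intervals span `elapsed` seconds
        return (len(self.timestamps) - 1) / elapsed if elapsed > 0 else 0.0

# Simulate a steady 60 FPS game loop with synthetic timestamps
counter = FpsCounter()
fps = 0.0
for frame in range(120):
    fps = counter.tick(now=frame / 60.0)  # one frame every ~16.67 ms
print(round(fps))  # → 60
```

Averaging over a window is why on-screen counters read steadily instead of flickering with every frame-time hiccup.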

You can update the Windows 10 game bar through the Microsoft Store here. However, some users on Twitter are reporting that the updates aren't appearing for them, so it isn't yet clear whether the updates are restricted to certain Windows Insider preview rings.


 
Hey Microsoft, how about an API that lets app devs hook into Windows Update to keep their apps up to date automatically? As it stands, every app that's not on the Windows Store needs its own update checker built into the software, which can slow down startup times and the system as a whole by leaving a bunch of programs running at startup or in the background. Not to mention you still have to manually download the file and install it. Letting other apps use Windows Update to stay current would be great, and of course it would all be an optional choice for the devs. I know it would save a lot of people some time.
 
How about they get it to support OpenCL, CUDA, or NVENC? That would get my attention. As it stands, there's too much overhead on the CPU -- offload some of that to the GPU.
 
It's sad how FPS is used to justify laying out ridiculous amounts of money on PC hardware.

Even more so when most gamers don't realize that FPS is not an absolute measure of input lag and latency in games. BattleNonsense just did a video showing that a system with lower overall FPS can achieve significantly lower input lag:

As it turns out, the input lag penalty of running a GPU at or near max utilization exceeds any gain from higher FPS. Makes sense, given a GPU at 100% cannot process new requests in a timely manner.

The same should also apply to CPUs. Someone should test first-gen Ryzen CPUs against their Intel counterparts of the time. Plenty of people reported their games "feeling" smoother, but no one quantified that in numbers, and it's surprising that no one has run a test like BattleNonsense's since. I would very much like to see TechSpot do an article testing this for both GPU and CPU, as it could change the way we benchmark -- we'd need utilization metrics included. Technically speaking, a CPU with higher 1% lows should deliver more consistent and lower input lag than a CPU with marginally higher average FPS but spiking frame times, since those spikes indicate periods where input lag increases significantly.
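For what it's worth, the "1% low" figure reviewers quote is usually derived from per-frame times: take the slowest 1% of frames and report the FPS they correspond to. A rough sketch with synthetic data (the helper name and the runs are made up for illustration) shows why two systems with the same average FPS can feel very different:

```python
import statistics

def fps_metrics(frame_times_ms):
    """Return (average FPS, 1% low FPS) from per-frame times in milliseconds."""
    frames = sorted(frame_times_ms)          # ascending: fastest frames first
    avg_fps = 1000.0 / statistics.mean(frames)
    worst = frames[-max(1, len(frames) // 100):]  # slowest 1% of frames
    low_1pct_fps = 1000.0 / statistics.mean(worst)
    return avg_fps, low_1pct_fps

# Two synthetic runs with near-identical averages, one with occasional spikes
steady = [10.0] * 1000                 # flat 10 ms frames
spiky  = [9.5] * 990 + [60.0] * 10     # mostly faster, but 1% of frames spike

for name, run in [("steady", steady), ("spiky", spiky)]:
    avg, low = fps_metrics(run)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
```

The steady run reports ~100 FPS for both metrics, while the spiky run reports the same ~100 FPS average but a 1% low of only ~17 FPS, which is exactly the kind of stutter an average-only benchmark hides.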
 

You are already posting the definitive source for that data. Chris (BattleNonsense) will get to that, I'm sure.
 
It's sad how fps is used to justify outlaying ridiculous amounts of money on PC hardware.
You probably have the most expensive rig on this forum and you're talking about how ridiculous spending money on FPS is? Don't you game at 144hz on your i9?
 


I like to be able to turn up every single option to max/ultra, but I don't chase FPS numbers.
 
untrue....

It's possible to have a game's detail turned to maximum but get poor FPS.

I can't tell you the last time I checked my FPS.
Well, you don't own an i9 and a 2080 Ti to get poor FPS at max detail. You could max out a game on a 1050; doesn't mean it's playable.
 


I'd feel more comfortable with a 1080 or 1080 Ti as the minimum GPU I'd use.

It all depends on the game, of course. I play at 3440x1440, and my 1080 gets ~30 FPS (not high enough) in Ark: SE on Ultra with a max overclock, while Rocket League gets 60 FPS on Highest with the same card set to a max underclock and undervolt (it runs at ~700 MHz and ~0.6 V). FWIW, I have carefully tuned Ark's settings to get 60 FPS with dips to around 50 in the busiest areas.
 
I wonder when they'll add Xbox One controller support to Game Bar so that I can capture video and screenshots while playing a game. I don't use a keyboard, so the shortcuts/hotkeys are useless for me. I do like that it works on any game. Also, I like how it puts the game name at the start of the filename so you know which game the screenshot was from.

Personally, I don't care about frame rate as long as the game isn't too laggy. I mostly play Skyrim, so it isn't a huge concern!
 