What just happened? Did you know Thursday is Global Accessibility Awareness Day? Yeah, me neither. But apparently, this made-up holiday provides companies the perfect opportunity to show how inclusive they are by announcing features that make their products more accessible. Apple recognizes the day by showcasing functions that expand its growing list of accessibility features.

Although not due out until later this year, Apple has revealed several additions to the accessibility settings for Macs, iPhones, iPads, and Apple Watches. While the features are intended to help those with disabilities more easily use Apple devices, some are intriguing alternatives for those looking for more convenient input methods --- particularly the new gesture controls for Apple Watches, but more on that in a minute.

One of the first features revealed is Door Detection, designed with blind and low-vision users in mind. It uses the camera, LiDAR scanner, and machine learning on newer iPhones and iPads to help people navigate buildings more easily.

When users arrive at a new location, the function can tell them where a door is, how far away it is, and how it opens --- by turning a knob, pulling a handle, and so on. It can also read signs and symbols around the door, like room numbers or accessibility markings.

Next, Apple is developing Live Captions for the hearing impaired. Live Captions aren't wholly innovative. Android devices have had a similar feature for a while, but now those with iPhones, iPads, or Macs can have real-time closed captioning overlays on video and FaceTime calls. It can also transcribe sounds around the user.

However, two features set Live Captions apart from Android's version. One is the ability to add name tags to FaceTime speakers, making it easier to track who is talking. The other is Mac-specific: users can type a response and have it read aloud in real time. This latter feature could be helpful for people with aphasia or others who have trouble speaking. Unfortunately, Live Captions will only be available in English when Apple releases the beta in the US and Canada later this year.

Last but not least, there are a couple of cool Apple Watch features. The first is Mirroring. This setting allows folks with motor-control issues to operate an Apple Watch without fumbling with the small screen. It syncs the watch with the user's iPhone using AirPlay, allowing various input methods, including voice control, head tracking, and external Made for iPhone switch controls.

Another innovative accessibility feature for the Apple Watch is Quick Actions. These are simple finger movements, such as touching the index finger and thumb together (a pinch), that Apple first introduced last year. The watch detects these motions as input. This year, Apple has improved detection and added more functions to the list of things users can control.

For instance, a single pinch advances to the next menu item, and a double pinch goes back to the previous one. Answering or dismissing a call while driving with a simple hand gesture could prove very handy, even for those without motor-control issues. Users can also use gestures to dismiss notifications, snap the shutter in the camera app, pause media in the Now Playing app, and control workout sessions. There are probably many other examples, but those are the specific use cases Apple mentioned.

A few other features are arriving later this year, including Buddy Controller, Siri Pause Time, Voice Control Spelling Mode, and Sound Recognition. You can read up on what these do in Apple's press release.