
Replacing verbal cues with some other method of relaying info to the player?

I really like an obscure game with an odd mechanic in it. I want to make a clone of it so I can keep playing it, even if its graphics and audio are inferior to the original!

This game has a highly unusual mechanic, though. It relies on voice acting to relay info to the player (or at least to the opponent; it used to have online features). Not having this signalling would render half of the game's mechanics unusable.

Obviously, I don't have the ability to add voice acting to my game. So what other method can I choose? Having some sort of graphic or text pop up on the screen indicating the opponent's action would require the player to divert their eyes elsewhere on the screen. I could have unique sound effects for every action, but there are literally dozens of them! Besides, it would take quite a while to learn which sound is associated with each action. With voice acting, the character is literally just calling out the name of the action they're performing!

Is there any other method I could use to relay this information? Or is a game like this simply impossible to play without audio? I mean, how many games are there that you literally couldn't play at all if you're deaf? Some sort of audio cue seems a practical necessity.

The Access-Ability Summer Showcase is my favourite part of Summer Game Fest week

Periphery Synthetic, by ShiftBackTick, is a glorious thing. It's an exploration game and a playable EP. Existing somewhere between MirrorMoon, Soundvoyager for GBA, and Outer Wilds, it casts you out across a series of different planetary surfaces and asks you to make your own sense of everything.

When I play the current demo, the screen draws an undulating alien terrain in little blocks of light. But I can play the game even when I close my eyes. Periphery Synthetic is made to be fully playable without seeing the screen. Players can use echolocation and "terrain sonification" to navigate its spaces, while using screen readers to move through menus. It's a wonderful thing to try out, sounds changing in pitch to tell me whether I'm ascending or descending, whether I'm turning back on myself or headed somewhere new.
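To make the sonification idea concrete, here's a toy sketch (invented names and constants, not the game's actual code) of how pitch can encode ascent and descent:

```kotlin
import kotlin.math.pow

// Toy terrain sonification: map the player's elevation change to a tone's
// pitch so that rising ground sounds higher and falling ground sounds lower.
// All names and constants here are illustrative, not from Periphery Synthetic.

const val BASE_FREQ_HZ = 220.0       // tone heard on flat ground
const val SEMITONES_PER_METER = 2.0  // how strongly slope bends the pitch

fun frequencyFor(elevationDelta: Double): Double {
    // Shift the pitch by a number of semitones proportional to the slope;
    // each semitone multiplies the frequency by 2^(1/12).
    val semitones = elevationDelta * SEMITONES_PER_METER
    return BASE_FREQ_HZ * 2.0.pow(semitones / 12.0)
}

fun main() {
    println(frequencyFor(3.0))   // ~311 Hz: climbing, audibly higher
    println(frequencyFor(0.0))   // 220 Hz: flat ground
    println(frequencyFor(-3.0))  // ~156 Hz: descending, audibly lower
}
```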

I found out about Periphery Synthetic in this year's Access-Ability Summer Showcase, which is created and hosted by accessibility consultant and author Laura Kate Dale. Like last year's show, it brings together a bunch of games that have brilliant accessibility options, and is itself available in ASL, BSL and audio-described versions.


Phones with Bluetooth hearing aid compatibility (updated June 2024)

More and more people have some form of hearing impairment in this increasingly loud day and age. You may remember hearing aids as bulky devices sitting behind your ears that barely did enough to make people hear again, but that's far from the truth today. Most hearing aids now support some form of audio streaming. Still, you need a compatible Android phone for most of them.

Android 15 improves accessibility with better hearing aid support

  • Google has announced that the Android 15 update will improve the platform’s support for hearing aids.
  • The latest release will work with hearing aids that support Bluetooth LE Audio.
  • The update will also offer better hearing aid management features like a Quick Settings tile, the ability to change presets, and the ability to view the battery level.


Because Android is used by billions of people worldwide, Google has to design the operating system with accessibility in mind. Hundreds of millions of people suffer from a degree of hearing loss, which is why Android offers assistive features like Live Captions. There’s only so much that Android itself can do to compensate for hearing loss, though, which is where dedicated assistive devices like hearing aids come in. Android has technically supported hearing aids since Android 10 was released in 2019, but with the upcoming update to Android 15 later this year, the operating system will significantly improve support for them.

Hearing aids, if you aren’t aware, are a type of electronic device that’s designed to help people with hearing loss. They’re inserted into your ears, similar to other types of hearables like wireless earbuds, but their main purpose is not to stream music but to amplify environmental sounds so you can hear better. Many sounds originate from your phone, though, which is why many hearing aids nowadays support Bluetooth connectivity. People with hearing loss want to be able to hear who they’re speaking to in voice calls, watch videos on YouTube, or even listen to music, all of which is possible thanks to Bluetooth.

However, hearing aids, unlike wireless earbuds, absolutely need to have all-day battery life. That’s challenging to achieve when using a standard Bluetooth Classic connection to stream audio from your phone to your hearing aids. Streaming audio between two devices connected via Bluetooth Low Energy (Bluetooth LE) is more battery efficient, but for the longest time, there wasn’t a standardized way to stream audio using Bluetooth LE.

That left things up to companies like Apple and Google to create their own proprietary, Bluetooth LE-based hearing aid protocols. Apple has its Made for iPhone (MFi) hearing aid protocol, while Google has its Audio Streaming for Hearing Aids (ASHA) protocol. The former was introduced to iOS way back in 2013, while the latter was introduced more recently in 2019 with the release of Android 10. While there are now several hearing aids on the market compatible with both MFi and ASHA, the fragmentation problem remains. Any advancements in the protocol made by one company will only be enjoyed by users of that company’s ecosystem, and since we’re talking about an accessibility service that people rely on, that’s a problem.

Fortunately, there’s now a standardized way for devices to stream audio over Bluetooth Low Energy, and it’s aptly called Bluetooth LE Audio. LE Audio not only supports the development of standard Bluetooth hearing aids that work across platforms but also implements new features like Auracast. We’ve already shown you how Android 15 is baking in better support for LE Audio through a new audio-sharing feature, but that’s not the only LE Audio-related improvement the operating system update will bring.

At Google I/O earlier this month, Google announced that Android 15 will support hearing aids that use both Bluetooth LE Audio (LEA) and the company's ASHA protocol. Furthermore, the update will introduce a new Quick Settings tile that makes connecting to and disconnecting from hearing aids much easier. The hearing aid Quick Settings tile is already live in Android 15 Beta 2, in fact, but I don't have any hearing aids myself to test this feature out.

Android 15 hearing aids with LEA and ASHA

Credit: Mishaal Rahman / Android Authority

According to the images that Google showed, though, the Quick Settings pop-up will let users toggle various accessibility features like Live Caption, Live Transcribe, and Sound Notifications. It'll also let users change the hearing aid preset, which “represents a configuration of the hearing aid signal processing parameters tailored to a specific listening situation,” according to the Bluetooth SIG. The exact presets that can be selected depend on what the hearing aid reports to Android. In the example image that Google shared, there were presets for “Restaurant,” “Music,” “TV,” “Outdoors,” and “All-Round.” Finally, Google says that users will also be able to view the battery level of their connected hearing aids directly within Android's Settings and Quick Settings.
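To make the preset concept more concrete, here is a minimal sketch built from the Bluetooth SIG definition quoted above. The class and method names are invented for illustration; they are not Android's actual hearing aid APIs:

```kotlin
// Hypothetical model of HAP-style presets; the class and method names are
// invented for illustration and are NOT Android's real hearing aid APIs.

data class HearingAidPreset(val index: Int, val name: String)

class HearingAidDevice(private val presets: List<HearingAidPreset>) {
    var activePreset: HearingAidPreset = presets.first()
        private set

    // The hearing aid, not the phone, defines which presets exist.
    fun availablePresets(): List<HearingAidPreset> = presets

    fun selectPreset(index: Int) {
        activePreset = presets.first { it.index == index }
    }
}

fun main() {
    // Preset names taken from the example image Google shared.
    val device = HearingAidDevice(
        listOf(
            HearingAidPreset(0, "All-Round"),
            HearingAidPreset(1, "Restaurant"),
            HearingAidPreset(2, "Music"),
            HearingAidPreset(3, "TV"),
            HearingAidPreset(4, "Outdoors"),
        )
    )
    device.availablePresets().forEach { println("${it.index}: ${it.name}") }
    device.selectPreset(1)
    println("Active: ${device.activePreset.name}") // Active: Restaurant
}
```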

Improved hearing aid support isn’t the only accessibility-related improvement coming to Android. During Global Accessibility Awareness Day earlier this month, Google announced a number of accessibility updates to its Android apps, including Lookout, Look to Speak, Project Relate, and more. These changes, along with the upcoming improvements to Live Captions that we recently detailed, will make Android even more accessible to people with difficulty hearing or seeing.

Google will soon let you resize Android’s Live Captions feature

  • Android’s Live Captions feature will soon add a grab bar that users can drag to change the number of lines shown for captions.
  • This upcoming feature was announced at Google I/O and was said to be rolling out this month.
  • We also recently discovered that Live Caption will soon add new customization options around emojis as well as other features.


Live Caption is one of Android's best accessibility features. The feature automatically generates captions, in real time, for any speech detected in audio playing from your phone. This is a really useful feature for people who have difficulty hearing, but it can also come in handy for anyone who can't raise their phone's volume enough to make out what's being said. The generated captions are shown in a floating box that currently can't be resized, but that's set to change in an upcoming update.

At Google I/O earlier this month, the company talked about the major new accessibility features it’s bringing to Android. It started off by talking about how the Android 15 release improves the platform’s support for hearing aids, how the Sound Notifications feature has been updated to be more accessible, how the Project Relate app’s onboarding process has been improved, and so on. The company highlighted most of these changes in a blog post published during Global Accessibility Awareness Day (GAAD), but one change it didn’t highlight on GAAD was the rollout of a new grab bar for Live Caption.

The new grab bar will let users “easily change the number of lines shown for captions.” This feature is supposedly rolling out “this month,” i.e., May 2024. However, we're near the end of May and have yet to hear any reports of it rolling out. Regardless, Google shared an image of what it'll look like during its presentation, so we'll know what to look out for when it does roll out.

Live Caption grab bar

Credit: Mishaal Rahman / Android Authority

Although Google says Live Caption's new grab bar will let users tailor the number of caption lines shown, it didn't specify the maximum. The image it shared shows four lines, and more are likely possible, given that the current window already fits two or three lines depending on text size. Speaking of which, it's already somewhat possible to fit more text in the Live Caption window simply by enabling “caption preferences” in settings and overriding the text size to a smaller value.

Note that this is a system-wide setting, which means that media apps with their own built-in captions might also be affected. Once the grab bar feature rolls out, though, this won't be a problem, since you won't need to change the system-wide caption text size in order to show more lines in the Live Caption window.
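For context, those system-wide preferences are exposed to apps through Android's public CaptioningManager API (available since Android 4.4), which is why every app that honors them reacts to the same value. Here's a minimal sketch of reading them (the helper function itself is hypothetical):

```kotlin
import android.content.Context
import android.view.accessibility.CaptioningManager

// Read the user's system-wide caption preferences. Any media app that honors
// CaptioningManager sees the same values, which is why shrinking the caption
// text size for Live Caption also shrinks captions in other apps.
fun logCaptionPrefs(context: Context) {
    val cm = context.getSystemService(Context.CAPTIONING_SERVICE) as CaptioningManager
    println("Captions enabled: ${cm.isEnabled}")
    println("Caption font scale: ${cm.fontScale}") // e.g. 0.5 after overriding to a smaller size

    // React if the user changes the setting while the app is running.
    cm.addCaptioningChangeListener(object : CaptioningManager.CaptioningChangeListener() {
        override fun onFontScaleChanged(fontScale: Float) {
            println("Caption font scale changed to $fontScale")
        }
    })
}
```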

We don’t know exactly when this feature will roll out, but its rollout is likely imminent, given that Google said it’s coming this month. Google didn’t specify whether this feature will be rolling out to its own Pixel devices first or if it’ll come to other Android devices that have Live Caption. Live Caption is part of the Android System Intelligence app, which Google provides to OEMs in two flavors. The “Private Features” version of the app includes support for Live Caption, and it’s found on devices from many different Android brands like OnePlus, ASUS, and others.

As we spotted earlier this month, alongside the grab bar, Live Caption is also expected to gain several customization features. These features include toggles to show emoji icons, emphasize emotional intensity, include emotional tags, and show the word duration effect.

Hello, can I ask how difficult it is for developers to add accessibility features to games? I am aware it probably varies by type. Recently, I asked if a sound-only minigame in one video game could be reworked to add visual cues, as I am deaf. A lot of other fans harped on me that it's too much work for little gain, too difficult, that it takes away precious developer time, etc. So now I wonder how complicated such a thing actually is and how devs view it. Thank you.

They're not wrong that building such things isn't free. However, you're also right that we on the dev side should be thinking about better ways of doing this - there isn't only one solution to these problems. Whatever solution we ultimately implement doesn't have to be the most expensive one; it's up to us to think of better and more efficient ways of doing the things we want to do. Adding accessibility options is often a worthy goal, not only for the players who need those options to be able to play, but also for general quality of life. If we're making changes after the fact, of course they're super expensive. If accessibility options are a production goal that we plan for, they're much cheaper, because we don't have to redo work - we do it with accessibility in mind in the first place.

For example - let's say that we're working on a UI where the player's choices are differentiated only by their color.

If we wanted to make this more accessible for colorblind players, instead of just using color to differentiate the choices, we could also add different border visuals to provide additional context.

In such a situation, the difference between choices is still obvious if you're colorblind, and it helps legibility for non-colorblind players as well.
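As a minimal sketch of that approach (invented names, not tied to any particular engine), the trick is to drive every visual channel from one style table so that color is never the only differentiator:

```kotlin
// Illustrative sketch, not from any real engine: each choice type carries
// both a color AND a border style, so color is never the only signal.

enum class BorderStyle { SOLID, DASHED, DOTTED, DOUBLE }

data class ChoiceStyle(val colorHex: String, val border: BorderStyle)

// One lookup drives every channel; adding a new channel later (icons,
// patterns) only means extending this table, not re-auditing the whole UI.
val choiceStyles = mapOf(
    "attack"  to ChoiceStyle("#D32F2F", BorderStyle.SOLID),
    "defend"  to ChoiceStyle("#1976D2", BorderStyle.DASHED),
    "special" to ChoiceStyle("#7B1FA2", BorderStyle.DOUBLE),
)

fun renderChoice(kind: String, label: String) {
    val style = choiceStyles.getValue(kind)
    // A colorblind player can still tell choices apart by border alone.
    println("[${style.border}] $label (${style.colorHex})")
}

fun main() {
    renderChoice("attack", "Strike")
    renderChoice("defend", "Guard")
    renderChoice("special", "Limit Break")
}
```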

These kinds of UX changes can be expensive if we decide to do them after the fact, but if they're something we decide is important from the jump, we can compensate for those costs by creating efficient and smart solutions early. Remember, the cost of any change in game development is directly proportional to how close that change is to shipping the game; the earlier the change is made, the cheaper it is. Furthermore, we make resource allocation choices based on our goals. If we want to make a game more accessible, we will figure out a way to do so that fits within our budget and provides a good player experience. Players don't really have a say in how we allocate our resources, and that kind of armchair-producer talk isn't particularly constructive anyway. Telling us what's important to you and why (including accessibility requests) is really the best kind of feedback we can hope for. Don't sweat coming up with solutions or fret about where we spend resources - that's our job.


Google is bringing loads of accessibility updates to its Android apps

A Pixel phone showing Android's accessibility settings screen.

Credit: Hadlee Simons / Android Authority
  • Google has announced a number of improvements to its accessibility apps on Android.
  • This includes a Find mode in the Lookout app, improved wheelchair accessibility info, and more.

Android already offers a healthy number of accessibility features, but you can never have too many options in this regard. So we’re glad to see that Google is adding more tweaks to its accessibility apps.

Google’s Live Caption feature could soon get customization options for emojis and more

Live Caption on a Pixel phone

Credit: Hadlee Simons / Android Authority

  • Google is working on several features for Live Caption, which is shipped as part of the Android System Intelligence app.
  • These upcoming features include options for emojis, emotional intensity, and word duration, as well as for showing these in the transcript.


Google introduced Live Caption back at Google I/O 2019. This underrated accessibility feature can generate captions for any speech coming out of your device. I've frequently used it in situations where I couldn't listen to my phone's audio out loud but still needed to figure out what was being said. Google is now working on improvements that would make Live Caption even more useful.

Android Authority contributor Assemble Debug spotted several upcoming customization features related to Live Caption in the Android System Intelligence app.

Live Captions Android System Intelligence

Some of the upcoming changes revolve around emojis: Live Caption will soon be able to display emoji icons and let you choose whether to include them in the transcript. Other toggles cover emotional intensity - showing it in speech and optionally emphasizing it in the transcription - as well as emotional tags, which can likewise be shown and included in the transcript.

Beyond this, there will be a toggle to show the word duration effect and to include it in the transcription.

As you can see, these changes are all in the interest of giving users more options and customizability for their captions and transcripts. These upcoming features were spotted in the oemfull variant of the Android System Intelligence app, which is available on several devices beyond just Google Pixels, so we're crossing our fingers that they roll out to plenty of phones.

Would you find these upcoming Live Caption features useful? Let us know in the comments below!

Lilbits: Android 15 Beta 2, Accessibility for Android and iOS, and more emulators in the App Store

On the second day of Google IO 2024, Google released a new beta of Android 15 and announced a bunch of new features coming to Android for mobile devices and TVs. But today’s news isn’t all about Google: Apple also unveiled upcoming accessibility features for iOS, iPadOS, and visionOS. And after a slightly rocky start, […]


GTA 6 looks stellar - and it could be a huge moment for disability representation

The greatest gaming event of 2023 was the GTA 6 reveal trailer. I am still thinking about it now. Rockstar has mastered the art of making the gaming world pause for a few minutes to experience something lavish and mind-blowing. Their games encourage grandiose thinking. I want this new version of Vice City's map to be the biggest in history, even though I'm currently exhausted by massive open-world games.

I'm attracted to an open-world sandbox for the chance to catch that fleeting feeling of immersion. GTA 6's trailer delivered the real essence of immersion for me. If you ask me, a crucial part of immersion is the game's power to remain in your consciousness even after you've put the controller down and turned off the TV, blurring the barrier between the fictitious and reality.

I spent hours in GTA 5 as the embodiment of Trevor Philips, Michael De Santa and Franklin Clinton. You know, just cruising around Los Santos listening to radio stations, earning money as a taxi driver, stumbling into hilarious random encounters or causing utter 5-star mayhem. This level of immersion pulled me through my screen into the world and narrative of GTA 5, like an early premonition of Alan Wake 2. I've been trying to grasp that fleeting sense of immersion ever since, only finding it in The Witcher 3, Cyberpunk 2077 and Red Dead Redemption 2. I'm sure you have your own touchstones for this!

