Autonomous transportation startup May Mobility is doing more than just talking about accessibility in self-driving tech development. The company recently began developing a wheelchair-accessible prototype version of its autonomous shuttle vehicle, and just concluded an initial round of gathering feedback from the community of people in Columbus, Ohio, who would actually be using the shuttle.
May Mobility’s design includes accommodations for entry and exit, as well as for securing the passenger’s wheelchair for the duration of the trip. The company learned from the first round of feedback that its design needs a longer ramp, to allow more gradual boarding and disembarking, along with better-placed pick-up and drop-off points.
It still plans to implement some of these improvements before deploying its vehicles, but we can expect to see accessible May Mobility shuttles in operation across its pilots.
Rohan Silva is obsessed with social mobility and why certain groups are so under-represented in the technology industry.
He co-founded Second Home, a coworking space looking to bring together disparate civic-minded, cultural, creative and commercial entrepreneurs at sites in Lisbon, London and (now) Los Angeles, and, as a now-reformed politician, he has spent years examining how gender, race and class impact access to technology. Throughout that work, though, one area he says he overlooked was accessibility and entrepreneurship focused on people with disabilities.
“At Second Home, we pride ourselves on having a diverse community. I can count on one hand the number of founders with disabilities we have in our community, so there is definitely something going profoundly wrong,” Silva says.
Enlisting the help of the European venture capital fund Atomico, Silva has set up a micro-investment fund of £100,000 to tackle the problem.
The latest feature for Comcast’s X1 remote software makes the clicker more accessible to people who can’t click it the same as everyone else. People with physical disabilities will now be able to change the channel and do all the usual TV stuff using only their eyes.
TVs and cable boxes routinely have horrendous interfaces, making the most tech-savvy among us recoil in horror. And if it’s hard for an able-bodied person to do, it may well be impossible for someone who suffers from a condition like ALS, or has missing limbs or other motor impairments.
Voice control helps, as do other changes to the traditional 500-button remote we all struggled with for decades, but gaze control is now beginning to be widely accessible as well, and may prove an even better option.
Comcast’s latest accessibility move — this is one area where the company seems to be
Although the meat of Apple’s accessibility news from WWDC has been covered, several other announcements have relevance to accessibility as well. Here, then, are some thoughts on Apple’s less-headlining announcements that I believe are most interesting from a disability point of view.
Accessibility goes above the fold
One of the tidbits I reported during the week was that Apple moved the Accessibility menu (on iOS 13 and iPadOS) to the top level of the Settings hierarchy. Instead of drilling down to Settings > General > Accessibility, the accessibility settings are now a “top level domain,” in the same list view as Notifications, Screen Time, and so on. Apple also told me this move applies to watchOS 6 as well.
Apple also confirmed to me that selecting accessibility features is now part of the first-run “setup buddy” process.
From dark mode in iOS 13 to a redesigned user interface in tvOS to the dismantling of iTunes to the coming of iPadOS, Apple made a slew of announcements at its Worldwide Developers Conference keynote on Monday in San Jose. And accessibility was there in full force.
Accessibility, as it always does, plays a significant role in not only the conference itself — the sessions, labs and get-togethers all are mainstays of the week — but also in the software Apple shows off. Of particular interest this year is Apple’s Voice Control feature, available for macOS Catalina and iOS 13 devices, which allows users to control their Macs and iPhones entirely with their voice.
Apple is known for fluid, intuitive user interfaces, but none of that matters if you can’t click, tap, or drag because you don’t have a finger to do so with. For users with disabilities the company is doubling down on voice-based accessibility with the powerful new Voice Control feature on Macs and iOS (and iPadOS) devices.
Many devices already support rich dictation, and of course Apple’s phones and computers have used voice-based commands for years (I remember talking to my Quadra). But this is a big step forward that makes voice controls close to universal — and it all works offline.
The basic idea of Voice Control is that the user has both set commands and context-specific ones. Set commands are things like “Open Garage Band” or “File menu” or “Tap send.” And of course some intelligence has gone into making sure you’re actually saying the command and
With last fall’s release of iOS 12, Apple introduced Siri Shortcuts — a new app that allows iPhone users to create their own voice commands to take actions on their phone and in apps. Today, Apple is celebrating Global Accessibility Awareness Day (GAAD) by rolling out a practical, accessibility-focused collection of new Siri Shortcuts, alongside accessibility-focused App Store features and collections.
Google is doing something similar for Android users on Google Play.
For starters, Apple’s new Siri shortcuts are available today in a featured collection at the top of the Shortcuts app. The collection includes a variety of shortcuts aimed at helping users more quickly perform everyday tasks.
For example, there’s a new “Help Message” shortcut that will send your location to an emergency contact, a “Meeting Someone New” shortcut designed to speed up non-verbal introductions and communication, a mood journal for recording thoughts and feelings, a pain report that
Microsoft has selected seven lucky startups to receive grants from its AI for Accessibility program. The growing companies aim to empower people with disabilities to take part in tech and the internet economy, from improving job searches to predicting seizures.
Each of the seven companies receives professional-level Azure AI resources and support, cash to cover the cost of data collection and handling, and access to Microsoft’s experts in AI, project management, and accessibility.
Companies apply online, and a team of accessibility and market experts at Microsoft evaluates them on their potential impact, data policies, feasibility, and so on. The five-year, $25 million program started in 2018, and evaluation is a rolling process, with grants going out multiple times per year. This one happens to be on Global Accessibility Awareness Day. So be aware!
Among this round’s grantees is Our Ability, a company started by John Robinson, who was born
A set of new features for Android could alleviate some of the difficulties of living with hearing impairment and other conditions. Live transcription, captioning, and relay use speech recognition and synthesis to make content on your phone more accessible — in real time.
Announced today at Google’s I/O event in a surprisingly long segment on accessibility, the features all rely on improved speech-to-text and text-to-speech algorithms, some of which now run on-device rather than sending audio to a datacenter to be decoded.
The first feature to be highlighted, live transcription, had already been mentioned by Google. It’s a simple but very useful tool: open the app and the device listens to its surroundings, displaying any speech it recognizes as text on the screen.
We’ve seen this in translator apps and devices, like the One Mini, and the meeting transcription highlighted yesterday at Microsoft Build. One would
For many people with speech impairments, AI-powered voice recognition technology simply doesn’t work. Google is trying to fix that.
Today at Google I/O, Google unveiled Project Euphonia, to explore how artificial intelligence can better recognize those with speech impairments and other types of speech patterns.
“We also want to help those with speech disorders or people whose speech has been affected by a stroke or ALS,” Google CEO Sundar Pichai said at I/O. “Researchers from Google AI are exploring the idea of personalized communication models that can better understand different types of speech, as well as how AI can help even those who cannot speak to communicate.”
Voice recognition technology doesn’t work today for people with speech impairments because no one has collected large enough data sets, Pichai said. That’s where Euphonia comes in.
In partnership with non-profits like the ALS Therapy Development Institute and the ALS Residence Initiative,
Microsoft has been leaning into accessibility in gaming lately, most visibly with its amazing Adaptive Controller, and a new patent suggests another way the company may be accommodating disabled gamers: an Xbox controller with a built-in Braille display.
As you might expect, it’s already quite hard for a visually impaired gamer to play some games, and although that difficulty can’t be entirely alleviated, there are definitely things worth doing. For instance: the text on screen that sighted people take for granted, documenting player status, items, dialogue or directions — how could it be read by a low-vision gamer who might otherwise be able to navigate the game world?
Braille is a crucial skill for children with visual impairments to learn, and with these LEGO Braille Bricks kids can learn through hands-on play rather than more rigid methods like Braille readers and printouts. Given the naturally Braille-like structure of LEGO blocks, it’s surprising this wasn’t done decades ago.
The truth is, however, that nothing can be obvious enough when it comes to marginalized populations like people with disabilities. But sometimes all it takes is someone in the right position to say “You know what? That’s a great idea and we’re just going to do it.”
It happened with the BecDot (above) and it seems to have happened at LEGO. Stine Storm led the project, but Morten Blonde, who himself lives with a degenerative vision condition, helped guide the team with the passion and insight that only comes
Like most companies, Google doesn’t like when employees leave. Especially employees who ran key parts of the company for years. Leaving means competition. Leaving means potential opportunities lost.
John [Hanke, CEO of Niantic] eventually sat down with Larry Page to figure out what it’d take to keep him within Google. They talked about John’s interest in augmented reality. They talked about a book called Freedom by Daniel Suarez, which centers around an out-of-control AI that taps a network of real-world operatives to control the world (the
Buttons or plates (like the one above) that automatically open doors can do a lot to make a building more accessible, but they aren’t always a perfect solution. For wheelchair users with limited upper body movement, the buttons can be tough to hit. Other times, the button is installed poorly — too high, too low, or just too far from the door to be useful, with the door closing too fast.
Portal Entryways is a startup trying to make these existing buttons more useful. They’ve built a device that piggybacks on top of existing access buttons, allowing these doors to be opened automatically (and, importantly, kept open) when a wheelchair user approaches.
Portal’s product has two components: a piece of Bluetooth Low Energy-enabled hardware that hooks into the existing door opening system, and a companion app running on the wheelchair user’s smartphone. The app searches for these Bluetooth Low Energy
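Portal hasn’t published its proximity logic, but the approach described above — a phone scanning for the door’s Bluetooth beacon and triggering the opener when the user gets close — can be sketched roughly as follows. The function name, state values and RSSI thresholds here are illustrative assumptions, not Portal’s actual implementation:

```python
# Hypothetical sketch of Portal-style proximity logic (not the company's code).
# A phone app scans for the door's BLE beacon; when the received signal
# strength (RSSI, in dBm) rises above a threshold, it asks the door to open,
# and keeps it held open until the user has clearly moved away.

OPEN_THRESHOLD_DBM = -60   # assumed: stronger than -60 dBm ~ within a few meters
CLOSE_THRESHOLD_DBM = -75  # assumed: weaker than -75 dBm ~ user has passed through

def next_door_state(current_state: str, rssi_dbm: int) -> str:
    """Hysteresis: open when the user is near, hold open until they're past."""
    if current_state == "closed" and rssi_dbm >= OPEN_THRESHOLD_DBM:
        return "open"
    if current_state == "open" and rssi_dbm < CLOSE_THRESHOLD_DBM:
        return "closed"
    return current_state

# A wheelchair user approaching, passing through, and leaving:
readings = [-90, -70, -58, -55, -62, -80]
state = "closed"
states = []
for rssi in readings:
    state = next_door_state(state, rssi)
    states.append(state)
print(states)  # ['closed', 'closed', 'open', 'open', 'open', 'closed']
```

The two-threshold hysteresis matters for the “kept open” behavior the company emphasizes: with a single threshold, normal RSSI jitter could cause the door to flap while the user is still in the doorway.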
Lyft, which has faced at least one lawsuit pertaining to its alleged discrimination against people with physical disabilities, announced today it has expanded its wheelchair-accessible vehicle (WAV) service in New York City. Details on the blog are very scarce (we’ve reached out to Lyft for more info) but Lyft now has more than 20 partners in New York City to help increase WAV access.
“With more accessible rides on the road, we’ll be better able to help New Yorkers with physical disabilities get around the city,” Lyft wrote in a blog post.
But it’s not clear how many more wheelchair-accessible vehicles are available now than before. Previously, Lyft had just a five percent success rate for finding wheelchair-accessible vehicles for riders, while Uber had a 55 percent success rate, according to a 2018 report from the New York Lawyers for the Public Interest. For both of these companies, they were able to find
A gaggle of new emoji have just been approved by the Unicode Consortium, meaning they’ll be standard across any platforms that choose to support them. This batch includes some much-needed representation for people with various disabilities, new animals from guide dogs to otters, food and many more objects.
Folks with disabilities get a nice variety of new emoji, though of course these aren’t exhaustive (for example, how do you represent a learning disability or mental illness?). Still, Apple’s proposal for the new emoji points out the necessity of, for example, having both mechanical and manual wheelchairs:
The type of assistive technology that is used by individuals is very personal and mandated by their own disability need. For someone who cannot self-propel and therefore uses an electric wheelchair, it would not be realistic to only show a manual chair. For those who can use a manual version, it would not
Google announced Monday that it’s launching a beta for a new Android feature called Live Transcribe, which can accurately create written captions from speech on the fly. It’s an accessibility-focused project made to help people with hearing loss communicate without making special arrangements or purchasing expensive…
Google this morning unveiled a pair of new Android features for people who are deaf or hard of hearing. As the company notes in a blog post this morning, the WHO estimates that 900 million people will be living with hearing loss by 2055. The ubiquity of mobile devices — Android in particular — offers a promising potential to help open the lines of communication.
Live Transcribe is, perhaps, the more compelling of the two offerings. As its name implies, the feature transcribes audio in real time so users with hearing loss can read the text, enabling a live, two-way conversation. It defaults to white text on a black background for easier reading, and it can also connect to external microphones for better results.
The feature leverages much of the company’s work in speech to text and translation. It starts rolling out today in limited beta for Pixel
Once upon a time, people had to wait for the Super Bowl to watch the ads. Those dark days are over. Now you can have companies sell you products on-demand, any time, day or night. Amazon has already debuted its latest Alexa ad, and now Microsoft’s getting in on the action — and this one’s a bit of a tear-jerker.
The software giant’s Super Bowl spot highlights some of the work it’s done to increase the accessibility of its products. Front and center is the Xbox Adaptive Controller, a $100 add-on that makes the console more accessible to gamers with a range of different needs. The spot features a number of different children (and their parents) who are better able to enjoy gaming using the device.
The Adaptive Controller was created with input from a number of different groups, including The AbleGamers Charity, The Cerebral Palsy Foundation, SpecialEffect,