What’s New in Android Accessibility (Google I/O '19)

Good afternoon! Is everybody ready for the best accessibility I/O ever? Right on. My name is Brian Kemler, and I'm a PM on Android accessibility. Our goal is to ensure that everyone can use our products, no matter what their ability. Specifically, we design and build Accessibility Suite, Sound Amplifier, Live Transcribe, and now Live Caption. We've invested heavily in the last year to make audio on Android much more accessible, so we'll be focusing a lot on that today.

Here with me today are Melissa, Lauren, and Kelly, our passionate team of researchers, program managers, and engineers. Today I'm going to share my perspective on accessibility's mission and why it matters, and I'll discuss new products for the deaf and hard of hearing. Melissa will show you how we use structured research to inform our most important product-making decisions. Lastly, Kelly and Lauren are going to talk about new products for users with physical disabilities, reduced dexterity, low vision, and blindness.

So now let's talk about our mission. Accessibility is integral to Google, so integral that it's in the mission statement itself: to make the world's information universally accessible and useful. Let's unpack that a little bit. "Universally accessible" means everyone can use Android. We don't expect users to adapt to our products; rather, we adapt our products to our users, because we think of accessibility as a human right. If there are barriers for anyone, we have more work to do. It's with this mantra that we design, build, and apply Google magic to make both our products and the real world around us more accessible and more useful, for more users in more situations. "The world's information" means it's not just about making what's on the device accessible; it's about making the planet more accessible. To do this, we apply advances in machine learning and artificial intelligence (what I call the Google magic) to meet the needs of underserved communities. Last February we launched Live Transcribe, an accessibility-first experience enabling anyone to have a conversation, even if they
can't hear. You can think of Live Transcribe as captioning for the real world. Sound Amplifier and Lookout are also great examples of building accessibility-first experiences, and you can think of this overall approach as a practical, useful, and potentially transformative application of augmented reality.

So why is this important? Why do we care? Many of us, in fact including myself at one point in time, believed that disability was something that only affected a small set of the population. Even in the strictest sense, that set is massive: it's a billion people, or 15 percent of the world's population. But let's start to think about disability a little more holistically. If we include people who don't self-identify as having a disability, that number, one billion, doubles. If we think of the friends, family, and loved ones of folks who have disabilities, then that number grows again. And thinking even more expansively, if we take disability to include people with temporary conditions, like a broken arm or a broken leg, or situational ones, like driving and texting (don't do it), then our circle expands to include nearly the entire planet. So it's really everyone's work to make not just our products but the planet more accessible, and when we do so, we're affirming that accessibility is a human right.

Apart from doing the right thing, I'm going to talk a little today about the very positive karmic consequences of designing for accessibility first, so much so that a term was coined for it: the curb cut effect. You see, back in the day there were no curb cuts, just as at one point in time there were no closed captions. That meant people who use wheelchairs effectively encountered a wall every time they came to a curb. In 1945, a man in Kalamazoo, Michigan named Jack Fisher, who was disabled, advocated and fought for the installation of the country's first curb cut, and curb cuts caught on globally. Sidewalks, and more importantly cities, became independently navigable for the very first time, and this had positive consequences for literally everyone: for parents with kids in strollers, for people toting luggage, for workers with dollies or hand trucks. Today we can't even imagine a world without them, because we're much safer and it's much easier to get around. This is what was dubbed the curb cut effect.

Another realization of the curb cut effect is closed captions. Personally, I love captions. They're great for the World Series or the World Cup at a loud bar, and they let me listen in silence when I'm on the train, at the library, or when I don't want to bother my coworkers. As awesome as they are for me, they're game-changing for the 466 million people on the planet who have hearing loss. So about a year ago we prototyped and built an accessibility-first application called Live Transcribe, and we launched it in February. I'm going to switch over and do a little demo.

[Demo] This is nothing special, just a basic Android smartphone. As you can tell, I'm speaking naturally and quickly, and it's picking everything up in real time. It works in over 70 languages, and it's a great tool, a great bridge, to carry around in your pocket if you need to transcribe a conversation. It also captures very subtle nuances: I usually say "I bought a new jersey in New Jersey," and it usually gets the context right, but the demo gods are not forgiving today. It also has a great feature we call type back: if you prefer not to speak, or in case you can't speak, you can type a response to the person speaking right there in the interface. So it's really more of a two-way conversation tool than it is strictly captions. Awesome. My demo didn't work.
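Under the hood, streaming captioning interfaces like this one typically display an unstable partial hypothesis that keeps being revised until the recognizer commits a final result, which is why Live Transcribe can correct earlier words once context arrives. Here is a minimal conceptual sketch of that display model (my own simplification for illustration, not Live Transcribe's actual code):

```python
class CaptionBuffer:
    """Toy model of a streaming captioner: partial hypotheses are
    revised in place; final results are committed and never change.
    (Conceptual sketch only, not Live Transcribe's implementation.)"""

    def __init__(self):
        self.committed = []  # finalized segments, never revised
        self.partial = ""    # current unstable hypothesis

    def on_result(self, text, is_final):
        if is_final:
            self.committed.append(text)
            self.partial = ""
        else:
            self.partial = text  # replace the previous guess entirely

    def display(self):
        parts = self.committed + ([self.partial] if self.partial else [])
        return " ".join(parts)

buf = CaptionBuffer()
buf.on_result("i bought a new jersey", is_final=False)          # tentative
buf.on_result("I bought a new jersey in New Jersey.", is_final=True)
print(buf.display())  # I bought a new jersey in New Jersey.
```

The key property is that already-committed text stays stable on screen while only the tail keeps changing, which is what makes scrolling captions readable.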
Back to the slides, please. Now that we literally have captions in the real world, what's the next frontier for captions? I'm super glad you asked. Let's turn to the growth of online video and build empathy for people who are missing out on this explosion of content. Every minute, 400 hours of video are being uploaded to YouTube. We saw that this was a problem, and in 2009 we added automated captions to the content on YouTube, which was an awesome first start. But what about personal videos, social media, and podcasts, the stuff we love to use each and every day? They're typically uncaptioned, meaning that for the deaf and hard of hearing, it's effectively a silent movie. So, led by the Creative Lab in New York, who are here today, we started exploring a simple question: could we automatically caption any media playing on your phone? Together with a whole bunch of teams from Mountain View, New York, and Tel Aviv, across speech research, the Creative Lab, audio framework, and accessibility, we banded together and spent a year trying to solve this problem. I'm going to let the result, Live Caption, speak for itself and switch over to the demo.

[Demo] I'm using Google Photos, and this is nothing special: it's a video of my cat and my fiancé, with a little message they've recorded for me. "Hi honey, hope you're having a great time at I/O. We miss you so much. Tico misses you." As you can see, I can caption anything on the device, and if the caption goes away, I can bring it back. That's Live Caption. It works with podcasts, with videos that you record, with voicemail, with video in instant messaging apps, and so forth. We're super excited; you may have seen that we mentioned it in the keynote today. What you see there is literally just the tip of the iceberg. We use recurrent neural networks that we've shrunk down from the monster models that used to run in the cloud, and we run them locally on the device, which gives us some really cool advantages. One, it lets the product work offline: I can be in a totally offline mode, I can be on a plane, and it's going to work. Two, audio never goes up to the cloud; it stays local on the device. And best of all, developers and content providers never need to lift a finger: if there's audio, there are captions. So with Live Caption, every app, and billions of videos and podcasts from any source, is one step closer to being universally accessible on Android. Now over to Melissa to talk about how we use research to inform our most important product-making decisions.

Thank you, Brian. Hi everyone, my name is Melissa Barnhart, and I'm a researcher on Android. Today I'd like to talk to you about the user experience, and when you think about the user experience, I'd like you to think about empathy. Empathy is at the heart of design: without an understanding of what others see, feel, and experience, design is a pointless task. To be empathetic technologists, we have to set aside our own assumptions and consider other perspectives. A great way to build empathy is through user research, which focuses on understanding behaviors and needs through a variety of methodologies: interviews, observations, or surveys, just to name a few. Like a compass, user research can help point you and your product in the right direction, and you can use it at any stage of your design and development process. Before you begin building, user research can uncover unmet needs and inspire designs that actually meet those needs. The number one reason for startup failure is lack of market need for a product, but you can minimize this risk by conducting research before development even begins. When development is underway, user research can help you evaluate different design solutions and ensure that your user experience stays on track. And when your product is out in the market, you can use research to measure the impact of your designs and to show the ROI.

Today I'd like to share an example of how we've used user research to improve Live Transcribe, a new Android application that provides real-time transcription for the deaf and hard of hearing. A team of research scientists at Google identified a new opportunity for automatic speech recognition and built the first prototype of Live Transcribe. So we had the technology, but what about the user experience? We had a lot of questions about Live Transcribe that we hoped to address through research with deaf and hard-of-hearing participants. For example: what are their first impressions of Live Transcribe? How would they expect scrolling to work in both landscape and portrait orientations: should it scroll from top to bottom, should it fade away, or should it resemble a physical book? But our most pressing question was about the visual design: what typeface, text size, text color, and background color should we default to? We assumed that users would want to see differently colored confidence levels to indicate the accuracy of the transcription, but we didn't know for sure. Good research starts with good research questions like these.

To answer our research questions, we took a mixed-methods approach. We started with a usability study in a Google lab to evaluate our minimum viable product with people who had moderate to profound hearing loss. But we also wanted to know how people would use Live Transcribe over time: how would it work in different environments, like the home or the classroom, and would people's needs change with continued use? To answer these questions, we partnered with Gallaudet University, the world's premier university for the deaf and hard of hearing, and that collaboration allowed us to collect longitudinal feedback from faculty and students who used Live Transcribe in everyday life. After iterating on our designs based on their feedback and adding new functionality, we conducted a second lab-based usability study, again with participants who had moderate to profound hearing loss.

Let's dig into those lab studies a little. Our protocols started with questions like: "Tell me about a time when you used real-time captioning. What did you like about the experience, and what did you dislike?" These background questions helped us understand previous experiences as well as current needs. Then we enabled the prototype, handed it to the participant, and for the next 60 minutes we initiated conversation about a variety of topics and observed how the participant interacted with the phone. We also evaluated different text flows and probed for feedback on the visual design. In our prototype, white italicized text signified unfinalized text, whereas yellow non-italic text represented finalized text. Did participants understand that color and typeface indicated confidence in the transcription, and was that information even valuable? It was in this research that we learned that changing the typeface and text color actually distracted participants from the conversation. Additionally, our participants self-reported that italics made reading more difficult for them, and these insights were reinforced by previous research in this space, which found that a transcript is easiest to read when it isn't layered with extra signals. We also learned that our original scrolling text flow felt natural: unlike the other flows we showed participants, it didn't force them to shift their eyes to another part of the screen. Because of these insights, we removed the colorful confidence levels, used a regular typeface instead of an italic one, and proceeded with the scrolling text flow. And this is just the beginning; we'll continue to make improvements as we conduct further research. What I really hope to show today is the power of empathy: by putting aside our own assumptions about the visual design and conducting research with our target audience, we changed the product design to be more legible and ultimately more accessible. Next up is Lauren, here to talk to us about dexterity.

Thanks, Melissa. Hi everyone, my name is Lauren, and I'm a software engineer on central accessibility. Let's talk a little about dexterity: the challenges that users with dexterity impairments face, and the services on Android that help with those challenges. According to the US Census, more than 50 million people, or more than 19 percent of the US population, have some form of mobility impairment. More specifically, in the US, more than 2 million of these people have an impairment that affects how understandable their speech is, over 15 million people have limited dexterity in their hands, and nearly 20 million people have difficulty lifting and grasping; the majority of this population is over 65 years old. Those with severe dexterity challenges include people with conditions such as ALS, cerebral palsy, Parkinson's, or spinal cord injuries. Some conditions result in involuntary movements such as spasms and tremors, and people may also have difficulty with speech and with having their speech understood. Many conditions affecting dexterity are progressive: as loss of dexterity happens, fine motor skills may regress, and people might have difficulty picking things up or maintaining a hold on items. Grasping or interacting with a phone can be particularly challenging for this population. There are varying levels of severity of dexterity impairment, and we offer various apps and services on Android that can help users based on their individual needs.

We introduced the Accessibility Menu last year, designed to make common interactions easier for people with dexterity challenges. The Accessibility Menu simplifies common actions such as powering off, locking the screen, and accessing notifications and recent apps. When enabled, it can be opened via a button at the bottom of the screen, which helps those with limited reach. Now we're adding new controls for volume and brightness, so the user can continuously adjust both without leaving the menu, and we've added an option in settings to make the menu buttons larger for those with more limited hand dexterity. Users can also control their phone by voice with Voice Access, which uses voice commands to perform gestures on elements on the screen without needing to physically touch the phone. Voice Access launched in September of last year, and you can download it now at g.co/voiceaccess.
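Voice Access works by numbering the actionable elements on screen and resolving a spoken phrase, either a number ("click 7") or an element's name ("open Keep"), to an action on that element. Here is a toy sketch of that resolution idea (a hypothetical simplification for illustration, not the app's real logic):

```python
def build_overlay(actionable_elements):
    """Number actionable elements 1..n in traversal order, like the
    numbered overlay Voice Access draws on screen. (Toy model only.)"""
    return {i + 1: label for i, label in enumerate(actionable_elements)}

def resolve_command(command, overlay):
    """Map a spoken phrase to an element: by number ("click 7") or by
    name ("open keep"). Returns the element's label, or None."""
    words = command.lower().split()
    if len(words) >= 2 and words[0] in ("click", "tap") and words[-1].isdigit():
        return overlay.get(int(words[-1]))
    # Fall back to matching an element by its name.
    for label in overlay.values():
        if label.lower() in command.lower():
            return label
    return None

overlay = build_overlay(["Keep", "Chrome", "Maps"])
print(resolve_command("click 2", overlay))    # Chrome
print(resolve_command("open keep", overlay))  # Keep
```

The real service also handles global commands ("go back"), dictation, and grid selection; the point here is just the two-way mapping between numbers, names, and on-screen targets.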
Voice Access places an overlay of numbers over the actionable elements on the screen; simply speak commands such as "click 7" or "go back" to interact with your phone. When Voice Access is enabled, you can continue using your usual Google Assistant commands, such as setting a timer or checking the weather, and a feedback bar at the top of the screen always shows your spoken phrases. You can refer to elements by their numbers, such as "tap 7," or by name: if you wanted to open Google Keep, you could say "open Keep." We're also happy to announce that we've brought support to French, Italian, German, and Spanish.

Voice Access is great for those with limited hand dexterity and understandable speech, but many people may not be able to speak clearly. For those Android users we have Switch Access, which lets users with limited hand dexterity and limited speech navigate their phone using external buttons, or switches. A typical switch might have two large buttons, and a user can interact with it with their hands or, if they have more control elsewhere, with another part of their body, such as their feet. In a common two-switch case, one switch can be likened to a Tab key that moves the highlight on the screen when pressed, while the second switch acts like an Enter key that selects the highlighted item. Switches can be configured to meet user needs, and Switch Access can be used with as little as one switch; most input devices work with it, including a regular USB keyboard.

Text editing is commonly a pain point for switch users, so we recently added text editing menus. These menus add shortcuts for previously time-consuming text editing tasks, such as highlighting and deleting text, undoing or redoing actions, and copy, cut, and paste. This allows text editing to be done much more quickly, with the use of as little as one switch. I'll show a quick demo.

[Demo] In this demo I'm using switches assigned to the volume keys; assigning switches to the volume keys is an easy way for developers to test Switch Access on their own apps without needing an external switch. Using the volume keys, I step through the elements on the screen until I reach the text field I wish to edit. Selecting the text field brings up our text editing menus. I want to move a word around, so I go to Highlight, choose the granularity I wish to highlight, and pick "highlight previous word," which highlights the word "peaches." Then I cut the word with the menu. Now I want to paste it somewhere else, so I first move the cursor: I go to Previous, then "previous line," to reach the beginning of the text field, and paste the word there. But what if I made a mistake? We've added undo and redo actions to these menus; text editing mistakes can be very costly for switch users, so these actions let them get back to editing much more quickly. Now back to the slides.

Switching gears, let's talk about what's new for vision, starting with a little background on this population. According to the World Health Organization, an estimated 253 million people globally live with moderate to severe visual impairment, and people over 50 are most at risk; 36 million of these people are blind. We're here to help people with vision loss access and enjoy not just their phones but the world around them. Lookout is an Android app that helps make the real world accessible for people who are blind or have low vision by helping them complete tasks and routines more independently. We launched Lookout in March of this year. Now I'll show a quick video that speaks a little more to what Lookout does and how our users use it.

[Video] "I definitely need someone's help to pick out matching socks; a lot of times my daughter picks them out." Patrick Clary, product manager on Google accessibility: "The goal is to be less disruptive to the routine that they're engaged in. For anyone with any type of impairment or disability, there's a striving for independence, and I think this is actually just human nature. When you're relying on other people to drive you around or help you accomplish certain things, it can be frustrating. With Lookout, we had been talking to many different users with vision impairments, and we knew there was a need to provide them with more independence in daily routines. If you're using Lookout for the first time, basically you open it up and you can just start pointing it at things, and it will start telling you about the space around you. It'll tell you about objects, and it can also detect things like barcodes." "It's all happening today. I dreamed of phones talking: they are talking. I dreamed of the ability to navigate: we can do that today." "Artificial intelligence is going to play a major role in the lives of people with disabilities, and in the case of Lookout, it's going to provide these users with an understanding of the space surrounding them."

As you saw in that video, Lookout is able to detect and recognize a variety of objects, including currency and text, through computer vision and machine learning. Lookout uses several image classifiers that run continuously on the device, which allows it to process video and provide results to users in near real time; since information is processed on device, Lookout can even work in airplane mode. Results are scored based on a variety of signals, including user context, and our goal is to provide the most relevant results to users in a timely manner. Lookout contains three modes, Explore, Shopping, and Quick read, that let users further contextualize their environment and choose what type of information they want out of it. Explore is the most general mode and can be used for a variety of daily routines, such as cleaning the house, cooking dinner, or visiting a new space for the first time. Shopping includes on-device barcode lookup as well as currency detection. Quick read focuses on text and can be used for activities such as going through the mail or reading labels on items in the kitchen.

Here's a quote from one of our users: "I used Lookout once in preparation for making spaghetti. It very accurately described each item without me having to maneuver the device or struggle to get the items in focus." Finding the correct ingredients when cooking is something many sighted users hardly think about, and this shows how Lookout can really change lives by helping people who are blind or have low vision be more independent. You can find Lookout on Google Play at g.co/lookoutapp. It's available in the US, and it initially launched on Pixel; we're happy to announce that Lookout is expanding to Samsung S8 through S10 and LG G7 and G8 devices, and support for these new devices is rolling out now. We want to get Lookout into the hands of as many users as possible, so we anticipate releasing it on more devices throughout the year. Now I'll hand it off to Kelly to talk about what else is new in vision. [Applause]

Hello everyone, my name is Kelly Chang, and I'm a program manager on Google accessibility, based in Taiwan. Let's talk about how to make a phone accessible for blind and low-vision users. TalkBack is the Android screen reader: it provides spoken feedback so that people who are blind or have difficulty seeing can use their phone without looking at the screen. TalkBack reads the on-screen content aloud and lets users interact with their devices through gestures.

Before we jump into our new feature, I want you to think about something: what would you do to find movie details, like what time a movie starts at the cinema? You can get that easily on your phone, and it's simple for you. Now imagine how difficult this would be if you couldn't see the screen. Not so easy, right? Today we're going to give everyone a better journey in the screen reader, to help you navigate and find your content quickly. People who cannot see well need to swipe one element at a time to have the screen reader announce it, and if they want to find something specific, they may need to do a lot of gestures. That's why we enhanced the screen search feature: we provide a user interface to perform a search, and it can be triggered not only by keyboard shortcuts but also by touch gestures, and it accepts voice input. What's more, it works in the languages supported by Android, and it's always available. You can use it in Accessibility Suite 7.3 now.

[Demo] Say you were curious when Lauren talked about Switch Access and you want to learn more. First, let's see how this works in TalkBack without search. With TalkBack on in Chrome, just listen to how many swipes I need to reach the Switch Access help link: it takes ten or twenty swipes, and you can imagine that if the link were at the bottom, we might need to swipe more than thirty or forty times. Now, with the search feature, it becomes super easy. I start a search with a TalkBack gesture, which brings me to a search page, and type the search term. You may wonder how people with limited sight type: it's the same behavior; you touch the keyboard, listen until you hear the letter you want, then lift your finger, and the rest just works. So I type "SW," the matches are shown, I can check which one I'm looking for ("Switch Access help" is exactly what I want), double-tap, and I'm there. It's that simple. The purpose is to get users where they're going faster and make people more productive and efficient.

Next, dark theme. This is one of my favorite features, and you know why? Because my eyes hurt when I use my phone in low-light conditions, and I believe a lot of you have had that experience. Dark theme makes it easier to read and browse your phone, and it's also helpful for people who are sensitive to light or who have low vision. It's a system-wide dark theme; you can find the option under Settings, under both Accessibility and Display. It's coming in Android Q, and it also saves battery life.

Now let's talk about not just vision but all kinds of disabilities. Say you want to change your volume: you press the volume button, and on the right side of the screen a control panel appears, but it shows for only a few seconds. You can imagine that people with physical impairments or limited sight may not get enough time to use it. Accessibility timeouts, a new feature, give users a little extra time to interact with their devices, and users can change the duration in the accessibility settings. To support it, all an app developer needs to do is query the AccessibilityManager for the recommended timeout and honor the value it returns. This is also coming in Android Q. We'll have deep-dive sessions today and tomorrow, and you're welcome to join us. Now let's hand it over to Brian for a summary.

Thank you, Kelly; thank you, Melissa; thank you, Lauren; thank you, Creative Lab. Today we talked about our mission of making our products and the real world more universally accessible. We dove into the curb cut effect to drive home how important, and how wonderfully far-reaching, the unintended consequences of building for accessibility can be. I spoke about how, with an Android smartphone, you are now able to caption both the real world and almost all of the content on the device; no other operating system in the world can make that claim. Melissa did a wonderful job describing and demonstrating the before and after of how we use research to inform our decisions, upend our own biases, and make informed decisions based on data. Lauren demonstrated Voice Access and Switch Access, that innovative way to use a two-button keyboard; I think it is so cool that you can edit text with two buttons. Kelly talked about TalkBack, dark theme, and all of these wonderful new features we have for vision.

I'm not going to keep you any longer, but I want to let you know about a couple of other talks, including one today on accessible audio on Android at 5 PM on stage seven; if you want to dive into the audio topic, we'll be covering it in depth and doing a hearing loss simulation, and it's going to be really cool. Tomorrow we're going to have a talk on demystifying accessibility development on Android, targeted toward developers. And lastly, we have two accessibility-related experiment tents: one, the experiments sandbox, and the other, the accessibility sandbox. Thank you so much for coming today, thank you for your interest in accessibility, and we hope you have a blast at I/O! [Music]
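One developer-facing takeaway from Kelly's section: on Android Q, apps that show transient controls can ask the AccessibilityManager for a recommended timeout (via getRecommendedTimeoutMillis) rather than hard-coding a duration. The underlying rule is simple to model; the function below is only an illustration of the behavior, not the platform code:

```python
def recommended_timeout_millis(original_timeout_ms, user_minimum_ms):
    """Model of the accessibility-timeout rule: transient UI stays up
    at least as long as the user's configured minimum time to take
    action (0 means no preference set). On a real device, an app calls
    AccessibilityManager#getRecommendedTimeoutMillis() instead of
    computing this itself."""
    return max(original_timeout_ms, user_minimum_ms)

# A volume panel that normally hides after 3 seconds stays for 10 seconds
# when the user has configured a 10-second accessibility timeout.
print(recommended_timeout_millis(3000, 10000))  # 10000
print(recommended_timeout_millis(3000, 0))      # 3000
```

The user's setting only ever lengthens the timeout; an app's own, longer duration is never shortened.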
