What's new in Android (Google I/O '18)

[Music] Hey hey, everybody, welcome back to What's New in Android. I'm Dan Sandler from the System UI team. I'm Chet Haase from the Android Toolkit team. And I'm Romain Guy from... I still don't have a name for my team, so, from the Android team. You may remember us from other talks such as What's New in Android 2017, What's New in Android 2016, What's New in Android 2015, and What's New in Android 2014. Not What's New in Android 2013; we don't talk about that. That was the time we had Jelly Bean two years in a row, which was brilliant: we didn't have to redo the logo or anything. But now: What's New in Android, what I like to call "the Android keynote." Nobody else calls it that, but I do, because this is where we talk to you about all of the developer stuff going on in the Android platform. In particular, let's talk about Android P. Specifically, let's talk today about... hang on... Android APIs. [Applause]

First, distribution. You saw the dynamic app bundles introduced in the keynote; the demo made it pretty clear, and it's pretty easy for you: all you have to do is click a different menu item when you build your application, and we're going to save you some space. Your app is going to be smaller, and faster and easier for your users to download. I'm sure you have a lot of questions about it, so we have a couple of talks this afternoon, at 5 p.m. and 6 p.m.; go there if you want answers, because we don't have them.

You're going to see a slide a whole lot like this through the rest of the talk. I feel like the main function we serve here is to tell you which other talks to go to. We're like the appendix; we're like the index for the rest of the content. Let's be more like... obsolete? Vestigial, is that what it is? I don't like to think about that. Let's be clear: a lot of people back at work have done all the hard work, and we just get to go on stage and talk about it.

Speaking of that, let's talk about Android Jetpack. We heard Steph talk about this in the developer keynote. It's a set of components, as well as guidance, for building better Android applications. All of you are familiar with most of what's in Jetpack already; what we're doing is adding to it over time with things that make it even better, and improving what's there. One of the major steps we're taking is what I like to think of as a refactor, because it is a refactor. My favorite thing about the support library is how the package names embed the release number. For example, we support things like v4. Actually, we don't support v4 anymore; we have a minSdk of at least 14 now, but it's still in the package name. Isn't that a little bit silly? So we're doing away with that. We're doing a whole lot of tedious renaming, and we're also providing tools in Android Studio to make it easier for you to do the similar refactoring you'll need in your own applications. Everything is being renamed to something more appropriate, called AndroidX. If you want the details of the renaming, as well as the more modular, fine-grained splits that make sure you're not dragging in too much, go to the talk "What's New in Android Support Library." There was also an article posted on the Android Developers blog about a half hour ago; check there for more details.
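In practice, the AndroidX rename mostly shows up in your imports. A sketch of the before-and-after (these two mappings are from the stable artifacts; the full mapping table is in the blog post and migration tooling):

```kotlin
// Before: support library packages with the version number baked in
import android.support.v4.app.Fragment
import android.support.v7.widget.RecyclerView

// After the AndroidX refactor: stable, version-free package names
import androidx.fragment.app.Fragment
import androidx.recyclerview.widget.RecyclerView
```

Android Studio's refactoring tool is meant to do this renaming across a whole project so you don't have to edit imports by hand.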
Android Test is part of this new Jetpack thing. Android Test is the ATSL and Espresso stuff that, hopefully, you're already using: really good ways to test your application. It now provides first-class Kotlin support, as well as more elegant APIs that reduce a lot of the boilerplate. Here's a simple example: we used to have a way of asserting in which neither the parameters you were passing, nor their order, was obvious, and then it would give you an error message that didn't really help very much either. Now we have something more sensible: you can assert that it's actually working on the "visible" property, and the error message gives you something you can work with. Go to the "Frictionless Android Testing" talk for more about all of that.

Jetpack Architecture is about the Architecture Components that were announced last year at I/O, iterated on with feedback from the community, and finally went 1.0 in the fall. The released parts include all the Lifecycle stuff, ViewModel, Room (the persistence library), and LiveData, so hopefully you're using those already, at least in your new applications. More recently we released the paging library, for asynchronously paging data into RecyclerView. That went alpha, then beta, because that's how those things work, and it's 1.0 this week, so please start using it. We also talked in the developer keynote about a couple of new things you should check out soon. WorkManager, currently in preview, is about job scheduling, but job scheduling where we handle all the cases on previous releases, instead of you having to use specific approaches depending on which version and device you were running on. And Navigation: it turns out that "up" versus "back" is a hard problem for applications to solve. We're making that much easier, and we're integrating with the tools to make it easier yet. So go to all of these talks: there's an overview talk as well as specific talks on the navigation controller and WorkManager, and a talk on RecyclerView and paging.

Me again. (It says your name on this slide.) Keep building suspense into this thing: what's going to happen next? Who's he gonna hand the clicker to? Still mine. Still you. Okay, let's talk about battery. This is one of the ongoing efforts in Android to help users, because it turns out battery is really important. We're all power users; unfortunately, we just keep using the power. So what can we do about it? We've created app standby buckets: we monitor how actively the user is using each application, and then make determinations about how much access that application gets to ongoing things in the system that take up battery. We also have background restrictions that the user can enable in Settings. If an application is behaving badly, say, holding wakelocks for long periods of time, waking up constantly, or accessing services way more than it should when it's off the charger, we'll note that and expose it in Settings, and the user can take action on it if they deem that necessary. Go to the battery session on Thursday morning for the details.
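Apps can also ask which bucket they're currently in. A sketch using the UsageStatsManager API added in P (the function name is mine; the constants are the platform's):

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context

fun checkStandbyBucket(context: Context) {
    val usm = context.getSystemService(UsageStatsManager::class.java)
    // Returns one of the STANDBY_BUCKET_* constants on Android P and later
    when (usm.appStandbyBucket) {
        UsageStatsManager.STANDBY_BUCKET_ACTIVE -> { /* in active use right now */ }
        UsageStatsManager.STANDBY_BUCKET_WORKING_SET -> { /* used regularly */ }
        UsageStatsManager.STANDBY_BUCKET_FREQUENT -> { /* used often, but not daily */ }
        UsageStatsManager.STANDBY_BUCKET_RARE -> { /* rarely used; jobs and alarms are heavily deferred */ }
    }
}
```

The lower the bucket, the more the system defers your jobs, alarms, and network access, so this is a reasonable signal for how aggressively to schedule background work.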
One of the things we've been focusing on with Android P is privacy; maybe that's what the P stands for. When your app is in the background, it no longer has access to the microphone, it no longer has access to the camera, and it sort of no longer has access to the sensors: you won't receive sensor data automatically, although you can manually poll the sensors and you'll get batched updates. The best thing to do if you need sensor data is to keep a foreground service running instead. So: no more microphone, no more camera for you. I've heard that in the past some apps were trying to stay alive in memory by playing white noise or keeping the microphone on. Don't do that anymore. It's not okay.

Kotlin: it's this little thing we announced last year. We've been busy; we want to make it better for all the Kotlin developers out there, and I'm sure there are a lot of you here today. Some of the things we've been doing: the ART team has been really busy with D8, R8, and ART itself. They looked at the bytecode generated by the Kotlin compiler, identified some bytecode patterns that were different from the ones generated by the Java programming language compiler, and have been optimizing for those patterns. We've also been adding a lot of nullability annotations to our Java APIs, both in the core libraries (libcore) and in our support libraries, to make the platform APIs easier to use when you're in Kotlin. And finally, we launched a new library on GitHub called Android KTX. It's a set of Kotlin extensions for existing platform APIs, and the goal is to take advantage of Kotlin language features to make existing APIs easier to use. They're already easier to use just by using Kotlin, but with the extensions it gets even better. I want to thank the community, because we've received dozens of pull requests, and also bugs and feature requests, from you, and we've accepted a bunch of them. So if you have ideas, things you'd like to see in Android KTX, please go to GitHub and we'll take a look at your PR.

This is an example of the kind of code you can write with KTX. If you want to create a bitmap, you don't have to specify that it's ARGB_8888 anymore. You can call applyCanvas, which automatically creates the Canvas for you. And at the bottom you can see destructuring declarations for a color Int, so you don't have to do any shifting or masking of the int into bytes; we take care of that for you. There's a talk by Jake Wharton on Thursday at 10:30 a.m.
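The slide's code isn't in the transcript, but the example described above looks roughly like this with the androidx.core KTX extensions (a sketch; the literal values are mine):

```kotlin
import android.graphics.Color
import androidx.core.graphics.applyCanvas
import androidx.core.graphics.createBitmap
import androidx.core.graphics.component1
import androidx.core.graphics.component2
import androidx.core.graphics.component3
import androidx.core.graphics.component4

// createBitmap defaults to ARGB_8888; applyCanvas creates the Canvas for you
val bitmap = createBitmap(300, 300).applyCanvas {
    drawColor(Color.CYAN)
}

// Destructure a packed color Int -- no shifting or masking required
val (alpha, red, green, blue) = Color.CYAN
```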


He's going to go through most of the extensions: the philosophy behind them, how we write them, what kinds of extensions we want to see in that library, and what we're not looking for. So before you do all the work and send a PR, go watch that talk to understand what we're looking for.

We already talked about the Android Test stuff that's part of Jetpack; that's probably a better, more holistic way to test your application. But if there's a specific situation in which you find it necessary or helpful to mock the framework (and I don't mean ridicule it, because that wouldn't be nice), it's now possible in easier ways. With Mockito, we're not changing the framework; we're actually integrating changes into Mockito itself. You can now mock final methods, and soon you should be able to mock static methods. Chet is making that face because he doesn't understand why that's so interesting to you. And system-created objects like Activity: we're working on that internally, but it should be on the way eventually.

Background text measurement. This is part of a bunch of smaller changes we've made in the text area. It turns out that measurement is really expensive. Most applications display text, and we'd bet that the text operations in your application are some of the most expensive things happening on the UI thread, which can contribute to jank. Wouldn't it be nicer if you could offload that to a background thread, so that by the time you actually need to render the text, or perform those operations on the UI thread, most of the hard work was already done for you? The observation is that 80 to 90 percent of the operations needed to display text happen in text measurement, and we've made it much easier to perform that as a background operation. We have a class called PrecomputedText: you can ask it to pre-measure your text.
Then you can set that text on the TextView later, whenever you want. You do the measurement on a background thread, and set the result on the TextView when you actually need it. It should be much faster.

Magnifier is something you might have seen in the preview releases: select some text, and it pops up this little... bar... above it, which makes it easier to manipulate the cursor. That's really great for text, but the other cool thing is that it's also available to your applications for any other use case. There's an API that lets you pop up this magnifier over whatever happens to be in your view, show it, and dismiss it, and use it for your own stuff: core functionality that we want in the system UI, but also a useful API for developers to use for their own purposes.

I don't know if you've worked with your design department and had them specify something about text in your UI: "I want this aligned this many dips from the top, and then I want the baseline on the bottom this many dips from the bottom," with interspersed vertical alignment stuff going on. You puzzle over it for a while, and you basically futz with padding in all kinds of configurations to sort of get it where they wanted it. We have dealt with that and created some new attributes, methods, and properties for you to use that make it much easier: you just pass us the information about the baseline alignment calculations you'd like to perform, and we futz with the padding on your behalf.

Smart Linkify: I think of this as Linkify, but smarter. We already have the ability to ask for links in a block of text, and it will detect things like phone numbers and addresses. Through the machine learning models you've seen in smart text selection, we also have the ability to detect other entities.
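Going back to PrecomputedText for a moment, the flow just described might look like this (a sketch; `setTextAsync` and the executor are my names, and AppCompat offers an equivalent `PrecomputedTextCompat` for older releases):

```kotlin
import android.text.PrecomputedText
import android.widget.TextView
import java.util.concurrent.Executors

val textExecutor = Executors.newSingleThreadExecutor()

fun setTextAsync(textView: TextView, longText: CharSequence) {
    // Capture the params on the UI thread; they describe how this view measures text
    val params: PrecomputedText.Params = textView.textMetricsParams
    textExecutor.execute {
        // The expensive part (80-90% of the cost of showing text) happens off-thread
        val precomputed = PrecomputedText.create(longText, params)
        textView.post {
            // Cheap: the measurement work is already done
            textView.text = precomputed
        }
    }
}
```

Note that the params captured up front must still match the view when the text is set, so don't change the view's text appearance in between.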
We can do the same thing with Linkify. It potentially takes a little longer, so you do it off-thread: generate the links on a background thread, then set them on your TextView later, using code similar to this. There's a text talk on Wednesday evening; please go to that for more details about all of this, and more.

Location. You can now take advantage of a new package, android.net.wifi.rtt: the Wi-Fi round-trip-time API. It requires compatible hardware on your phone, and it also requires a compatible access point, and it allows you to find the precise indoor location of the user's device. You need to request the fine location permission, but you don't need to connect to the access point. So if you're building an application that needs to locate the user inside a big building, you can take advantage of this API in Android P.

Accessibility has some improvements for navigation through apps: it's easier for you to declare functional blocks, which makes it easier for users of accessibility services to understand how things are grouped on the screen. There's an important talk on accessibility... right now. No, that's tomorrow. Tomorrow? Okay, hopefully it's tomorrow; I'm not sure that's correct. It was... two minutes ago? All right, it's on YouTube. Eventually. Like, now. I think I got the day wrong; sorry about that. If you're in the wrong talk, I invite you... Oh, my turn.

Security. There's a new API in Android P: the unified biometric dialog. We deprecated FingerprintManager, because there are more ways to authenticate yourself with your body than just a fingerprint: it could be your eyes, it could be whatever else device manufacturers think of next. So now we have a single UI for all devices and all means of authentication. We also have stronger protections for private keys. And, very important for your application: if you're using the Build.SERIAL API, it doesn't work.
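The unified dialog described above is exposed as android.hardware.biometrics.BiometricPrompt in P. A sketch (the strings and the function name are mine):

```kotlin
import android.content.Context
import android.hardware.biometrics.BiometricPrompt
import android.os.CancellationSignal

fun authenticate(context: Context) {
    val prompt = BiometricPrompt.Builder(context)
        .setTitle("Confirm it's you")
        .setNegativeButton("Cancel", context.mainExecutor) { _, _ -> /* user dismissed */ }
        .build()

    prompt.authenticate(CancellationSignal(), context.mainExecutor,
        object : BiometricPrompt.AuthenticationCallback() {
            override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
                // The system drew the UI; fingerprint, iris, etc. all look the same to the app
            }
        })
}
```

Because the system owns the dialog, your app no longer has to build (or keep in sync) its own fingerprint UI per authentication method.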
The API is still there, but it basically returns bogus data, so you cannot rely on it at all anymore.

There are various changes in Enterprise; just a couple of interesting ones. We made it easier to work with work-profile apps, or apps in different profiles, by giving them different tabs in the launcher, so they're not all mixed together: you have whole sections for the different profiles. Also, you're now allowed to lock packages to a specific task: you could have a launcher with just a minimal group of apps, or a single application, and combined with the ability to have ephemeral users, that now gives you a kiosk mode. So you will no longer have experiences like I had on a recent flight, as you can see from my blurry picture: you're watching a movie and you wonder what operating system is running under it, so you swipe from the bottom of the screen, you see the ICS navigation bar, you press Recents, you swipe the movie away, and you confuse the heck out of the system. I have to admit, that's what I try every time I'm on a plane, and it works a surprising number of times. This is what counts as fun for Android engineers.

Okay, can I talk now? All right, very, very briefly. Let's talk about the System UI stuff. A lot of it typically gets shown in one of the keynotes that precedes What's New in Android, so you've all seen the great stuff that users are getting; I'm going to talk about some of the stuff that you, as developers, might be interested in.

The first one is display cutouts, aka... well, there are other names for it. These are coming to the ecosystem all over the place, so as a developer you need to know where it's safe to draw and where it isn't. When you get your window insets, in onApplyWindowInsets, you get a DisplayCutout object, which gives you all kinds of interesting data about the cutout.
But you're probably going to want to use something called windowLayoutInDisplayCutoutMode on your windows. There's the basic mode, "never": I never want to overlap the cutout; just leave a black bar at the top or the bottom, whatever, I'm not all that interested. A little more advanced is cutout mode "default": if you were already going to draw behind the status bar, fine, we'll let the app window draw underneath the cutout as well, so you get that nice action-bar color extending through the status bar and underneath the cutout. Better still, or more advanced still, is the "shortEdges" cutout mode, which means essentially: any time there's a cutout on the short edges of the device and we're in portrait, I will just draw underneath it; you don't have to do anything special. In that situation you do need to look at the DisplayCutout and ask it for the safe insets: essentially, "okay, I'm drawing everywhere, but you tell me what single rectangle of the screen is safest to draw in." And finally, the cosmic version of this: with shortEdges you can also get the bounds of the cutout as a region, the exact set of rectangles that are unavailable to you on the screen, so you can display UI in the corners if the corners are available, or, if there's a corner cutout, move things out of the way so they're visible in the center of the display. That's the most advanced version of it. And you can put the electrical tape away, because you can now simulate notches in Developer Options on your device. This is really, really exciting.

"Slices" was actually what we were calling this internally, and we liked it so much we just kept it. You've seen Slices now in a couple of keynotes. It's essentially something we've discovered and learned about in System UI and the Toolkit over many years of dealing with RemoteViews for app widgets, and dealing with notifications: essentially the problem of getting content from your app into some other place.
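A sketch of the shortEdges mode plus the safe-inset query described above (the function name is mine):

```kotlin
import android.app.Activity
import android.view.WindowManager

fun drawUnderShortEdgeCutouts(activity: Activity) {
    // Opt in: draw under a cutout on the short edges of the display
    val lp = activity.window.attributes
    lp.layoutInDisplayCutoutMode =
        WindowManager.LayoutParams.LAYOUT_IN_DISPLAY_CUTOUT_MODE_SHORT_EDGES
    activity.window.attributes = lp

    activity.window.decorView.setOnApplyWindowInsetsListener { view, insets ->
        insets.displayCutout?.let { cutout ->
            // The largest rectangle guaranteed to be unobscured...
            view.setPadding(cutout.safeInsetLeft, cutout.safeInsetTop,
                cutout.safeInsetRight, cutout.safeInsetBottom)
            // ...or, for the "cosmic" version, the exact regions to avoid
            val blockedRects = cutout.boundingRects
        }
        insets
    }
}
```

Padding by the safe insets is the simple option; inspecting `boundingRects` lets you keep using corners that aren't actually cut out.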
So Slices is our new approach to remote content that you can use to project UI into your own app, or into other apps that support it. It's very structured. This is not "here's a canvas," or an AbsoluteLayout, go nuts with it. We give you a structure to fill out, and a whole bunch of very flexible templates in which to populate that data, with some display hints, so that the receiving end of the slice, the slice host, knows what to do with it. Slices are interactive, they're updatable; this is meant to be something that holds rich UI: sliders, controls, live information, possibly videos. Things that actually feel like real UI, as opposed to a snapshot of something happening in a distant process somewhere. Slices are addressable by content URI, and this is how they're passed around the system, and how they're passed along to app indexing to be shown in contexts like search. And finally, Slices is entirely inside the support library; it's entirely in Jetpack, so it's backwards compatible: you can use Slices all the way back to API 19. There's a great talk about Slices tomorrow, bright and early: "Building Interactive Results for Google Search." Come find out more about how all this technology works and how you can build your own.

Related to Slices is Actions. You can think of these as shortcuts with parameters, or as visible intents: essentially a deep link into your app with some additional payload. It's not just a link to music; it's a link to a particular album, or something like that. You saw these as well in the keynote, showing up in a predictive space inside our app-launching experience. You define actions in an actions XML file that goes into your APK or app bundle, and that gets registered with app indexing, so that search results and predictive features can show those actions. There's a talk about this too, Thursday, slightly less early in the morning: "Integrating Your Android Apps with the Google Assistant."
Notifications. There's a lot of great stuff about digital wellness and controlling notifications that you saw in the keynote, and I'm very excited about it in P, but I'm going to talk about some of the developer stuff we have here. We asked users which notifications are most important to them, and users love messages, so we focused our energy on enhancing the MessagingStyle API. You can do inline images now; you can do participant images and attach other metadata about the participants; and we finally have UI for smart reply, which we've had on Android Wear for years. When you use RemoteInput.setChoices, those choices will now appear as chips right in the notification, so users can respond instantly in the middle of a chat without leaving the notification shade. There's tons of other stuff, as usual. I had one other slide, which we added about ten minutes ago, about notifications, and I'm just going to let it sit on the screen for a little while: if you're doing something in the background, the user still needs to know, and Android P does a much better job of letting the notifications you may already have running testify to that background activity, including things like overlaying windows.

With that, let's talk about the runtime. One of the most important things for you as an Android developer is to understand our deprecation policy, which was announced a few weeks ago. Soon we will require all applications to target one of the newest API levels, and we're doing that to make sure we can keep the security level of Android as high as possible, along with performance and a lot of other nice things. What does it mean for you? As of August this year, new applications published on the Play Store will have to target API level 26, and as of November this year, any update to an existing application that you publish on the Play Store will have to target API 26. And you can expect those numbers to go up over time.
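The enhanced MessagingStyle and smart-reply pieces above might be sketched like this (the names, strings, and "reply_key" are mine; the Person class and Message.setData are the P additions):

```kotlin
import android.app.Notification
import android.app.Person
import android.app.RemoteInput
import android.net.Uri

// MessagingStyle takes the device user; each Message carries its sender
fun chatStyle(me: Person, sender: Person, photoUri: Uri): Notification.MessagingStyle =
    Notification.MessagingStyle(me)
        .addMessage(
            Notification.MessagingStyle.Message(
                "Check out this photo", System.currentTimeMillis(), sender
            ).setData("image/jpeg", photoUri) // new in P: inline image in the shade
        )

// Choices surface as smart-reply chips right in the notification
val remoteInput = RemoteInput.Builder("reply_key")
    .setChoices(arrayOf("On my way!", "Sounds good", "Can't talk now"))
    .build()
```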
If you have native code in your application: we've been supporting 32-bit and 64-bit for years, and now we will make 64-bit ABIs required as of August of next year. You'll still be able to ship 32-bit support in your application, but we will ask you to ship 64-bit as well. One of the reasons to ship 64-bit: on 64-bit devices you get better performance and much better code out of it. If you want to know more about the deprecation policy, there's a talk on Thursday afternoon, and I'm sure you'll have a lot of questions for the folks there.

API compatibility. If you've tried the P Developer Preview, you might have noticed something different if you're one of those adventurous applications that uses some of our private APIs. On Android we have two types of private APIs: the APIs that are actually marked private, and then this weird @hide thing that we use in our source code. It's a Javadoc tag that we process specially to indicate that this looks like a public API, but it's not for you, just for us. And we're a bit jealous, because a lot of you were using them anyway. From now on, a lot of these APIs will trigger warnings in one form or another: it might be toasts, it might be logs, when you make an illegal call to some of these APIs. We need to hear from you if you need those APIs for your application to keep working. Sometimes it's just an oversight: we didn't make an API public simply because we didn't think about it. So please go to this URL and let us know if there's an API you think should be made public. We might say yes; we might also say no. We have three types of lists. I won't go into too much detail here, but basically, if an API falls in the blacklist, you will not be able to call it, ever. I'm not sure we have anything in the blacklist right now, but those lists will evolve over time, so again, let us know which APIs you need. This is one of the important reasons we ship previews: we need you to go out there and try your applications on them, because this is the time to find problems, either ones you can fix before the real release is out, or ones you can let us know about so we can work on them instead. And that's Chet, trying to increase engagement with his podcast: there's this podcast called ADB, Android Developers Backstage, and in episode 89 they had... who did you have on that episode? Brian Carlstrom. They talked about this compatibility effort and what it means for you, so give it a listen.

The NDK. Release 17 of the NDK brings a lot of very interesting things. First of all, the Neural Networks API that was part of API level 27. We also have a new shared memory API, if you do a lot of JNI. More importantly, we finally have the Address Sanitizer, to make sure your code isn't scribbling all over memory; you don't need a rooted device anymore to use it. We also have the Undefined Behavior Sanitizer: it can be very difficult to detect undefined behavior in your C or C++ code, so now there's a tool for that. We finally removed support for the deprecated ABIs, so if you still use ARMv5 or MIPS, 32-bit or 64-bit, support is gone; you should not be shipping those anymore. In the upcoming release, r18, we will remove GCC. The GCC compiler was deprecated last year; everything in the NDK is now compiled with Clang. We think we gave you enough time, so GCC is going away, and if you're still using it... maybe you should not be in this talk; you should go fix your code. And finally, we added support for the simpleperf CPU profiler, and we also have support in the IDE, in Android Studio, for native profiling, so you don't even have to type anything in the terminal.

Graphics and media. The camera APIs are getting better and better. We've added a lot of things that we use ourselves in the camera application. For instance, we give you access to the timestamps of the optical image stabilization.
So if you want to build the kind of stabilization that's built into the video recording in our camera application, now you can. If your app does a lot of selfies and uses the display as the flash, you can tell the camera you're doing that, so it can adapt the exposure accordingly. We have support for USB cameras; I haven't seen a use for it myself, but I've heard some of you ask about it, so now it's available. And multi-camera support: there are some phones coming out there with multiple cameras in the back, or in the front I suppose, and now we can expose them as one logical camera that contains more than one stream of data.

ImageDecoder. I'm sure a lot of you are familiar with BitmapFactory, and you're probably not fond of that API; trust me, we're not either. So there's a new one called ImageDecoder, and it's part of Android P. The idea is to make it not only easier to decode images, but also possible to decode animated images: ImageDecoder can decode Bitmaps but also Drawables, including AnimatedImageDrawable. Think of all the animated GIFs; I don't know how the kids are using them. I actually love those now. There are a few concepts you have to learn with ImageDecoder, and we'll go through them in an example: we have the concept of a source, the concept of the post processor, and finally the header listener. This is what the API looks like. First you call createSource on ImageDecoder. The source can be an asset, a file, a byte buffer; it can be many different things. The idea is that once you create a source, you can decode multiple images from that same source. This is particularly useful if you want to build thumbnails: you can decode the same source once at high resolution, once at lower resolution, or even at intermediate resolutions, and that can be done from multiple worker threads. Then you call ImageDecoder.decodeBitmap: you pass the source, and you can optionally pass a header listener.
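Putting that flow into code, along with the header listener and post processor covered next (a sketch; the function name and the halving of the size are mine):

```kotlin
import android.content.ContentResolver
import android.graphics.Bitmap
import android.graphics.Color
import android.graphics.ImageDecoder
import android.graphics.PixelFormat
import android.net.Uri

fun decodeThumbnail(resolver: ContentResolver, uri: Uri): Bitmap {
    // One source can be decoded many times, at many sizes, from worker threads
    val source = ImageDecoder.createSource(resolver, uri)
    return ImageDecoder.decodeBitmap(source) { decoder, info, _ ->
        // Header listener: invoked once the header is parsed, so the
        // intrinsic size in `info` is already known
        decoder.setTargetSize(info.size.width / 2, info.size.height / 2)
        decoder.setPostProcessor { canvas ->
            // Draw on the bitmap right after decode -- e.g. a dark wash
            canvas.drawColor(Color.argb(40, 0, 0, 0))
            PixelFormat.UNKNOWN // report the resulting opacity (or UNKNOWN)
        }
    }
}
```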
The header listener here is a lambda. It gives you back the decoder itself and the metadata about the image and the source, and it's inside the header listener that you set your options. With BitmapFactory, you had BitmapFactory.Options that you had to pass to decode; here, you wait for the header listener to be invoked. We should point out, too, that setTargetSize is kind of a fundamental difference. That right there is huge: before, if you wanted the right target size, you needed to work with inDensity as well as inSampleSize and inTargetDensity and do a lot of trickery. We didn't know how those worked either; usually our answer was "figure it out." Anyway, it's a lot easier now: you just tell us what size you want the bitmap to be, and we take care of it. Finally, you can also set the post processor. The post processor is similarly a simple interface: it gives you a canvas so you can draw on the bitmap right after it's decoded, so you can add, I don't know, a header, like a title or some metadata on it.

Media. We're adding support in the platform for the HDR profile of VP9, so you can play HDR videos in your applications. YouTube was doing it already, but they had their own way of decoding the videos. On a device that doesn't support HDR, the playback will be in low dynamic range instead, but on a capable device, like a Pixel 2, you'll be able to see the HDR stream in all its glory.

We're also adding support for (and bear with me, because it's a little bit confusing) a format called HEIF. It rolls off the tongue: the High Efficiency Image Format. It's based on HEVC, which was also called h.265, and the filename extension is commonly .heic. I don't know why they use different letters, but that's what they did, so we have to deal with it. It is a container, and it can store multiple images, so you can use it to store a single image with higher quality and higher compression ratios than JPEG, for instance (JPEG is a lot easier to say; I like JPEG), but you can also store multiple images if you want animated images or short movies. The writer is part of the support library, or I guess Jetpack now, and this is not part of the compress API that you find on Bitmap, because it's a container; it works differently. This is what it looks like: you have to create a builder, you have to tell us the path where you want to output the file, and you have to give us, in advance, the width and height of the image. The source of the images can be a Bitmap, but it can also come from a Surface, so you don't necessarily need a Bitmap: if you're doing GL rendering, or if you're doing video playback, you can encode that directly as an image without going through an intermediate bitmap. Then you can add multiple bitmaps (in this case we're adding only one), and when you call stop, we write it out. I should point out, too, that even though it's spelled HEIC, it's pronounced "gif."

What are you still doing here? Vulkan. So this is the slide that I'm excited about; very few of you will be. I'm sure you're all super excited about things like subgroup ops and YCbCr formats. More seriously, this matters to anybody who's building middleware, for instance the Unity engine, and to the few of you building games. Vulkan is a low-level graphics API that gives you a lot more control over the GPU, so you get higher performance.
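Returning to the HEIF writer for a moment, the builder flow described above might look like this, using the Jetpack HeifWriter library under its post-rename androidx package name (a sketch; the function name, output path handling, and timeout are mine):

```kotlin
import android.graphics.Bitmap
import androidx.heifwriter.HeifWriter

fun saveAsHeif(bitmap: Bitmap, outputPath: String) {
    // Width and height must be declared up front, on the builder
    val writer = HeifWriter.Builder(
        outputPath, bitmap.width, bitmap.height, HeifWriter.INPUT_MODE_BITMAP
    ).build()

    writer.start()
    writer.addBitmap(bitmap) // can be called repeatedly for multi-image files
    writer.stop(3000)        // wait up to 3s for encoding to finish, then write out
    writer.close()
}
```

The builder also accepts a Surface input mode, which is the path for encoding GL rendering or video frames without an intermediate bitmap.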

1 adds new capabilities that were not possible in OpenGL but it also closes the gap with OpenGL so for instance to support protected content will you're not able to playback protected content invoke and before that won't lock a lot of things like video players for instance new networks API is they are not technically part of P because we announced I mean they are part of people to announce them in API level 27 you might have missed that it's a C API that's designed for mashing on it's also a fairly low-level API it's meant for basically playback of trend models so things like tensorflow are built on top of general Network API so you would use tensorflow to do all the learning and the neural network ApS can do basically the inference on device the interesting thing is that we're also unlocking the access to the DSP that we use on pixel too so you can get hardware accelerated machine learning effectively on select devices a our core is just so in the developer keynote couple months ago we introduced a our core 1.

0 they just announced a our core 1.

2 this takes care of tracking things in the real world the claim is that you had to write a lot of OpenGL code and it's not always pleasant so to make your life easier we introducing support in the emulator so when you create an EVD you can specify a virtual camera stream for the back camera and you get a full 3d scene it works in any any view that displays the camera stream and using the keyboard in the mouse we can just navigate around around the UI you even get patterns on that fake TV screen so you can do image recognition and pattern tracking that kind of stuff it's available today also we have seen forms so they already talked about it in the Deaf keynote I won't talk about it too much I will only say this is what I've been working on for the past few months so I still don't have a because we did that with a dream anyway that's what I was up to and if you want to know more about sin form there is a talk tomorrow afternoon at 5:30 p.


and we're going to talk about the underlying tech and how to use the API and finally Chrome OS on pixel book you can now run Linux applications on your Chromebook and in particular you can run Android studio I believe there are some limitations but one of the things you can do is run your and all the apps as apps on Chrome OS so you can use Chrome OS as a full-blown Android development platform I guess whenever it's available many many more sessions to that tomorrow and Thursday with also like you know I said that earlier there's a lot of folks back home we've been doing all the hard work we're just talking about it so thank you very much to all the engineering teams and the PM's and the designers and tech writers everyone who made this possible we didn't do anything we're just here to hear you we like to be happy some of us did even less yes okay and that's it for today so thank you very much trust me with the sandbox office hours bring your questions yes we have we have framework office hours there's Android sandbox running all week we've done our best to bring as many people from our teams here to be able to talk to you about these new features these new API is as possible because we know you can't get that anywhere else besides Google i/o and also there's an overflow session for our overflow area I can't remember where it is exactly most of the speakers from Android sessions will make their way afterwards so if you can't get your question answered at the session you should be able to find them after that like us there now thank you [Applause] you [Music].
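The ImageDecoder flow described in the talk — create a source, wait for the header listener, set the target size and a post-processor there — can be sketched as follows. This is a minimal sketch, not code from the talk; it assumes API 28, and the half-size scaling and the file argument are illustrative choices:

```kotlin
import android.graphics.ImageDecoder
import android.graphics.PixelFormat
import android.graphics.drawable.Drawable
import java.io.File

fun decodeHalfSize(file: File): Drawable {
    val source = ImageDecoder.createSource(file)
    // The lambda is the OnHeaderDecodedListener: it runs once the header is
    // parsed, which is where options go (instead of BitmapFactory.Options).
    return ImageDecoder.decodeDrawable(source) { decoder, info, _ ->
        // Just ask for the size you want; no inSampleSize/inDensity trickery.
        decoder.setTargetSize(info.size.width / 2, info.size.height / 2)
        decoder.setPostProcessor { canvas ->
            // Draw on the decoded bitmap here (a title, a watermark, ...).
            PixelFormat.UNKNOWN // keep the bitmap's existing opacity
        }
    }
}
```

`decodeBitmap` works the same way if you want a `Bitmap` rather than a `Drawable`.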
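The HeifWriter builder flow mentioned for HEIF output (path, dimensions up front, then start/add/stop) looks roughly like this. A sketch only, assuming the `androidx.heifwriter` artifact; the quality value and the `stop` timeout argument are illustrative:

```kotlin
import android.graphics.Bitmap
import androidx.heifwriter.HeifWriter

fun writeHeif(outputPath: String, bitmap: Bitmap) {
    // Width and height must be supplied in advance; input can also be a
    // Surface (INPUT_MODE_SURFACE) for GL rendering or video playback.
    val writer = HeifWriter.Builder(
            outputPath, bitmap.width, bitmap.height, HeifWriter.INPUT_MODE_BITMAP)
        .setQuality(90)
        .build()
    writer.start()
    writer.addBitmap(bitmap) // may be called repeatedly for multi-image files
    writer.stop(0)           // 0 = wait for writing to complete
    writer.close()
}
```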
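For the emulator's virtual camera scene used with ARCore, the back camera of an AVD can be pointed at the built-in 3D scene. A sketch of the relevant setting, assuming a recent emulator that reads it from the AVD's `config.ini` (the same option is exposed in the AVD Manager's camera dropdown):

```
# In ~/.android/avd/<your_avd>.avd/config.ini:
hw.camera.back = virtualscene
```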
