Josh Hickman shares his research at DFRWS US 2019.

Josh: My name is Josh Hickman. I am the forensic scientist manager for the digital evidence section at the North Carolina State Crime Lab, and I am here this morning to talk about Android Auto and Google Assistant. I know I'm the last presentation before lunch, so I'll do my best to get you out of here on time. Real quick before I get started: I gave this presentation to our lab's administrators as a dry run, and I inadvertently set off Google Assistant on a couple of phones during it. I will make a conscious effort not to say the hot word that triggers Google Assistant, but I may slip up and do it inadvertently, so you're forewarned.
So Android Auto and Google Assistant work together. You know, the ability to have our vehicles interface with our mobile devices is not a new concept, but it’s one that’s really started to get into the mainstream.
I bought a new car back in December. Android Auto and Apple CarPlay are a feature that I ended up having to pay a little extra for. So that is something that is, again, not new, but really starting to get into the mainstream.
So when we talk about Android Auto, what is it? If you're looking for a very rudimentary, elementary definition: obviously we need a device that can run Android. We need a car. And not just any car; there are some caveats there, and I'll get to that in a minute. And Google Assistant, that's what makes the whole hands-free thing really work. If you take all of those things and add them up, you get Android Auto. And one thing to note: Android Auto may or may not be a default app on a device.
Prior to Pie (Android 9), it was an app that you actually had to go out to the Google Play Store and download. I discovered that in Pie it is now an app that comes in the stock image from Google. If you're looking for a more robust definition, you can really define Android Auto as a method by which Android projects certain application services up onto your vehicle's infotainment or telematics head unit. And don't take my word for it: Google actually considers it to be a projection, because it's actually in the file folder name. This area is in the user data partition, so from a forensic standpoint, your ability to get to this data may vary depending on your tools' capabilities and, you know, the status of the device.
Is it rooted? Is the bootloader unlocked? Is there a recovery partition there that you can utilize? I will tell you, of my test devices, one was rooted; the other was not rooted, but I did have TWRP installed, which allowed me to access this area of the device. So when we talk about application services, we're talking about the big things: I want to be able to make and receive phone calls while I'm driving, while I'm sitting in traffic, and not have to hunt for my phone in the car. I want to be able to send text messages and have text messages read back to me, for the same reason; I don't want to be looking for my phone, fumbling with it, and accidentally run into the back of somebody who stopped in traffic.
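As a side note on getting at that data area: below is a minimal sketch of pulling the app's private directory over adb from a rooted device (or one booted into a custom recovery). The gearhead package name shown is the "projection" folder referred to above on the images I have seen, but verify it against your own device, and adjust the root method to whatever your device actually exposes.

```python
# Minimal sketch: pull the Android Auto data directory from a rooted device
# (or one booted into a custom recovery) over adb. Assumes adb is on PATH
# and that root is available via "su"; the package name below is the
# "projection" folder mentioned above -- verify it on your own image.
import subprocess

PKG = "com.google.android.projection.gearhead"
SRC = f"/data/data/{PKG}"

# Stage a copy somewhere adb can pull from, then pull it to the host.
subprocess.run(["adb", "shell", "su", "-c", f"cp -r {SRC} /sdcard/gearhead"], check=True)
subprocess.run(["adb", "pull", "/sdcard/gearhead", "./gearhead_extract"], check=True)
```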
You know, and I want to be able to do those things and not get lost, right? I want to be able to travel to my destination and not get lost. So the big three we're talking about are phone, messages, and Google Maps, or navigation. But if you're a third-party developer and you feel like your app has some use case inside the vehicle, the Android SDK actually allows you to tool your app so that it can be used with Android Auto.
So there are some big names in this space: Spotify, Apple Music, WhatsApp, and one that I actually don't have up here, Telegram, will actually allow you to use it via Android Auto. I found that to be very interesting. And then, you know, if you're a podcast person like me, I had Google Podcasts available on the device. But you may have sensed a pattern here with the third-party developers: audio playback, communication, and navigation, stuff that we've been doing in cars for a long time, just not in this fashion.
So like I mentioned before, not every car has Android Auto capabilities. There is support globally; 47 different auto manufacturers have various models in their stables that will actually support Android Auto. And again, they're major players. Most of the major auto manufacturers globally do support it.
Now, one thing to note: it's the newer models that will actually support this. So if you're looking at a vehicle from the 2016, 2017 era, there may be very limited or no support for Android Auto. However, for 2018 and 2019 there's ample support across these manufacturers. Now, I actually wrote a blog post about this back in January, and at the time the count was in the high thirties, so ten additional auto manufacturers actually came on board between January and two weeks ago.
But if you have an older car and you want this functionality, you can actually go down to your local Best Buy or car electronics store, pick up an aftermarket stereo unit, and install it in your car, and you will actually have this capability. And for those of you that are CarPlay people, the same goes: most of the time you'll find these aftermarket units will support both.
So I just want to give a quick look around. This is the home screen. When you plug the device into your vehicle, there are actually two connections that need to happen: you need a hardware connection, and it actually requires a Bluetooth connection to your vehicle as well. I'm not sure why that is the case, but that's what's required. If you don't okay the Bluetooth connection, it will not function correctly.
But this is the home screen. As you can see, I've got some data sitting here in regard to the weather, and then it looks like there's some navigation stuff down there at the bottom. It's just like working in Android: this stuff is contextual, based on time and location. So I took these screenshots at work in Raleigh, and it knows that any time after lunch I usually head to the house.
Now, that's my old address. I have moved since then, so don't come there looking for me; you will not find me there. You'll probably freak out the new homeowners. But it's pulling all that pattern-of-life data down from Google that's attached to my test account. This is my test account, but it's really my account. And all of that will change over time.
Now, if it's closer to five o'clock when I hop in the car, it knows that I like to listen to podcasts, so it's going to put a card up there: hey, fire up Google Podcasts and I'll queue up your next podcast. The buttons down at the bottom are virtual buttons, and I'll run you through them real quick, left to right. The first one is navigation, and this is how it looks on your head unit inside the car.
As you can see, I just grabbed some directions to Starbucks, and then down here in the bottom right is the actual act of navigation. Next is the phone: again, a number pad, you know, the usual buttons that you see there, and it functions just like the phone on the device.
So I can loop a third person in. I can actually Google Cast the audio to a Cast device, which I found to be interesting; I don't know why you would want to do that. I can mute the mic, and I can actually pause the phone call. The circle button in the middle is the virtual home button. It works just like the home button on your Android phone: you press it, you get back to the home screen. The headphones relate to the audio playback stuff.
So just below Google Podcasts, Spotify is actually there, but the green dot indicates which app actually has control of the audio interface at the time. And in this case it was Apple Music. The last button on the far right, I call it the emergency exit. This screen will actually look different depending on your vehicle.
The bottom card, ‘return to Google’: I was emulating this to get the screenshots, but for my particular vehicle, which was a Nissan Rogue, that card actually says ‘return to Nissan’. What that does is cause me to leave the Android Auto interface and go back to NissanConnect, which is Nissan's homegrown software for their vehicles. The key thing to remember, though, is that Android Auto is actually still running in the background even though you've exited the UI. If I need to come out of the UI real quick, adjust the equalizer on my audio, and then go back into Android Auto, I can easily do that.
So what gets left behind when I use Android Auto? Well, as it turns out, not a whole lot. Again, because this is a projection, the apps are actually still doing all the heavy lifting in the background, and because of that, they're the keepers of the data. So for example, if I send a text message to my wife on the way home, there are no remnants of that text message in Android Auto; I just have to go to the messaging database and pick the messages out of the database there. So keep that in mind: the individual apps are still the keepers of the data.
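To make that concrete, here is a small sketch of going after those messages in the messaging app's own database rather than in Android Auto. It assumes the stock AOSP mmssms.db layout (an sms table with address, date, body, and type columns); a different messaging app will keep its own schema.

```python
# Recover sent/received messages from the messaging app's own database
# rather than from Android Auto. Assumes the stock AOSP mmssms.db layout;
# adjust the table and column names for whatever app is actually in play.
import sqlite3
from datetime import datetime, timezone

con = sqlite3.connect("mmssms.db")
for address, date_ms, body, msg_type in con.execute(
        "SELECT address, date, body, type FROM sms ORDER BY date"):
    ts = datetime.fromtimestamp(date_ms / 1000, tz=timezone.utc)
    direction = "sent" if msg_type == 2 else "received"   # type 2 is "sent" in the stock schema
    print(f"{ts:%Y-%m-%d %H:%M:%S} UTC  {direction}  {address}: {body}")
con.close()
```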
However, there is some data that is specific to Android Auto usage, and I'll walk you through it real quick. So I had access to an Oreo image and a Pie image that both had Android Auto data in them.
There is some consistency, and there is a subtle difference between the two. This is the consistent stuff; I've got screenshots of everything up here and, as you can see, there's quite a bit of data there. And this is the Pie-only stuff. Now, whether this is due to an update to the app, differences between the OS platforms, or the specific capabilities of my device, I'm not sure, but just know that you could potentially see some differences there. So your mileage may vary.
The first one is the time that Android Auto was last run. As you can see, I've got the Oreo file on the left and the Pie file on the right. All the timestamps are Unix epoch, so, you know, plug it into your favorite decoder and you've got your timestamp.
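As a trivial illustration of that decode step, here is a quick sketch, assuming millisecond-resolution epoch values (the common case on Android); drop the division if your values are in seconds.

```python
# Quick decode of a Unix epoch timestamp like the ones shown in these files.
# The example value is made up; Android timestamps are usually milliseconds.
from datetime import datetime, timezone

raw = 1557501234567
print(datetime.fromtimestamp(raw / 1000, tz=timezone.utc))
# 2019-05-10 15:13:54.567000+00:00
```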
The next is the Bluetooth MAC address of the paired vehicle. Now, you can get this from another source inside of Android. However, if I just pair via Bluetooth (in other words, I just get in my car, hit the Bluetooth, and pair it that way), it will not show up here. The MAC address is registered here only when it's paired via Android Auto.
This was actually an interesting find for me; I was not expecting to find this. The car name there, Nissan, is actually the Bluetooth name of my vehicle, so that's where it's pulling it from. But there is also a set of GPS coordinates in there.
Now, the coordinates are accurate enough that it actually got the exact parking space where I parked my car when I got out of the vehicle that day to go to work, and I was really surprised by how accurate it was. And if you notice, there is actually an accuracy reading there: 3.0. I don't know if it's like three millimeters, three meters, three feet, three miles; probably not the last one. So let's see, I'm going to skip through a few of these here. These are some Pie-only settings, and I think in the PowerPoint presentation these actually get kind of personal, so I'm just going to skip through them real quick.
And again, this is what it looks like when I get a message. This can actually still be distracting, so, you know, Google Assistant is there to kind of keep your eyes on the road and off of this unit. So again, this is where Google Assistant comes into play. I can summon it, it'll listen to me, it'll accept input and actually take action on my behalf, and then provide feedback based on that action. It may be able to accept additional input based on what I'm actually asking it to do.
As it turns out, Google Assistant leaves remnants of itself behind on your Android device. All of this stuff is stored in protobuf data. And, you know, I don't take the same view of protobuf data as some other people; I tolerate it, and that's about all I can do with it. But as far as Android Auto is concerned, there's actually stuff there that can be beneficial to you.
This is where the protobuf data is stored: it's actually in the Google Quick Search Box app, in the app sessions folder. And as you can see, these were the screenshots that I took of the two devices that I had access to. The cool thing about it is that these protobuf files will actually tell you where they came from.
So in this case, Android Auto is car assistant. Sometimes in these protobuf files you will actually find audio data embedded, and it's just a matter of going in there and manually carving it out. And then, for me, for example, I just stuck it in VLC player and it actually played.
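Here is a rough sketch of that carving step. It assumes the embedded audio starts at a recognizable magic value ("OggS" for Ogg, "ID3" or an MPEG frame sync for MP3) and simply writes everything from that offset to the end of the file; players such as VLC tend to tolerate the trailing protobuf bytes, but this is a quick-and-dirty cut, not a clean extraction.

```python
# Quick-and-dirty carve of embedded audio out of one of these protobuf
# session files: find the first audio magic value and dump everything
# from that offset onward. Formats and offsets will vary by image.
import sys

MAGICS = [b"OggS", b"ID3", b"\xff\xfb"]

data = open(sys.argv[1], "rb").read()
hits = []
for m in MAGICS:
    i = data.find(m)
    if i != -1:
        hits.append((i, m))
if not hits:
    sys.exit("no audio magic found")
offset, magic = min(hits)
with open(sys.argv[1] + ".carved", "wb") as out:
    out.write(data[offset:])
print(f"found {magic!r} at offset {offset}, wrote {len(data) - offset} bytes")
```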
I don't have time to do that, so we'll just skip the video, but I had a quick demo there. And again, this is the file that I generated during that particular example. The thing to remember about these files is that there are two things in here: the actual text of what I told Google Assistant at the time, along with a timestamp. There are markers inside these protobuf files that will actually alert you when you're getting ready to stumble across some text that a user spoke to it. It's a nine-byte string, and I've got it up here in this screenshot, in the red box. But the cool thing about it is that down there at the bottom, there's the timestamp.
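Below is a minimal sketch of using that marker to pull out the spoken text. The MARKER value is a placeholder, not the actual nine-byte string from the slide; substitute whatever you observe in your own protobuf files. It simply grabs the run of printable bytes that follows the marker, which is a heuristic, not a parse of the real structure.

```python
# Locate the marker that precedes user-spoken text and dump the printable
# run that follows it. MARKER is a hypothetical placeholder -- replace it
# with the real nine-byte string observed in the protobuf files.
import sys

MARKER = b"PLACEHOLD"  # hypothetical 9-byte value, not the real marker

data = open(sys.argv[1], "rb").read()
pos = data.find(MARKER)
if pos == -1:
    sys.exit("marker not found")
start = pos + len(MARKER)
# Skip any length/tag bytes, then take the following printable run as the utterance.
while start < len(data) and not (0x20 <= data[start] < 0x7f):
    start += 1
end = start
while end < len(data) and 0x20 <= data[end] < 0x7f:
    end += 1
print("text after marker:", data[start:end].decode("ascii", errors="replace"))
```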
Being able to know what a user told the Assistant is pretty beneficial. If I'm investigating a traffic collision and the guy says, you know, "I sent a text message, but I used Assistant," I actually now have a way to verify that he was in fact not using his phone and was just barking at it while it was sitting there in a cup holder.
I'll get to the file structure here. So again, because it's protobuf data, it is very unstructured; we don't have the .proto file to help decode it. But based on my interaction with several samples of Android Auto and Google Assistant, I was able to come up with a rough structure. The diagram on the left is just the structure of the entire protobuf file.
But if you'll see there, the big purple block down there that says 'vocal transaction' actually has a structure all in and of itself, so I just wanted to highlight that real quick. The cool thing about these protobuf files is that there are actually other forensic applications for them. So if I have my phone just sitting on a desk and I say, "Okay, G, do something," there's actually a protobuf file that gets generated as a result, and the header indicates that instead of car assistant.
If I do a search via the Google Quick Search Box, it'll say 'search'. For those of you that have used Google Lens: if I just use Lens, you'll get a protobuf file that says 'lens'. So there are a lot of forensic applications there beyond Android Auto.
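Going back to the structure for a moment: since there is no .proto file to hand, one way to get a rough view of these files is to walk the protobuf wire format directly and print field numbers and nesting. The sketch below assumes whatever you feed it is a clean protobuf payload; these session files can have framing around the payload, so some manual trimming may be needed, and it makes no claim about what any field means.

```python
# Walk protobuf wire format without a .proto file to get a rough view of
# field numbers and nesting. Handles varint (0), 64-bit (1),
# length-delimited (2), and 32-bit (5) wire types only.
import sys

def read_varint(buf, pos):
    result, shift = 0, 0
    while True:
        b = buf[pos]
        result |= (b & 0x7f) << shift
        pos += 1
        if not b & 0x80:
            return result, pos
        shift += 7

def walk(buf, depth=0):
    pos = 0
    while pos < len(buf):
        key, pos = read_varint(buf, pos)
        field, wtype = key >> 3, key & 0x07
        indent = "  " * depth
        if wtype == 0:                       # varint
            val, pos = read_varint(buf, pos)
            print(f"{indent}field {field}: varint {val}")
        elif wtype == 1:                     # fixed64
            print(f"{indent}field {field}: fixed64 {buf[pos:pos+8].hex()}")
            pos += 8
        elif wtype == 2:                     # length-delimited (string/bytes/submessage)
            length, pos = read_varint(buf, pos)
            chunk = buf[pos:pos + length]
            pos += length
            print(f"{indent}field {field}: {length} bytes")
            try:
                walk(chunk, depth + 1)       # try to descend as a submessage
            except Exception:
                print(f"{indent}  raw: {chunk[:40]!r}")
        elif wtype == 5:                     # fixed32
            print(f"{indent}field {field}: fixed32 {buf[pos:pos+4].hex()}")
            pos += 4
        else:
            raise ValueError(f"unhandled wire type {wtype}")

walk(open(sys.argv[1], "rb").read())
```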
And real quick, just looking ahead: if any of you were paying attention during I/O, you can basically take all the screenshots I showed you previously and throw them out, because Google has decided that they're going to change the aesthetics of Android Auto. This is what it's going to look like, and I'm not sure if that's going to change the forensic artifacts that are available. The other thing they said during I/O was that they are going to start pushing a lot of the natural language processing out to the device, on the edge.
They want you to be able to use Google Assistant when you don't have an internet connection. Whether that means there's actually going to be more data in these protobuf files is yet to be seen, but once that starts happening, I will actually be running more tests.
I apologize for the quickness of that presentation. I'm pretty responsive: there's my email address, I do respond on Twitter, and I do have a small blog site, and the data for the Oreo and Pie images is actually available through that. If you want to download it, you can do that there and tinker with it as you see fit.
Host: Okay. Thanks Josh. Do we have any questions? We’ve got just a minute, if there’s any quick questions.
Audience member: It's a simple question. Isn't there any bandwidth problem as far as communicating back to Google?
Josh: I don't know. I live in an area that has AT&T's fake 5G coverage, so I always see the 5G stuff pop up. I will tell you that with the embedded audio data, there is lossy compression. So in my Oreo image I actually did see MP3 data; with Pie you actually see Ogg Vorbis, which is a lossy compression, so it does get compressed down quite a bit. The files are maybe a few K at best. It just depends on the session with Google Assistant at the time, but you really see no lag at all during the interaction.
Host: All right. Any other questions? All right. Thank you very much, Josh. Sorry about the time.