Presenter: Lance Mueller, Director of Forensics, Magnet Forensics
Welcome everyone. This is Lance Mueller at Magnet Forensics. Today, I am very excited to have the opportunity to introduce and demonstrate the new Internet Evidence Finder Advanced Edition.
IEF v6.1 will add support for processing iOS and Android smartphones. v6.1 will have the ability to read binary image files that may have been obtained during a physical acquisition process using forensic tools such as Cellebrite, XRY, or Lantern. In addition, it will also have the ability to parse files that may have been obtained through a file dump using other forensic tools.
v6.1 will have the ability to parse data from native and 3rd party apps that may have been installed. In addition, it will carve data from those native and 3rd party apps that may have since been deleted or, if the phone has been factory reset, it’ll carve unallocated space looking for those artifacts. v6.1 has the ability to parse and carve iOS and Android native client apps such as SMS, contacts, email, call logs. In addition, it also will parse and carve iOS and Android 3rd party apps, such as Facebook, Twitter, Snapchat, and many others.
IEF Standard edition will continue to support PC and Mac investigations. The new IEF Advanced Edition will have the ability to parse PC, Mac, iOS, Android, and Xbox artifacts. The new IEF Advanced Edition will be available in late June or early July 2013. Existing IEF Standard users with active SMS will receive IEF Advanced Edition as part of their current subscription. At the time of the next SMS renewal, the customer can choose between Standard or Advanced SMS subscription, based on their use or need.
Let’s jump in and do a demonstration.
Here’s the initial splash screen of IEF 6.1 Advanced Edition. Notice that we have the addition of the Mobile button to the far right. These other three buttons have existed in IEF Standard Edition for quite a while, and they allow the user to point to a physical drive that may be connected, to files and folders, or to an image of a hard drive. This new button, Mobile, tells IEF that you actually want to process either a physical image obtained through a forensic acquisition tool for mobile phones, or a path where there’s a file dump of files pulled off a mobile device.
So let’s click on Mobile, and we get the initial screen to tell IEF what type of mobile device we’re going to process. Are we going to process an Android device or an iOS device? So in this example, I’m going to process an Android phone. So I’m going to click on “Android”, and I now get the second UI, asking whether this is going to be a file dump or an image. In this example, I’m going to provide it an image – a physical image that was acquired through Cellebrite. And I’m going to say “Images,” and a dialog box is going to open up, where you can navigate to where the image is stored. In this case, this [galaxysSmage.dd] is the image that I received when I acquired the phone through Cellebrite. I’m going to say “Open” and IEF is then going to show me the search type, which defaults to “Full Search”. And I can then choose additional search options, so if I didn’t want to do a full search, I could do a quick search or a sector level search.
So full search means that IEF recognizes the file system in the image, and can distinguish between files, folders, and unallocated space. A full search includes all those files and folders, as well as unallocated space. This is the recommended way to search, because you’re going to get not only unallocated but also allocated data, and IEF will be able to tell you, when it finds a search hit, where it is – whether it’s in a particular file or in a particular area of unallocated space.
If you do a quick search, it will do all logical files and folders, excluding unallocated. If you do a sector level search, it takes a look at the physical image and starts at physical sector 0 and goes all the way to the very last physical sector in the image, regardless of the file system. So IEF will not distinguish or interpret, or try to interpret, any type of file system. It’s just going to go sector by sector, looking for internet artifacts. Now, the key difference between this search and the full search is that when it finds artifacts in a sector level search, it will provide an offset of where that hit is, but no reference to the file that that hit was in. Because in a sector level search, IEF is not even paying attention to files and folders. It’s just going sector by sector by sector.
So you would use this search, or you would be forced to use this search, when you load a physical image of a phone that has a file system that is not supported by IEF – for example, YAFFS. So when you process an Android phone that might contain a YAFFS file system, which is the older file system used by Android devices, IEF will automatically default to Sector Level, because it takes a quick peek inside the image and says, “Oh, I do not recognize this file system, so the only way I’m going to search this is sector by sector.”
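To make the sector-level idea concrete, here is a minimal Python sketch of a signature scan over a raw image, assuming the image fits in memory; the function and variable names are illustrative, not IEF's actual internals.

```python
# A minimal sketch of a sector-level scan: walk a raw image sector by
# sector looking for a known artifact signature, reporting absolute byte
# offsets only -- with no reference back to any file or folder, since the
# file system is never interpreted. Names here are illustrative.
SECTOR_SIZE = 512

def sector_scan(image: bytes, signature: bytes):
    """Return absolute byte offsets of every signature hit in the image."""
    hits = []
    for start in range(0, len(image), SECTOR_SIZE):
        # Overlap slightly into the next sector so a signature straddling
        # a sector boundary is still caught.
        chunk = image[start:start + SECTOR_SIZE + len(signature) - 1]
        pos = chunk.find(signature)
        while pos != -1 and pos < SECTOR_SIZE:
            hits.append(start + pos)
            pos = chunk.find(signature, pos + 1)
    return hits
```

A real carver would stream the image from disk rather than loading it whole, but the offset-only reporting is the point: a hit at offset 1000 says nothing about which file, if any, contained it.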
In this case it defaulted to Full, which was because it took a quick look into the image and said, “I recognize that file system, I can parse that file system, so I will default to parsing the file system, looking in all the logical files and folders, and then looking in all the unallocated space.” And then every hit that IEF finds will be able to provide a reference back to a particular file that that hit is in, along with the file offset, or, in the case of unallocated space, an offset within unallocated space of where that hit resides.
Okay, so let’s continue with Full Search. Let’s say “OK”. And we notice here that the image that I’ve added has been listed. Now, I could continue adding images. I could add a few more phone images, or I could add an image from a PC or a Mac. I could point to files and folders from a computer, PC or Mac, or I could even point to a physical drive that might be attached to my examination machine through a write blocker. And each one that I add would appear here. And over here on the left, if I’ve accidentally added something I later decide I don’t want to process, I can remove it from the list by just clicking on the little trash can.
Okay, I’m going to proceed with just this one image. We can see that we’ve loaded the [galaxySimage.dd], and my search type is all files and folders including unallocated space. If I had chosen Sector Level, it would say “Sector Level”. If I had chosen Quick, it would say “All Files and Folders,” but it would not say “Unallocated Clusters”.
Okay, from here, I click on “Next”. And IEF is displaying all the different Android artifacts that I can choose to search for. Now, by default, all of these are selected. Much like in previous versions of IEF, the default is “All Artifacts,” but you can pick and choose individually by using a little check box in the upper left-hand corner of each artifact. Now, you can uncheck them all in one click, or put them all back with another click, all at the bottom left-hand corner of this window. And you can individually check or uncheck the artifacts that you do or do not want to search for.
Notice that there are two artifacts that have additional little plus marks – this one right here, and this one right here. That little mark indicates that there are additional options for that specific artifact. So for pictures, if I click on the little plus mark, I get a new dialog asking do I want to extract EXIF data and do I want to do Skin Tone Detection. Now, these are checked by default, but if I chose not to do Skin Tone Detection, I could uncheck this, and I could also uncheck the EXIF data. And it’s asking me do I want to resize each thumbnail that’s presented to 256 pixels. I could change this to make each picture a little bit larger, or a little smaller so more would fit on one screen. I’m just going to use the default settings and say “Close,” and then I’m just going to look at the additional options for the videos here, and you can see that Skin Tone Detection is checked by default. And then I’m going to go ahead and say “Close”. And I have all the other artifacts selected. So I’m ready to perform my search with one more piece of information that I’m going to need to enter.
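For a rough intuition of what skin tone detection involves, here is a toy Python sketch using the classic RGB rule-of-thumb thresholds; real detectors, including whatever IEF uses, are far more sophisticated, and everything named here is illustrative only.

```python
# A toy skin-tone heuristic: classify each RGB pixel with the widely
# cited rule-of-thumb thresholds (R > 95, G > 40, B > 20, red dominant,
# sufficient channel spread) and report the fraction of skin-like pixels.
def skin_ratio(pixels):
    """pixels: iterable of (r, g, b) tuples; returns fraction flagged as skin."""
    pixels = list(pixels)
    if not pixels:
        return 0.0

    def is_skin(r, g, b):
        return (r > 95 and g > 40 and b > 20 and
                r > g and r > b and
                (max(r, g, b) - min(r, g, b)) > 15)

    hits = sum(1 for r, g, b in pixels if is_skin(r, g, b))
    return hits / len(pixels)
```

A triage tool could then sort or flag pictures whose ratio exceeds some threshold, which is why the option exists as a per-artifact toggle rather than always running.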
So I’m going to choose “Next,” and here it’s asking me for the output path: where do you want IEF to create the case file and all the information about the artifacts that are found? Using this path here, IEF is going to create a folder called IEF, with today’s date and time, and then it’s going to put all the artifacts and information about processing this evidence into that folder. So I could later revisit that folder, reopen it, and view the artifacts that were found by IEF. The case information is completely optional. I could proceed without it, or I could add a case number and evidence number, my name, choose a logo for the report, and put in any notes. And then this last option, “Enable Search Alerts,” would allow me to define some keywords that, as IEF is searching, if it finds one of those keywords, it will immediately notify me by sending an alert. That way, if I were to kick off this search and then continue working – either put IEF into the background or work on another computer nearby – it would send me an alert in real time saying, “Hey, we found one of your keywords that you were looking for!”
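The search-alert behavior can be sketched as a simple keyword check that fires a callback the moment a match appears in the artifact stream; the function names and callback shape here are assumptions for illustration, not IEF's API.

```python
# A sketch of real-time search alerts: each piece of artifact text coming
# out of the search engine is checked against the examiner's keyword list,
# and a notify callback (desktop pop-up, email, etc.) fires on a match.
def make_alerter(keywords, notify):
    """Build a checker closed over a lowercased keyword list."""
    lowered = [k.lower() for k in keywords]

    def check(artifact_text: str) -> bool:
        text = artifact_text.lower()
        for kw in lowered:
            if kw in text:
                notify(kw, artifact_text)  # alert the examiner immediately
                return True
        return False

    return check
```

The closure keeps the keyword list preprocessed once, so the per-artifact cost during a long search stays low.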
Okay, from this point, I’m ready to just launch and let IEF start finding the evidence by clicking the “Find Evidence” button. It’s asking me, “You are about to start without giving any case information, is that okay?” Yes.
And what should happen is IEF should go away, and then it should come back with two windows. The first one is the status display that you see here in the upper left, and the other one here is the IEF Report Viewer, which will start to populate with artifacts as the search is running. So we can see that we’re currently searching All Files and Folders, and this progress circle here, as it’s spinning, is letting us know that it’s still going through all files and folders. Once it completes, this will turn to a check mark, and the circular status will move down to unallocated as it processes that object.
Now, as the search continues, you can see that more and more artifacts are beginning to populate the Report Viewer. This happens in near real time. There’s a Refresh button right down here in the bottom right, and you can press this at any time, which causes the Report Viewer to refresh with the latest hits. If you do not click the “Refresh” button, it refreshes automatically – right here with this option, Configure Auto Refresh, you can see that it’s automatically set to 10 seconds. So every 10 seconds, the search engine up here is sending the results over to the Report Viewer. So if you just let it go, it will refresh the Report Viewer every 10 seconds; otherwise you can trigger a refresh by manually clicking on the Refresh button.
When the search is complete, our progress window here will state “Complete,” the “Refresh” button will go away, and we’ll get a message right here in this lower pane that says, “Your search is now complete.” So we can see that we’re still searching all files and folders; we have not yet started unallocated clusters. We have some elapsed time, and then a current search progress that indicates about how much time is left for the current object.
So in the interest of time, I’m going to pause right here, I’m going to let the search complete, and then we’ll come back to this as soon as it’s done, and we’ll go through the results.
Okay, I’ve fast forwarded a couple of minutes here. You can see that the total elapsed time is indicated as 3 minutes and 36 seconds. I’ve left everything exactly as it was; here is the actual progress bar or the search status – you can see that it says “Complete”. And then down here, the “Refresh” button is now missing, and it says, “Your search has completed.”
So at this point I can close the search status. I’m going to click on “Close,” and this will go away. And then I can actually start to investigate and look at some of the search hits that have populated the IEF Report Viewer. So I’ve opened it up here, and here at the top, IEF, like in previous versions, provides some quick links to some refined results. Basically, these are shortcuts to the common artifact types that people like to look at. So cloud services are categorized here, search queries, rebuilt web pages, and then social media. These are just categories that are also included down here in their individual categories. So if I look at, say, Parsed Search Queries, we can see several here. These would also be included down here under the browser type that they used – in this case I believe they used Chrome. So these same hits that were seen here would also be down here. But this is a quick, easy category where you can look at specific types of activity – in this case I’m looking at search hits. But we can also look at rebuilt web pages that IEF was able to find and put back together, or we could look at specific social media URLs. Again, these are just a summary and provide quick access to those search hits, as opposed to going into each individual one of these categories and seeing them individually.
Okay, so we can see that in this image we have several artifacts that were found in the Chat category, we have some Gtalk hits, so we can see the username, their nickname, their local account that was used. And again, these are contacts, so that’s just showing us names of people. We have some Kik Messenger contacts, again showing their contact ID and their display name, and over here their username and then in one case here we even found a profile URL for this particular user. We have some messages from Kik Messenger. Here we have the status, and then we have the message partner, the person on the other side of the conversation, and then the status of the message itself, and then the actual text message along with its timestamp.
Here we have SMS messages, and then we have corresponding Sent and Received times, and the message type – whether it was sent or in the inbox – along with their timestamps and the numbers that the message went to. Then we have Android SMS Carved. So here you might find messages that were carved out of unallocated space, but also, since the Full search option includes carving, if we see anything that looks like an SMS message in any other type of file, we’re going to try to carve it out. And what you also find in this type of search is that you get SMS messages that are in the SQLite database journal file: in Android the SMS text is stored in the mmssms.db file, but you might also see that IEF was able to find some text messages that are in the accompanying journal file for that database, which is a separate file. So if we highlight a few of these and we take a look down here, we can see exactly where this data was found. So in this case, we have a message right here, and we actually see where this data was found. And as I continue down these rows, we can see that each one of these is a different message that was carved from a different location.
From here we can move on and see some of the other artifacts. Here we have Snapchat photo transfers, some received images, some contacts from WhatsApp, some WhatsApp messages, some carved messages, some profile pictures, and then we also have some Skype messages that were carved. So here we take a look, and we see some dates and times along with the message itself and where that data was carved from. In the cloud category we have some Dropbox URLs along with the filename that was referenced in the artifact that we found, any dates and times that we were able to extract having to do with the modified date or the displayed date, and Dropbox account information – so here we see that there was a Dropbox account referencing Joey Blowey, with this user ID and this email, along with where that data was found. In this case it was found in the preference db of the Dropbox application.
And then we move down to email. So this would be the native email client – whether they were using it for Enterprise, like Outlook, or using it to connect to an IMAP server. This is the native email client, and here we have the sender, any cc or bcc, and the actual message down here at the bottom of that information. And we also see Android Gmail, so if they have the Gmail app installed, we can see some messages here: the From address, the To address, the Sent and Received times, and the subject.
AMR files would reference any voicemails that were recovered, and we see here that there are 17 voicemail messages that have been recovered. Now, some of these may be garbled because we were only able to recover bits and pieces. We look for particular bytes that indicate that this is an AMR file, and while we might be able to recover pieces of it, it’s not always fully complete. So you’d have to listen to each one to decide if it’s relevant or audible.
Multimedia: in pictures we see the same type of viewer that works for PC and Mac evidence, and we can cycle through the various pages of pictures. And then as we move down here, we can see Android contacts – this would be the native contact manager application installed on the Android device. And then call logs – this gives you a list of calls to and from various phone numbers along with their dates and times, and any name associated with that number. And then as we move down into social media, we see here we have Android Facebook and Foursquare check-ins. Foursquare is very important because you can see that it has a latitude and a longitude associated with it, along with an address, so this would be really key if you’re trying to determine where the user was. If they were using Foursquare, then it may be possible to put them – or at least the phone – at a particular latitude and longitude at a specific date and time, which may be very crucial to the investigation.
And we also have some Foursquare locations – again, here are some dates and times along with longitude and latitude, and then the distance away from that particular location – and then we have some Foursquare searches, showing what they were searching for, and we have some Instagram and Twitter artifacts, which fall under the social networking category. And then in the final category here we have Web Related, which would be all the web URLs and browsing history. In this case they were using Chrome, so we have some bookmarks, we have some cache records, cookies, histories, and then searches. So here specifically we have some Chrome searches, looking for particular keywords on particular dates and times. And then any downloads that they may have pulled down, and where each download went to – so here we have a file that was pulled down and saved to the SD card called simpleviewer.
And then we have Google Maps, and Google Maps again shows us where the search query was – and we have beertown, beer, stuff, and then a location, a longitude and a latitude specifically. And then we have generic browser history, Browser Activity.
Down here at the bottom we have a timeline, which we introduced in v6 and which rolls forward into 6.1, which means that any phone artifacts are also plotted in a timeline fashion. So it’s loading up the timeline, and what we should see here is some activity, based on times. And if we take a look at some chat – let me zoom in here, I’m just going to make the slider go a little bit smaller – here we have our activity, and it looks like we had lots of activity in April, with less activity in the early part of the year, and last year it looks like we had a big spike in activity here, and then very little activity in the year prior to that.
So we could zoom in just by specifying a particular area, and then when we see a specific artifact like this chat, we can click on it and bring up the actual Kik message – in this case it’s a Kik message with its date and time, saying this was back in October 18, 2011, and the actual reference of the URL or the message details.
Okay, so I’m going to close the timeline, and the last thing I want to show here is what we were mentioning about Foursquare and Google Maps being able to provide some geo-location information. So if we come down here to the world map, it takes all this information that might have geo-information, as we see right here, and it plots all of it out. And we can see that right up here, we have lots of these artifacts that have information pointing to this particular area in the world. So as we zoom in here – let’s just zoom in… a little too far… zoom back out, okay – here we have our hits that belong to one of these category types, and if we zoom in a little bit more, and a little bit more, we start to see that these hits are focused right around Kitchener, Ontario, in Canada, which is the hometown of Magnet Forensics. And as we zoom in even more, we start to see these locations breaking up into specific addresses or points within the city. So we had 15 artifacts that reported this location, two, and four, and two… so this plots the geo-information that was found in these artifacts – pictures with EXIF data, the Google Map tiles, Foursquare – on a map for you, so that you can actually see where this user has been moving around.
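The grouped pins ("15 artifacts that reported this location") can be approximated by bucketing each artifact's coordinates onto a rounded grid and counting artifacts per cell; this is a stand-in for whatever clustering IEF actually performs, with illustrative names throughout.

```python
from collections import Counter

# A sketch of map-pin grouping: round each artifact's latitude/longitude
# to a fixed number of decimals so nearby points fall into the same cell,
# then count artifacts per cell. At 3 decimals a cell spans roughly 100 m.
def cluster_points(points, precision=3):
    """points: iterable of (lat, lon) floats; returns {(lat, lon): count}."""
    return Counter(
        (round(lat, precision), round(lon, precision))
        for lat, lon in points
    )
```

Lowering the precision merges pins as you zoom out; raising it splits them into individual addresses as you zoom in, which mirrors the behavior seen in the demo.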
So you can definitely see that right here in the middle we have a high concentration of activity where we have some other kind of activity, kind of on the outskirts of this center location. If we kind of zoom in some more, we can start to see that we have even more activity here, right around this block, right in here. And this is actually the location of the Magnet Forensics corporate offices, and this was some test data that was used by moving the phone around and letting those artifacts get copied into the phone, and as we extracted it, it’s able to plot those locations.
So I can close this, come back to the Report Viewer, and from here, as I’m browsing through these artifacts, I can bookmark them just like in the PC and Mac version, the IEF Standard version, which is the current version 6.0. In the same way, as I see things I can bookmark them as being important, and then later decide I want to export all these artifacts out to a particular type, or I can specifically export only the Android Google Maps, which I happen to be highlighted on at the moment, right down here. And I can just export that data out to a specific file type, CSV, HTML, whichever. And if I end up closing the report viewer, at any time I can actually navigate back to where I specified a case folder, and open this back up and continue reviewing it just as if I never closed the application, until I’ve looked at all the artifacts and determined if they’re relevant or not.
So that’s a demonstration of IEF Advanced. We’re very, very excited to release this to the public in the next few weeks, and to get some feedback and see how you like these new features. We encourage you to visit the Magnet Forensics website, magnetforensics.com. If you are a current subscriber with active SMS for IEF, you will be getting an email shortly notifying you of the release. IEF v6 also has an auto-update feature, so if your examination machine has internet connectivity, it will automatically send you a message saying there’s an update available when we release v6.1. If you do not have internet connectivity, then you will most likely get an email. And if you were not on the email address that was registered with the purchase, you can always go to the customer portal or check our website online to get the latest news about exactly when v6.1 will be released.
So thanks for tuning in. Hopefully you found the mobile version as exciting as we believe it is. And we’ll look forward to getting some of your feedback. Thanks, and take care.
End of Transcript