Julie: Hi. Hey everyone. Thanks for joining our webinar today, Exploring New Features In BlackLight 10.2. My name is Julie O’Shea, and I’m a marketing manager here at Cellebrite. Before we get started, there are a few notes that I’d like to review. We are recording this webinar today, so we will share the on-demand version after the webinar is complete. If you have any questions, please submit them in the questions window and we will answer them at the Q&A at the end of the webinar. We will be exploring the new release of BlackLight version 10.2, which we’ll be releasing very soon. If you’re a customer, you will get a release update email notifying you of this news and encouraging you to visit the My Cellebrite community portal to download the latest version. Now I’d like to introduce our speakers today. We have Sarah Davis and Stephen Villere.
Sarah has a Master of Science in Digital Forensics and Cyber Analysis from George Mason University. She also holds a Bachelor of Science in Computer Science with a minor in Criminal Justice from Gardner-Webb University. Sarah has experience working directly with the training and support of forensic software and instructing courses centered on digital forensic skills and software. She has presented at digital forensic conferences and taught lectures at George Mason University.
Stephen has been in law enforcement since 2002, and started his career in forensics in 2003. His background includes being directly involved in building two forensic units and conducting digital forensics examinations for criminal investigations of all types. He holds his Bachelor’s in Computer Information Science from Loyola University in New Orleans, a CFC certification, and his TCFA certification. He lectures with Loyola University, University of New Orleans, and with the American Society of Crime Laboratory Directors’ Train the Director series. Thank you for joining us today. Sarah and Stephen, if you are both ready, I will hand it over to you now, Sarah, so you can get started.
Sarah: Thank you. So we’re going to start out with tagging, then we will move on to OCR, and then I will pass it over to Stephen to talk about timeline.
So to start with tagging, there have been a couple of different updates in BlackLight 10.2 having to do with tagging. First of all, tagging is now faster than it’s ever been before in BlackLight. You have the ability to tag thousands of items at one time, and larger tags — say, of thousands of items — are moved to a background process so that the UI is freed up for the user. So whenever you tag those thousands of items, you can still navigate around BlackLight while that tag is handled as a process. And we will show that in just a little bit.
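The idea of moving a large tag job off the UI thread can be sketched in Python. This is an illustrative pattern only, not BlackLight’s actual implementation; the `TagJob` class and the tag name are hypothetical.

```python
import threading

# Illustrative sketch (not BlackLight's implementation): a large tag job
# runs on a worker thread so the UI thread stays responsive.
class TagJob:
    def __init__(self, tag_name, item_ids):
        self.tag_name = tag_name
        self.item_ids = item_ids
        self.tagged = []                 # results accumulate here
        self.done = threading.Event()    # the "process pie" completing

    def start(self):
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        for item_id in self.item_ids:
            # Simulate the per-item tagging work.
            self.tagged.append((item_id, self.tag_name))
        self.done.set()                  # signal completion

job = TagJob("pictures with geolocation", range(762))
job.start()
# The UI thread would be free to do other work here while tagging proceeds.
job.done.wait()
print(len(job.tagged))  # 762
```

Cancelling the application before `done` is set would correspond to the “close case” warning: the in-flight results are simply discarded.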
There are new hot key commands for tagging within BlackLight. So there are a couple of new hot keys, so you can tag without having to right click and hit ‘tag’ and everything like that.
There’s more user control over the metadata fields within your tags. So you can toggle on and off what pieces of metadata you want to include or exclude for your tags. You can also customize the order of the tags in your report. So in your report section within BlackLight, you can kind of customize the order in which you want your tags to appear.
So, talking about those hot keys: if you’re in BlackLight and you have a file, a piece of media, or a photo selected, you can hit command+T and that’ll create a new tag. You’ll get the pop-up window that says ‘name your tag.’ You can enter the name and hit ‘enter,’ and it’ll save that tag for you. After you create that tag, you can hit command+T again and whatever item you’re on will be added to that tag. So say you’re on photos, you create a new tag, and then you go to another photo and hit command+T again — it’ll add that photo to the most recent tag.
You can also hit command+shift+T, and rather than adding to your last tag like command+T does, that will create a new tag. So you’ll get that pop-up window again that says ‘name your new tag,’ and that’s how you create a new tag. So let’s actually go into BlackLight and look at some of these.
So let me share my screen. So, in BlackLight, I have a filter set currently, just so we can show a couple things, but let’s say you want to tag all of these photos. I have a filter set right now to show all of the pictures within our media view that have geolocation data associated with them. So if I select all of these — select all the photos — you can see down here in the bottom, if I zoom in, you can see that there’s 762 photos. So I want to tag all of these. So I’m going to right click and hit ‘tag media as,’ and I’m going to create a new tag. So this is that little pop-up window I was talking about, when you create a new tag. So I’m just going to label this ‘pictures with geolocation.’
Alright. So when I hit ‘OK’ — I’ll hit it in just a second — over here on the left-hand side, under tags, you will see this new tag appear, along with a little process indicator that kind of looks like a pie chart. So I’m going to hit ‘OK.’ And you can see that little process pie pop up over here on the left-hand side. That means that this is now processing, and you can see it completing. If you try to hit ‘X’ — so you try to get out of BlackLight, you try to close it down — you will get this pop-up window that says there are items still being processed for tagging.
Closing the case will lose all tags currently being processed. So if you hit ‘close case,’ you will lose what you’re tagging. If you hit ‘cancel,’ you’ll save your tags. But just know, if you hit ‘close case,’ all of these tags will be gone and they won’t be saved. So I’m going to hit ‘cancel,’ and you can see all of our items are now processed. So you can see all 762 of those items that we just tagged are now tagged over here. And we can go through those.
Another thing that is new — and I’m going to use the downloads tag that I have set here — is this ‘configure’ button up here in the right-hand corner. If you hit configure, this is a new window within BlackLight. So this is basically where you decide what you want to include or exclude from the metadata having to do with your tags. First, you have the tag name. So I’m in my downloads tag, and I’m going to use it just for the example, but all your other tags would show up here as well.
You have the tag type. So you can see, I have two types here: whole file tag and picture tag. Let’s say you tagged some hex somewhere — that would also show up here, as a hex tag. But for right now, I’m just going to use picture tag.
So you can toggle on and off kind of what you want to include for the metadata. So let’s just move a couple. So let’s say we want to exclude this. All you do is click on the item and hit this arrow that’s pointing to the right, and it moves it over. If you accidentally exclude something that you want to include, all you have to do is click it and then hit the left arrow, and it puts it back into including it.
You can also drag to reorder what you’re including in your metadata. So let’s say you want date accessed first, you want to see it above date created. You can move it and modify this to where you show what order you want. Let’s say you want file system ID to be the first thing you see. You can move it up to the top. So you can kind of drag and drop where you want things to be displayed.
Another thing you can do is copy the settings to other tags. So if I want to copy the settings to all those 762 photos that we just tagged, I can hit ‘copy these settings to pictures with geolocation,’ and it’ll take a second. And then you’ll get the pop-up that says ‘These settings were copied to pictures with geolocation.’ If you start messing with this and you want to go back to how everything was originally set up, you can hit the reset button and everything will go back to its original form. So it’s kind of your safety net if you start messing with everything.
Let me close this window. Another thing to know — let me zoom in — is this bottom little line in all of your tags. You can see it says 28 of 32 additional metadata selected for reporting. That means that, when we moved some of those metadata fields over to exclude, only 28 of the 32 possible metadata fields are still showing, because we excluded some when we hit the configure button and moved things over to exclude.
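The include/exclude and reorder behavior of the configure window can be modeled with a small sketch. The `TagConfig` class and the field names are hypothetical stand-ins, not BlackLight’s API.

```python
# Hypothetical model of the 'configure' window: two lists of metadata
# fields, with move, reorder, and a reporting summary line.
class TagConfig:
    def __init__(self, fields):
        self.included = list(fields)
        self.excluded = []

    def exclude(self, field):        # the right-arrow button
        self.included.remove(field)
        self.excluded.append(field)

    def include(self, field):        # the left-arrow button
        self.excluded.remove(field)
        self.included.append(field)

    def move_to_top(self, field):    # drag-and-drop reorder
        self.included.remove(field)
        self.included.insert(0, field)

    def summary(self, total):
        return (f"{len(self.included)} of {total} "
                "additional metadata selected for reporting")

cfg = TagConfig(["date created", "date accessed", "file system ID", "file size"])
cfg.exclude("file size")             # moves it to the excluded column
cfg.move_to_top("file system ID")    # now displayed first
print(cfg.included[0])               # file system ID
print(cfg.summary(4))                # 3 of 4 additional metadata selected for reporting
```

A reset would simply rebuild `TagConfig` from the original field list, which is all the safety-net button does conceptually.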
Alright. So let me stop sharing my screen. And we can get over to OCR.
So the next section we have is OCR. OCR stands for ‘optical character recognition,’ and it’s basically used to recognize text within photos and scanned documents, like PDFs. So it’ll take your PDFs, your scanned documents, your photos — all of your image files — and make them searchable. So you’re able to search for things within those files.
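As a rough sketch of what “making image files searchable” means: once OCR has extracted text from each file, even a trivial index can answer keyword queries. The file names and extracted text below are hypothetical examples, not data from the case.

```python
# Hypothetical OCR output: extracted text keyed by source file.
ocr_text = {
    "racerparts.pdf": "Racer Parts replacement emblems loads in stock",
    "photo001.jpg": "speed limit 55",
}

def search(term):
    """Return the files whose OCR-extracted text contains the term."""
    term = term.lower()
    return [name for name, text in ocr_text.items() if term in text.lower()]

print(search("replacement emblems"))  # ['racerparts.pdf']
```

The actual extraction step (image in, text out) is what the OCR processing option performs; everything after that is ordinary text search.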
Alright. So let’s go over a few examples of how you can look at this OCR image text within BlackLight.
Alright. So to start with, let’s say you’re in BlackLight, and you’re in browser view. So I’m in my browser view, and let’s say you found a PDF. The PDF I have selected here is racerparts.PDF. If you select that PDF and go down into the strings tab, this is where you’re going to find that OCR image text. If you hit the strings tab and you scroll down, you’re going to see this banner that says OCR image text. You can scroll down and look for it, or you can hit command+F and use that find function to search for OCR image text. So if it’s a big document — a couple-hundred-page PDF — and you don’t want to scroll through all of it, you can just hit command+F and use that find function to look for that OCR image text banner.
So you can see over here on the left-hand side, underneath OCR image text, you can see the different things that were pulled from this PDF. So I put a little screenshot of what the PDF preview looked like. So you can see ‘racer parts’ was pulled out, the phone numbers, the term ‘replacement emblems,’ and then you can see the information like loads in stock, and more coming in daily. You can see all that stuff was pulled out. So that’s just looking at this OCR image text within your browser view, kind of just clicking on a PDF.
The next thing that you can do is use a file filter. So now under file filter, we have a filter for OCR image text. What you would do is click on ‘file filter,’ and then in this dropdown menu, OCR image text will be at the very bottom. All you do is select it, and then you have three different options: only files with OCR image text; only files without OCR image text; and files with or without OCR image text. You select whatever you want and then hit filter. You can see I filtered for only files with OCR image text, and then we see that racer parts PDF again. So that’s using our file filter.
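The three filter options behave like a simple predicate over whether a file has any OCR text. A minimal sketch, with hypothetical file records:

```python
# Hypothetical file records; ocr_text is None when no text was extracted.
files = [
    {"name": "racerparts.pdf", "ocr_text": "replacement emblems"},
    {"name": "notes.txt", "ocr_text": None},
]

def filter_ocr(files, mode):
    """mode is 'with', 'without', or 'either', mirroring the three options."""
    if mode == "with":
        return [f for f in files if f["ocr_text"]]
    if mode == "without":
        return [f for f in files if not f["ocr_text"]]
    return list(files)  # 'either': no OCR-based filtering applied

print([f["name"] for f in filter_ocr(files, "with")])  # ['racerparts.pdf']
```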
The next way is using index searching. So what you can do is in BlackLight, over on the left-hand side, you can see index searches. All you do is hit the green ‘add’ button, and this window will pop up for you. You’ll need to name your query and then enter in what you’re looking for. I’m searching for ‘replacement emblems,’ because I know that was a term we saw earlier in that racer parts PDF.
So I said ‘replacement emblems,’ and I hit ‘run query.’ When you hit ‘run query,’ this is what you get. So over here on the right-hand side, you can see under type, it says OCR text. So this means that when you ran the query, the type was OCR text. You can see the racer parts PDF that we’ve looked at a couple of times now, and down here at the bottom, ‘replacement emblems’ is highlighted in yellow. Whatever you enter here to query for is what’s going to be highlighted in yellow for you. So ‘replacement emblems’ is what’s highlighted. So that’s using our index search.
The last way to look at this OCR image text is to use our content search. So doing a content search, you start kind of the same way that you did with index searching, by going over here on the left-hand side and clicking the green ‘add’ button, you can click the green ‘add’ button and you’ll have a new search. You can name your search. You also decide what partition you want to search through. I just selected the data partition for this case. And then you enter your keywords. I did ‘replacement emblems’ again because it’s something we’re familiar with, we’ve already looked for it.
The most important part is to make sure that you select ‘deep search’ over here on the right-hand side. You can also narrow things down more over here, but this is all I selected for this. Then you would hit ‘start search,’ and this is what you would get. So your results pop up in this pane right here. You can see we got seven results, including that racerparts.PDF that we’ve been looking at.
Another thing to notice is that right here, it says ‘deep search hit’ and you get ‘yes, yes.’ So that means that because we had deep search selected, we did get hits when it did the deep search. So you can see ‘replacement emblems,’ which was the keyword that I entered — they’re both highlighted in orange over here.
Similarly to how we’ve looked at it in other views, you go into the ‘strings’ tab down here at the bottom, and you see that OCR image text banner yet again. So this is where you see all of your OCR image text.
So that’s four different ways in BlackLight that you can look at this OCR image text data: using content searching; index searching; using our file filter; and just using the browser view and looking at different PDFs and images. One thing to remember is that, when you are looking for this stuff, make sure that you have processed for OCR image text; and if you’re going to be doing index searches, make sure that you’ve processed for both OCR image text and indexing.
Next, we have BlackLight timeline and I will pass it to Stephen.
Stephen: Thank you, Sarah. Thank you, Julie, for that introduction. Hope everyone out there is doing well today, wherever you are. And thank you, Sarah, for showing us some of the great features to come, like tagging, and some of the enhancements that were made there. And also the feature of OCR that’ll be brought into BlackLight now, I’m sure some of our users will appreciate that feature.
So like I said, my name’s Stephen Villere, I’m a forensic engineer with BlackBag. And today I’m going to talk to you about the Cellebrite BlackLight timeline feature that will now be brought into the tool. For our long-time users, you may have seen this feature in our tool a few versions ago. It was actually taken out of the tool and completely revamped, and we have brand new stuff to show you in reference to the timeline feature.
So the first thing I’m going to do today is briefly go over the overview of the user interface, because it has a couple of helper features that you will have to get used to, since it may have been a long time since you utilized any of these types of things within the BlackLight user interface for timeline.
So we’re going to talk about those things specifically, because they’re going to help our examiners actually go through and get the best use, and make the most use, out of this new feature.
The next thing we’ll talk about is: if you were provided a date and time, or if you were going through some artifacts, found a date and time of interest, and wanted to go further from there, how might that look going into the user interface, looking at timeline, and then going through the other sections of BlackLight? We like to call that a pivot point, or a starting point, in our investigation.
So I’m going to share my screen, so you all can see my example case here. Let me pull that up shortly.
Alright. You should be able to see my screen now. So what I have here is an example case with one of our images that we generally use in a lot of our training classes. So anybody that’s gone through our training courses, you may have seen this image before. And what we did was, we brought this image in and we processed it in this newer version of BlackLight. So we see here on the left-hand side, we have our component list, which will show our racer data active, and our bootcamp active. So we have a Mac OS entry here and a Windows entry here. As we go through this data, take a note that we’re actually going through multiple pieces at one time. So we’re looking at the Mac OS entry and the Windows entry.
With our screen in the middle here, this is our timeline view. So this is our new setup for our timeline view. We have some nice colors going on here — hopefully most of you can take advantage of those colors; if you cannot, there will be a couple of different shades going on here. But on top of the colors, we also have some visual bars that show activity, right? So the higher the bar, the more activity and date-and-time entries there were around that timeframe.
So the first thing I would like to do is show you all: timeline is going to be located in the top left-hand side of your user interface for BlackLight. So you will see it here right next to ‘case’ info. So once we have our items checked, we would click on ‘timeline.’ Then we would like to show you the helper icon that we have placed here.
So this is going to be a question mark circular button. As you click through this icon, it will help you as a first time user, or a user getting used to this timeline feature. I’m going to click through each one of these, and it’s going to highlight certain sections for you all, so that we can then go through it and show it to you in live form.
So when I click on it the first time, it says I can continue to click multiple times for additional hints. I just continue to click on the question mark, and it’ll actually visually show me where some of my user interface buttons are. So if I were to click on either of these arrows, what it will do is, it’ll move that top bar over to the left or right. This is how you scale. So you can scale through the years, or if you’re zoomed very far in, it’ll scale through the hours.
Next, you have the plus and minus buttons here. Both of these are pretty self-explanatory: if I click on the plus, it’s going to zoom in; if I click on the minus, it’s going to zoom out.
This scale bar here can also be selected and then moved, so we can click it and drag it seamlessly across to the left or right. We can also scroll while we are focused on the scale bar. So the user interface here is kind of fluid, right? It’s going to be very fluid with being able to go in and out of the data and change our focus very quickly, which makes it very nice to be able to use this feature now.
We also have our click and drag to select the ‘range’ option. So in the middle section where you see all of your activity bars, you can actually click, select, slide it over, it’ll highlight in yellow that section that you are selecting at the time, and then it’ll zoom into that section and focus in on it. So it makes it very easy to get close, and get into the data that you want to focus in on.
On the left-hand side, you will note that there is a row of items here, broken down into a couple of different categories. The categories are pretty self-explanatory, but there could be variations in these categories that you want to be aware of as an examiner. As you click on, or select, these different rows, that is going to change your view. And that’s very important, because if you click on something, you’re no longer seeing everything else; but it also allows you to quickly focus in on one particular category, or one type of artifact.
Lastly, we’re going to have some actual numbers, right? Some tallying numbers here. What this represents is all of the dates and times for any of the entries that are under those categories. It’s an interesting number, because you may have a lot more entries here than you have files appearing in the middle window. And that’s basically because we have, or we have the possibility of having, multiple dates for one artifact. So with that, if we have four dates per artifact and only about five files, we could have about 20 entries there. So there can be some variation, but that’s how we are getting to these numbers here.
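The arithmetic behind those tally numbers is straightforward: the counts are timestamps, not files. A minimal sketch with illustrative numbers:

```python
# Why timeline entry counts can exceed file counts: each artifact may
# carry several timestamps. These counts are illustrative, not case data.
files = 5
dates_per_artifact = 4   # e.g. created, modified, accessed, changed
entries = files * dates_per_artifact
print(entries)           # 20 timeline entries from only 5 files
```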
So that completes going through our question mark here, just to give you a quick overview of that interface. The only other helper you’ll see — but it won’t come from clicking on the question mark — is if you were to focus into a range that doesn’t have any activity. So if we scan over and zoom into nothing — there are no entries here, because we’re now out into the future, right? — it will actually pop up and help you as a user: it’ll show ‘zoom out to see the counts,’ pointing to these two arrows here.
This one button here is kind of like the start over button. So you’re going to want to get used to going through and focusing in and zooming in and out. And you may get to a point where you’re kind of feeling lost and you want to start back over. So you’ll then click on that button. And when you click on that button, it’ll bring you all the way back out. And anything you have selected on the left hand side will then be showing up for your entries here.
So that’s just a little bit about that user interface and how you can utilize it. So I’m going to show you: if I use my cursor, okay, and I hover over that scale, I can click and select and simply drag it from left to right. Notice that once I go so far to the right, it will actually start dropping off the artifacts, and my numbers will change. And that’s because my focus has changed, and it’s fluidly filtering out those date-and-time values based on what you’re focused in on.
So additionally, if I were to zoom in to 2019, I can zoom in, I can drag over 2019 and just keep zooming in until I’m looking at pretty much all of 2019 here. If I then click and drag my scale over, you can see how my numbers are immediately changing. So it’s a very fluid interface, which makes it extremely nice to work with. I want to get back to start over. So I want to click that and get back to my view here.
So the first thing that we’re going to want to show you here is how you can also scroll up and down right here, to see all the variations of categories that exist over here on the left-hand side. So you’ll see a lot of entries here. The other interesting thing is what happens if an entry doesn’t exist. So if there isn’t an entry for a note, that category will just drop off and not appear. So the more you focus in, and the less activity that was going on during that time, only the relevant categories will appear here in that fluid view.
Now, also with that said, I want to zoom in, and I want to start looking at some information. So if I was provided a date and time, or if it’s a brand new examination and I just kind of want to get to a starting point, we’re going to walk through how that may look in this new timeline interface.
So what I’m going to do is leave my Windows bootcamp volume checked, because that’s the information I want to look at, and uncheck my Mac OS volume here. So if I scroll down now, I still have a lot of entries here — still a lot of good information — but I want to zoom in and see what the user’s activity looked like. So if I go all the way down and get to the bottom here, I can click on a section called ‘user accounts.’
So I’ll click on this section. Notice my bar — I’m selected now in my bar — and I get two entries here. The two entries that appear have two different dates: a 2010 date and a 2019 date, which is going to be November 5th. This is going to be our last login for some user account. Currently in this view, we’re seeing some limited data, because timeline is trying to render that information very quickly to enable that fluid view. So this is a starting point for me. As an examiner, I want to see when that user last logged in, and we have one of those entries here.
Now, how would I go into the next part? What am I going to go look at next? Inside the tool we can right click on this entry, and then we can do ‘reveal,’ and we can reveal that item in what they’re calling the native view.
In other words, this information was parsed out at some point in time, in our case. Let’s go look at where that information was parsed out from. So if I click on ‘reveal the item in native view,’ and then I click on ‘select device’ because it wants to go to this volume’s selected device, it will bring me to actionable intel — so this is another section in our tool that has parsed out data. It’ll also bring me to that artifacts entry.
So what I’m looking at here is a Windows user login. The user’s name is Josh. I have a user ID, and I can scroll over here and get some additional information about that. The date that we had seen from the timeline view was the last login date and time, which was November 5th, but we also have the last password change date, the last failed login date, and a login count. So we’ve got some other good information here about that entry. We are pivoting from timeline view over to looking at that artifact in native view, to get additional information about it that might be located in the tool. We can easily do that with ‘reveal in native view.’
So I found some more information about it. I also now have what? As an examiner, I have the last time the user logged in. So this may be useful for me because it could give me a starting point. I can go then look around this timeframe, and see what other artifacts I can find. So I’m going to click back over to my timeline. Okay.
Now notice when I clicked back over, if I just click the button here and I just click back over to timeline, it brings me back to exactly where I was, but what if I wanted to focus in on that date area, that November 5th date? So I can simply go back to that entry. So I’m here at Windows user Josh, and I can actually pivot from this entry also into timeline. So if I go over here to my last login date, and I select that November 5th date, and I right click on it, I can do ‘reveal’ and I can reveal this timeframe in timeline.
Now, an important note here for the user interface: if I were to go over to a different section of this artifact — let’s say I go over to Josh and I click ‘reveal’ — I won’t have the option for revealing the date in timeline, because it’s not a date, right? We’re on the username of Josh. So that’s an important note, because right here I have a last login date, and right here I have a last password change. So it needs to know which area I want to pivot from, and in order to do that, it only gives you the option when you’re on a date and time.
So I click ‘reveal’ and then I’ll click ‘reveal in timeline.’ Now take note: it doesn’t immediately look different, but if you look closer at the scale up at the top, I’m no longer looking at that entire large range that I was on before. I am now focused in around that November 5th date. We’re actually looking at the day here, and then at our hour breakdown within the day. What this also did was change my numbers over here on the left-hand side. They should be much smaller at this point, because I’m going to have fewer artifacts in a smaller period of time.
Additionally, I have all of my bar graphs here. So notice in the beginning of November 5th, it doesn’t look like a lot of activity was going on. We got a sprinkle here, but it looks like some of that activity started going on around 16:00 and up to about 19:00. So I’m going to focus over on just those bars. So I’m going to slide that over and I’m going to get that focus even better here.
So now I’m into 16:20 over to about 19:00. That’s what I’ve focused in on and scaled into. Now, on my left-hand side, I only have the entries that are relevant. So what we’re seeing here is all of the artifacts that we can see based on our timeline focus. From here, I still have the artifact that I started with, that user account last login. But additionally, from here, I can go look at every file that had any kind of changed, modified, or created date and time in this timeframe. Now, it’s still a lot of files, right? A lot of activity going on, even in that short period of time.
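Narrowing the timeline focus is effectively filtering entries to a timestamp window. A minimal sketch, with hypothetical events standing in for the case data:

```python
from datetime import datetime

# Hypothetical timeline entries: (label, timestamp) pairs.
events = [
    ("user login", datetime(2019, 11, 5, 16, 45)),
    ("file modified", datetime(2019, 11, 5, 18, 12)),
    ("file created", datetime(2019, 11, 4, 9, 0)),
]

def focus(events, start, end):
    """Keep only entries whose timestamp falls inside the focus window."""
    return [name for name, ts in events if start <= ts <= end]

window = focus(events,
               datetime(2019, 11, 5, 16, 20),
               datetime(2019, 11, 5, 19, 0))
print(window)  # ['user login', 'file modified']
```

Widening the window back out (the start-over button) just re-runs the filter over the full date range, which is why the category counts update fluidly.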
But let’s say I just want to look at notifications. I could click on the notifications category, come over here, open up a couple of these entries, and start perusing through those notifications. So I can see I have a OneNote notification and an Xbox app notification. So I can go through and look at some of those entries there. I could keep going down: I could go to my activities cache, I could go to my BAM or DAM, and I could also open up my Amcache and specifically look at some of the entries under there, which could be extremely useful depending on the type of case that you’re working. And close that up.
And I could also go to device connections. So I’m actually looking at last connection times and dates for devices. So that could be extremely useful for me. So what if I want to find out more information about that device? Well, I could do the same thing I did before, I could pivot over, or I could take a tag of that entry. So we notice here I’ve already done that prior for the webinar, and I could go look at that and I can see, I now have my tags for those devices that were connected here, and I can get additional information about them. They’re already in my tags.
And like Sarah said, now I can go in here, see some of the information, and also go up and configure some of this further, from here on out, with that new feature. I’m going to go back over to my timeline, and I’m also going to look at some jump list entries. Jump list entries are great artifacts on the Windows side. It’s also a fantastic artifact to look at in timeline, because notice I’m at a short period of time here — just about less than a day, half a day or so.
From this fluid view here, I can actually start scaling back, so I can either come up to here and I can start scrolling with my mouse, or I can start clicking on the zoom out. As I continue to click on the zoom out, as soon as I get far enough out and there’s some more activity that occurs, I will actually have other artifacts that will pop into play here. So I can keep going with this and I can get down to another activity period where things start to pop in. So notice we’ve got something in October that’s now popped in, and I have additional entries here. So now I’ve got a whole list of entries from jump lists.
The other good thing here is that the apps I’m looking at are not just one particular app. Right now, I’m looking at all of the entries. So this view provides that holistic view for us, and we can look at a bunch of items at one time.
Lastly, we just have maybe a few more minutes left, so I want to finish up on timeline with just one more thing. And then I’m going to go over to the next feature that we would like to just touch on briefly, before we get to the end of our webinar.
So the last thing I would like to show you all is that this works on either side, right? It works on Windows. It works on Mac. Okay. So I can uncheck my Windows side, or my Windows volume, I can check on my Mac side, and then I can come here and start over, OK? So I can start over and look at all my entries here.
One of the most interesting things, or fairly common things, people like to look at now is AirDrop-related data. So now I’m on my Mac side, I’m looking at my AirDrop downloads entries. So from my AirDrop downloads entries, I can quickly pivot, right? So I can go into here, I can look at my AirDrop downloads, I can see names of files that were being exchanged, I can see participants. And of course, like before, I could pivot from here, go to that native view if I would like, or I could go and create a really quick filter. Let’s say I wanted to find these files, right?
I could go over to our file filter. Now I have a pre-saved filter here, but you could do this very quickly: put in your information, do a filter. And now I’ve pivoted over and I’ve actually found those files from timeline, right? Because those entries I was looking at were not the actual files. Those were the entries for the AirDrops. So now I have the actual files themselves, or what could be the files themselves. I may need to go verify some information there. But from timeline, I quickly went to the Mac side, looked at my AirDrop downloads, found some information, and pivoted away from that. And this is just one more thing that you can think about.
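The pivot described here, going from AirDrop log entries to the files themselves, amounts to filtering the file system by the file names seen in the AirDrop records. A minimal Python sketch of that idea follows; the record layouts are hypothetical, since BlackLight’s data model is not exposed:

```python
# Hypothetical parsed AirDrop entries and file listing; illustrative only.
airdrop_entries = [
    {"file_name": "vacation.jpg", "participant": "Alice"},
    {"file_name": "notes.pdf", "participant": "Bob"},
]
file_system = [
    {"path": "/Users/demo/Downloads/vacation.jpg"},
    {"path": "/Users/demo/Downloads/report.docx"},
    {"path": "/Users/demo/Downloads/notes.pdf"},
]

# Build the set of names seen in AirDrop entries, then filter files by name.
wanted = {e["file_name"] for e in airdrop_entries}
matches = [f for f in file_system if f["path"].rsplit("/", 1)[-1] in wanted]
print([m["path"] for m in matches])
```

As Stephen notes, a name match is only a lead: the examiner still needs to verify that a matched file is actually the one that was transferred.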
Also, you can of course select multiple entries at one time. So I can click on, let’s say, my active macOS volume, my snapshot from that volume, my Windows volume, and a volume shadow copy, or multiple volume shadow copies, that exist here. And then in my timeline view, I want to start over again. I can go back to the beginning, and I can see a ton of data here — a ton of data from all of those selected items.
Alright. So that wraps up a little bit of our timeline feature. Hopefully that gave you a very nice overview or introduction to it. I’m going to then move back over to my slides so that you can see the next thing we want to talk about. The slides for this will be very quick.
So, the last thing we’re going to talk about is going to be the DAR extraction file. All right. So for those of you that are able to get DAR files from your extractions, we want to be able to take these files and bring them into BlackLight. We are bringing this feature into BlackLight with this version, and we’re simply just going to talk about this as an introduction. So we’re going to talk about Cellebrite BlackLight ingestion of DAR files. I have a pre-created and pre-processed version of a DAR file. And then I’m going to visually show you what that looks like if you were to bring that into a brand new case, or a brand new processing for that case. So let me drop this screen real quick and open up my screen sharing again.
Alright. So now I’m going to bring over my processed case here. Okay. So what you’re looking at here is a DAR file that was brought into our case. You can see, over here on the right-hand side near the top, that you’re looking at a full file system DAR file. Now, if we had an additional DAR file that we wanted to bring into our case, we could do that by simply dragging and dropping that file in. So we would drag it over into evidence, or we could select add, and then browse to the location of where that DAR file is located.
So you’ll see our ‘add evidence’ ingestion window come up here. Once that comes up, we will have the various options that already exist for the processing that we can do as we’re bringing in different devices into a BlackLight case. So you could go through, check off some things here, add them as needed, or you could just go ahead and hit ‘start’ and process additional stuff later. So we would hit ‘start’ on that, and we would bring that in.
Once that processing finished, right, we could come over and actually see the phone in our case. And that’s what we’re looking at here. So once it’s brought in, it’s going to look like, and it’s going to feel like, any other evidence that you would have brought into that BlackLight case. So you can go over to the browser and you can browse the information from that phone, from that DAR. You can, of course, go look at timeline and you can see timeline information about that DAR in here.
Additionally, we don’t want to forget to mention that even with your full file system extractions, or your file system type extractions, we’re now parsing out, or being able to look at, those unified logs here. And in our system logs section, under the unified logs, if we put in a quick filter for AirDrop, we can see a bunch of our entries here from that iOS device’s file system extraction. So we can get a ton of data related to AirDrop here from those unified logs.
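The quick filter described here is effectively a case-insensitive keyword match over decoded unified log entries. A minimal Python sketch of that idea follows; note the log lines are made up for illustration, and real unified logs are binary `tracev3` files that a tool such as BlackLight (or `log show` on a live Mac) must decode before any text filtering can happen:

```python
# Toy decoded unified-log lines; hypothetical content, illustrative only.
log_lines = [
    "2021-03-01 10:02:11 sharingd: AirDrop: received file vacation.jpg",
    "2021-03-01 10:02:12 kernel: wl0: link up",
    "2021-03-01 10:05:40 sharingd: AirDrop: transfer complete",
]

# Case-insensitive keyword filter, like typing "AirDrop" into the filter box.
hits = [line for line in log_lines if "airdrop" in line.lower()]
print(len(hits))  # 2
```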
So that’s going to wrap it up for me today. I’m going to go ahead and switch off and turn my screen sharing off, get back to the slides, which I believe is just going to be the thank you slide. Give me one second. Yeah.
So, thank you for attending our webinar today. Hopefully everybody enjoyed going over these new features that we wanted to highlight, and that we’re excited about coming out in Cellebrite’s BlackLight 10.2. I’m going to hand it back over to Julie. If any questions came in during the webinar, Julie, feel free to ask them at this time.
Julie: Great. Thank you guys. We have had a few questions come in, so let’s go over a few of those. And we will start with: does the tagging view not show all the items that are in the tag?
Sarah: So in the tagging view, do you guys remember earlier when I said there were, like, 23 of 26 showing? That’s where you see that. So not all of them. There won’t be every metadata field for every single file. So you’ll see, whether you toggle them on or off, or you include or exclude certain pieces of metadata, it’ll tell you there are 26 metadata fields available for this file, and we’re only showing you 23 because you’ve excluded three of them, or something like that.
Julie: Thanks, Sarah. How about: does timeline work across any and all evidence items in the case?
Stephen: I’ll answer that one. Yes, absolutely. It does. As I showed you, you can select the different entries that have already been brought in and processed, and you can also select multiple entries at one time to be able to look at that timeline view across all of those items for your dates and times.
Julie: Thanks, Stephen. Let’s see. How about… this is a good one. Can I bring in multiple DAR extractions to one case, and can I search or filter across all of them?
Stephen: Absolutely. Good question. And yes, you can. You’ll be able to bring in, just like you would be able to bring in multiple images, into a BlackLight case, you can bring in multiple DARs. So if you had three or four iOS devices, you can bring them in and you can start going through each one of them. And then filtering is a feature of the tool itself. So filtering would work across whatever you have selected. So yeah, you could filter a broad spectrum across all of those devices in the case.
Julie: Thank you. And, similar question about extractions: Can I add different types of extractions from the same device to the same case?
Stephen: Yes, good question. I’m pretty sure I know what they’re getting at here. And yeah, that ability is in — or will be in — this version of BlackLight. You could bring in, for example, let’s say you had a BFU extraction from a device and later you got an AFU or a full file system from that device, and you wanted to compare the two or look at the information all under one umbrella or one case. You can absolutely do that now, bringing that into the case, all in one.
Julie: That makes sense. Thanks, Stephen. And this is another good one. What types of files does OCR work on?
Sarah: Yeah, I’ll take that one. So I know I mentioned earlier that it’ll work on PDFs, like scanned documents, and photos, but it would really work on any type of image file.
Julie: Thanks for clarifying that. And I think we’ll have time for one more today. So let’s go with: what dates does timeline use?
Stephen: I’ll take that one. So timeline right now, currently, will actually go across the four major file dates. So it’ll go across, you know, the created, the modified, the accessed, the changed: those major four. And then it will also use any dates and times from artifacts: things that have been parsed out through the tool. So you could have things like, as you saw, the last login date, which is not necessarily a file date and time. But that artifact of the last login date and time gets processed through that timeline, so you can see that.
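Stephen’s description amounts to merging two timestamp sources, the four file dates plus parsed artifact dates, into one chronologically sorted stream. A minimal Python sketch of that merge follows; the record layouts are hypothetical and not BlackLight’s internals:

```python
from datetime import datetime

# Hypothetical inputs, illustrative only: one file with its four major dates,
# and one parsed artifact (a last-login event) with its own timestamp.
files = [
    {"name": "report.docx",
     "created": datetime(2021, 1, 4, 9, 0),
     "modified": datetime(2021, 1, 6, 15, 30),
     "accessed": datetime(2021, 1, 7, 8, 5),
     "changed": datetime(2021, 1, 6, 15, 30)},
]
artifacts = [
    {"kind": "last login", "timestamp": datetime(2021, 1, 7, 7, 58)},
]

# Flatten every date into (timestamp, label) pairs, then sort chronologically.
timeline = []
for f in files:
    for field in ("created", "modified", "accessed", "changed"):
        timeline.append((f[field], f"{f['name']} ({field})"))
for a in artifacts:
    timeline.append((a["timestamp"], a["kind"]))

timeline.sort()
for ts, label in timeline:
    print(ts.isoformat(), label)
```

The artifact’s last-login event slots in between the file’s modified and accessed times, which is exactly the benefit Stephen describes: file activity and parsed artifacts appear on one shared timeline.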
Julie: Great. Thanks, Stephen, and thanks, Sarah. So we have had a few more questions come in that we didn’t get a chance to answer, but we will reach out to everyone individually after the webinar to answer those questions. So Stephen and Sarah will be in touch. Stay tuned for that.
Also, if you’d like to learn more about BlackLight, you can click the BlackLight logo in your webinar console to visit our BlackLight product overview page. And as we mentioned in the beginning, to find all the information on our latest release of BlackLight, make sure to visit the My Cellebrite community portal.
Thanks again, Sarah and Stephen, and thanks for walking us through how tagging, OCR, and timeline can help enhance investigations. Thanks everyone for joining us today, and hope everyone has a great day.