Deepfake Videos And Altered Images – A Challenge For Digital Forensics?

Si: Welcome friends and enemies to the Forensic Focus podcast. Desi and I are back. We’re gonna say Happy New Year to each other because we haven’t actually spoken in the better part of…a little while. Even though you’ve had a podcast come out, that was a pre-recorded one. Desi’s been moving.

Desi: Yeah, that’s right.

Si: And as you can see, there’s not huge amounts of piles of boxes behind him. So he has successfully been moving or they’re all on the other side of this camera.


Desi: Yeah, I now have two extra bedrooms to put boxes into, so that’s where they are.

Si: Ah, excellent. Good stuff. So it all went smoothly? All is good? And you’re enjoying the new time zone?

Desi: All is good. Enjoying the new time zone with no daylight savings, and the weather is much nicer. So, well, it’s summer for us, so it’s usually nice anyway. But I think every day’s been over 30 degrees Celsius here.

Si: That’s pushing it, I mean…that’s the wrong side of nice for me. I mean, up to about 28 is okay. 30 and above is starting to be a little much.

Desi: But that’s because you’re from the UK. Like, bad for us is like 50.

Si: Yeah. To be fair…your temperature maximum is definitely toastier than ours. I think we topped out at 32 last year.

Desi: Low thirties are nice. Pretty good.

Si: All right. Okay. I’ll take your word for it! So yeah. New Year. Kept your New Year’s resolutions for the whole of January?

Desi: Oh, I forget. I’d have to go back and watch the episode. I can’t even remember what we said we were gonna do.

Si: Ah, I can’t either. I know I didn’t…well, I know what I’ve actually done.

Desi: I did say I had a book that I was reading, and I wanted to finish it when I traveled, but I did not. So that’s still on the list of things to do. And then I’ve managed to add more books to my list of books to read after I started work again. So it’s an uphill battle. And I feel like everyone that reads can relate to that.

Si: A hundred percent. And that was my New Year’s resolution, actually: I’m not going to put any new books onto the list of things to read until I have reduced the pile by a significant amount. I’m four books down this year, and nearly finished the fifth. And I reckon if I can carry on (it’s been about one a week) for the rest of the year…

Desi: I can’t read that fast.

Si: I’ve devoted…I’ve basically given up YouTube. That’s the way I’ve gotten around it: basically, by not watching several hours of YouTube, I can read for several hours, and that’s getting me through things. And this is always the way: the more you do it, the quicker you get at it, and I’m getting back into the habit of it. I used to read so much when I was a kid. And, you know, that died off. And actually, I’m really getting back into it. I’ve been heckled on Twitter already today because I said I wasn’t buying any new books, and somebody called me a heretic. So, that’s going well. But yeah, I actually made it into and out of a bookshop without buying anything the other day.

Desi: Nice! Well done.

Si: Amazing bookshop. It’s in Bath in Somerset in the UK, so down south a bit from where I am. And it’s an absolutely stunning, beautiful bookshop. I mean, it’s over three or four floors, but an independent, not like a Blackwells or a Waterstones or one of those big retailers. It’s actually an independent bookshop, and it’s incredible. Absolutely amazing.

Desi: I think those retailers are both UK specific, because I’ve never heard of them.

Si: What’s your go-to bookstore?

Desi: We had Dymocks over here, which was, like, really big when I was young. And there’s still a few around. Like bookstores are a dying breed, I guess. But the other…the like massive one that kicked off probably when I was like a teenager, was Borders, which I think is in the US as well.

Si: It was in the UK, and in fact the Oxford branch used to be one of our favorites because it had a really good cafe.

Desi: Yeah, it had a little cafe. It was like the first place in shopping centers that used to do that, that wasn’t just like a cafe that you went to. But I’m pretty sure, like, there might…I can probably do a quick Google: “how many Borders are left in Australia?” Ah, I should put “Borders bookstores”.

Si: It’s one big border all the way around the outside!

Desi: Right. Okay. So I’m pretty sure there’s none. The first result on Google says “there were no buyers willing to take the company over and the last remaining nine Borders stores in Australia have officially closed forever.” And that was in 2011. So I’m way behind on what…

Si: I haven’t seen one for a while, but yeah!

Desi: There you go. There you go.

Si: Oh dear.

Desi: So, on New Year’s resolutions, though, I wanna ask: did you actually clean up your office or did you just change the camera angle?

Si: Both, but not enough behind me, is the honest answer. There was a large pile of equipment on my left, which is now no longer there. I did have the top of my printer cleared because I was printing things. (I can do this because it’s a roving camera.) You can see now that it has reoccupied a flat space and now contains…

Desi: It’s now storage. Yeah.

Si: It is now storage again. That’s part of the tidying process, is that things kind of get put onto there before they get redistributed to shelves. But the bookshelves are looking okay.

Desi: Yeah the bookshelves are good. And you can actually see behind me, I do have a box of storage.

Si: Ah, yes. Yeah. I see. And a hard hat?

Desi: Yeah. That’s my new job. And I have, I guess, a Fly Away Kit, which all incident response teams that need to travel have, so yeah, that’s sitting there hopefully…

Si: That’s interesting. So I’m gonna use that as an interesting segue to move on to one of the things I wanted to talk about today. So I’m gonna share my screen. I’ve been playing with this thing. Remember we were talking about ChatGPT at the end of last year? How could you forget? We wrote our thing. Well, there is an equivalent for image generation for ChatGPT.

Desi: Oh, I saw this. And is it the creepy lady one? Is that the same one?

Si: Well, I don’t know about that. Hang on, let me get…where’s Discord? Oh, there’s Discord. I’ve lost it. There it is. Let me show you. I’ve been playing with this. It’s something called MidJourney, okay, and you access it through Discord (and obviously the links will be in the show notes for anybody who’s doing it. And, let’s see…is that the share screen button? That is the share screen button. Window. Window Discord. There we go. Share.)

Okay, so you can see this. What you do is you join the Discord channel. There’s a free trial for anybody who’s interested, then there are subscriptions that you can pay for after that. And what you do is you put in a drawing prompt, and then it will come back to you with four of its creations, sort of. And here’s what you can see. (This is somebody else’s going through.) Here you go, “lion, single photograph”.

So what happened is this person put through a prompt, it gave them back four, and then you can choose one to elaborate on and upscale and stuff like that. And I’ve been playing with this a little bit. And in fact, if you follow me on Twitter, you’ll see some of the…I’ll put the Twitter link as well. I don’t say anything interesting ever…

Desi: Mona Lisa in the shape of a banana!

Si: Yeah, there you go. (And, sorry, not safe for work, I guess technically!) But you can also upload your own photographs and then have… (oh, you won’t be able to see that because that’s opened in another window). But essentially you can upload your own photograph and then insert that as part of the training material for the thing.

So…look at this: it’s a cinematic portrait photograph. So I was playing with this, and when you get a subscription, you can get your private channel, you can have DMs with it. So this was my brief playing this morning: a full color photograph of an Australian incident response analyst. For some reason they think you’re all military! I dunno why Australians come out that way, but your hard hat led me straight back to this picture in the bottom right.

Desi: Man, I forgot to mention, but I have a uniform that looks exactly like that in my Fly Away Kit!

Si: Really?! Thank God for that. Yeah, so it’s like that. And it’s absolutely fascinating because, you know, we can…well, let’s take number four and we’ll get variations on number four, and see it making variations here. It’s absolutely daft. I mean, I’ve had…you can see it working as it goes through and does it. Now it’s not terribly fast, but when you consider how many people are actually hammering the private channels…let’s pick one of these.

Desi: As you kind of scrolled up, I could see there was like a slow tick of new posts. So it seems like people are always in it.

Si: So that (oh no, I didn’t mean to do that. That one. It’ll do that again. Let’s upscale that one. So that’s number 1, 2, 3, 4. Let’s do 2. So, upscale 2.) But it’s really quite…it can be quite inventive. I’ve had some…the upscaling is much slower than generating new things.

Desi: So what is upscaling doing?

Si: So basically it takes the image that you have as a quarter screen and it makes it into a full…not a full screen, but a higher resolution one. But it keeps the image, it just adds more detail into it. So what you do is, it’s just “slash imagine”… So, you can see some of the other things I was playing with this morning. So, “two podcasters”: there they are. That was after bouncing through a couple of variations. And it does different styles. It can be quite…

Desi: It’s hit the demographic on the head, I’m pretty sure.

Si: Yeah. I think so. You know, it can be quite photorealistic in some ways. You know, it looks like a good render as opposed to… So that was actually the prompt of a computer program, there. So, this one was worse: this was “two podcasters talking about digital forensics, one from Australia and one from Britain.” And this is what it decided that we were, so I wasn’t gonna show you that, having been so humiliated by it. But it is very, very clever. It’s pretty cool. So have you got…do you wanna give it a go? Let’s see, what would your prompt be?

Desi: Oh, “messy office of an investigator”. Let’s see if it just takes your background! So while we’re waiting for that…I looked it up. Who I was thinking of was…some artist was doing some investigation and there was, like, a common woman that seemed to keep popping up when they were requesting images, like, over and over again. And I can’t remember whether the AI named it or he just named it, but it was Loab, L-O-A-B. So I’ll chuck that in the notes as well. It’s pretty creepy in some of the photos that pop up with it, but yeah…

Si: Yeah, that looks about right. I think probably…

Desi: Yeah, that’s pretty good. I like the modern-ish one in the bottom left. That’s pretty cool.

Si: Bottom left. Okay, let’s do that. So bottom left, let’s upscale that. (That’s not bottom left…that’s bottom left.) What I’ve found is that it seems to be inherently…as with all AI training sets taken from broad internet data, it seems to be inherently sexist and racist. If you don’t specifically say, “I want a female, non-white person”, you will get a white bloke. Regardless of what you ask for. You say, “I want a cybersecurity student,” it gives you a white bloke. You say, “I want a person standing in the window,” it’ll give you a white bloke. It just seems to me to be very inherently biased towards Caucasian males of mid-range thirties, I would say, maybe a bit younger.

Desi: I mean, that’s been the same for, like, every dataset trained, like when you give it the internet, or a subset of it, it gets to the point where it’s the cesspool of the internet, which is quite a lot of bad stuff that’s not on the surface that it’s picking up. So, I remember when all the images were coming out, like when the AI art won that competition and everything, and more and more artists were using it, there was like a whole bunch of articles that came out about how AI was sexualizing women. So even if you fed it…especially apps where you could feed it your photo, it would then sexualize whoever’s photo you fed it, because…the dataset that it’s pulling from is, in all honesty, probably like 70% porn, because that’s what the internet is. So, if you are just letting it have free rein, that’s what it’s gonna try and train itself on, I guess.

Si: A friend of mine did comment upon the attractiveness of some of the female models that were being put forward. And yes, it’s a very dangerous…

Desi: Well…

Si: It’s funny though because we’re in…go on.

Desi: I was just gonna say, I think especially when you are feeding the AI pictures of people, like real people, and you are saying, “do something with this person”, and especially if that person isn’t giving their consent to use their image and it kind of…almost goes back to, like, what you are putting out there of your family and your kids and everything, like, their likeness or their image, because it could end up in some pretty dark imagery from this AI stuff, which is really scary. And then you’ve got the issue of, as it gets better and better trying to figure out whether it’s real or not, right? Like, you hope that the forensics keeps up with it.

Si: And this is (I mean, I know we sort of slightly pre-planned this conversation), but this is sort of the thing that came up in the issues with deep fakes, isn’t it? Because you put out material of…I mean, we are doing it now to a large extent, you know, this is a recorded podcast: we’re giving our voices over and our faces to…actually, I must get danger money off Jamie for this! But we are doing this and it’s only a very small amount of material that is required in order to start creating a deep fake. And you’ve got your…one of the links you sent to me before this chat was to do with an entire television show based around this.

Desi: Yeah, so it was called “Deep Fake Neighbour Wars”, which I’m not sure…I’ve got the IMDb reference and I’m not sure whether it’s started yet, because it was scheduled to come out in 2023. Oh, looks like there have been six episodes already. But the premise is it takes the faces and, I guess, the deep fake voices of celebrities, and chucks them into, essentially…like, I don’t know whether the UK has a TV show called “Housos”, but it’s like lower socioeconomic neighborhoods, where it’s just like the day-to-day struggles of, like, blue collar battling families.

Si: Yeah. All right. I get the genre.

Desi: Yeah, “Housos” is a big Australian comedy kind of show, I’m just not sure whether it made it anywhere overseas. So, yeah, that’s the premise, and it’s just like those celebrities are battling their day-to-day stuff. And when you watch it, like, the deep fakes are pretty poorly done. And I feel like that’s probably on purpose, like it was a low budget deep fake, essentially, to make it clear that it’s taking the piss.

It’s not like they’re trying to seriously say like, I don’t know, Drake or someone was actually doing this. But again, like, we’ve seen some demonstrations of some really good deep fakes out there that would be hard to tell if you weren’t trying to pick it out, or you weren’t, like, digging into the forensics of it or knowing where that person was at that time.

Si: Yeah, I think the issue is that superficially now they’re good enough to fool the eye, especially when media is being consumed on small, handheld devices frequently. What you can pick out on your, you know, 45, 50 inch television is a bit different to what you can pick out on the mobile phone in your hand on the train when you’re watching the news in the morning commuting, or whatever. So, I think there’s that risk.

But actually, you know, even with that, it’s not until you start looking into the data of it that it becomes totally obvious. Hopefully we can get Martino from Amped on later in the year, because they have a product called Amped Authenticate, which is specifically for doing this kind of thing. And I understand it’s got some pretty good results, like really good results, in identifying stuff.

Desi: I wonder what the…because there are always limits to detection technology, right? Like, I wonder what the limits are in terms of, like, if you deep faked a smaller part of the image, whether that becomes harder. Because I guess you can’t rely on metadata anymore, because most services will strip metadata off things before you upload anyway, to protect people from stalking you and that kind of thing. Yeah, like, I wonder what that limitation is.

Like, if you are the main focus and you deep fake that whole thing, potentially that’s what that technology, or the technology that we have today, detects. But if you were, like, an individual in the background, and then that was used in court to prosecute you because it put you at the scene when you weren’t there, and you can’t tell the difference and you don’t have an alibi otherwise…that’s scary, right?
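
[Editor’s note: on the metadata point above — a minimal sketch of the kind of EXIF tags upload services typically strip, using Pillow; the filename is a placeholder.]

```python
# Minimal sketch: dumping a photo's EXIF metadata with Pillow.
# "photo.jpg" is a placeholder; most upload services strip these
# tags (camera model, timestamps, GPS) before publishing.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")
for tag_id, value in img.getexif().items():
    # Map numeric tag IDs to readable names like "Model" or "DateTime".
    print(TAGS.get(tag_id, tag_id), value)
```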

Si: Yeah, I think so. And I mean, especially with (and he says, being slightly critical of his own government) some of the draconian laws coming in to do with protests and, you know, restriction of the freedom to protest. If evidence of being at a protest has been provided to the police in some form or other…you know, first of all, in a string of prosecutions, is the legal aid authority going to be willing to hand over cash to everybody who wants to dispute their presence at a rally? And second of all, you know, are people even gonna think about it? This is an interesting one that I’d not considered as a possibility, so yeah.

Desi: Because I think we’ve spoken about that before, right? Like, the economy of scale isn’t there to look into the digital forensics when, like, if hundreds of people are getting prosecuted, the ability for them to defend themselves would be very low, because the courts probably aren’t willing to hear all those cases in depth unless you had the money to throw at the court and, like, lawyer up and everything. So…

Si: Yeah, I think perhaps what we need…I think the sort of equivalent does exist in the UK. Certainly with the EncroChat phone data. You know, there was a higher level case that went through as to whether that material was admissible or not for various legal reasons. So, whether each and every single EncroChat…

But it’s sort of…in the US you have this concept of the class action lawsuit, don’t you? Whereby everybody participates in a larger thing. I wonder if perhaps there’s a room for that sort of thing. Incidentally, I’m not a lawyer and any lawyers who wish to comment on how this works, please feel free to add your comment to the bottom of the text.

Desi: Yeah. Like, in all honesty, I’d love to get some of our listeners’ opinions if they do know the legal system more than me. Personally, class action lawsuits, to me, make lawyers richer. Like, you may get off, but you still go through that emotional stress of having to deal with that class action. And at the end of it, your payment will probably be…the company will give a bulk payment split between hundreds, if not thousands, depending on what kind of class action it is. And it’s probably not worth being involved in the first place.

Si: I did see somebody (again, it was one of those Twitter things that scrolls past fast enough that you don’t pay that much attention to it), but somebody had received their payout from whatever class action they’d been involved in, and it was $6.32 or whatever! But the lawyer, who’s taken 10% of the winnings of the hundred million, is quite well off! So, yeah. I’ve just found out…it seems that Deep Fake Neighbour Wars is on in the UK as well, and like you said: six episodes. Kim Kardashian and Idris Elba I’ve got in mine, my first one.

Desi: Yeah, yeah. So, Deep Fake Neighbour Wars is made by a UK team. So it’s originating from over there.

Si: Ah, it’s our fault. Okay, that’s fair.

Desi: Yeah. I think I found it when I was reading, like, a BBC article or something, and we just happened to be talking about this stuff.

Si: I’ll put the ITVX link in then for all the UK viewers as well. And…when you go to it, I can see Will Smith, Idris Elba…that’s embarrassing, I have no idea who that person is. In the text there’s Greta Thunberg, and Conor McKenna…Conor McGregor, rather. And Ariana…that’s probably Ariana Grande (he says, not knowing who that really is). Not really my genre. Anyway! But looking at it…I dunno, we have, interestingly, quite a strong history of impressionists in the UK.

There used to be a lot of impressionists that turned up on television. And you would see…and I guess from Spitting Image onwards, we are not averse to parodying our great and good…or great, anyway. So, I guess it kind of fits as a concept within our remit. Although I wonder what sort of legal permissions are required in order to do it. It’s a fascinating idea.

Desi: Yeah. So, this was interesting: I don’t think they actually sought any permission at all, because that was the point of the article, like, are you gonna get in trouble for, like, using US celebrities (or, like, celebrities generally, because Conor McGregor is not American)? But using celebrities in a deep fake in a situation that’s not them. And they were talking about, is there any legal precedent for this? And they hadn’t…they were interviewing a lawyer, and the lawyer was saying there’s potential for one or a few of the celebrities to sue.

Just like…potentially not to make money, but just to set the precedent in the US legal system. Because I’m not sure whether it’s the same in, like, everywhere, but precedent is like a huge thing over in the US, so they always want to get one case through so that if something else comes up later on, then it can be used to basically…

Si: Yeah, it certainly is in the UK as well. Precedent is a significant portion of what makes up the law. So yeah. It’s interesting because, I mean, again, you know, satire is protected under law to a certain extent in the UK, and there’s our Private Eye…Ian Hislop, the editor of Private Eye, and I dunno if you’ve come across it…there’s a program called Have I Got News for You in the UK, which is definitely satirical, usually absolutely on the nail, but satirical.

You know, Ian Hislop is actually the most sued man in the country…but actually he wins a huge amount of his cases because, you know, satire is defensible by law. Also, telling the truth defends you. So, you know, in his case he does very well out of it. It’d be interesting…just because it’s embarrassing for someone.

Desi: Are they trying to sue for defamation?

Si: Yes, I suppose they’re suing for defamation as opposed to use of rights. Well, it’s difficult, isn’t it?

Desi: I was thinking, like, deep fake, like, stolen identity, because you are stealing their likeness, and potentially their characteristics as an individual, to make something.

Si: I think you’re…I think in terms of making a program, you’d probably be stuck: identity theft wouldn’t work, because the definitions of that, I suspect, sit far more in terms of forgery and fraud, neither of which these really are. What I suspect though…

Desi: If I stole some of your information (social security number) and tried to open a bank account, that’d be identity theft, right?

Si: Yeah. But if I take a photo…in the UK, if I take a photo of you on the street, that’s fine. I haven’t stolen your identity. I’ve just…and that’s legal as well, street photography’s legal in the UK.

Desi: But you’re not using my photo to pretend to be me to do something else.

Si: No, but I could sell that photo of you as a piece of art without having to get a model release for street photography.

Desi: Yeah, okay. But I suppose you’re taking that photo in a public domain, so that would…like, if you went into my house and took a photo of me, you’re not allowed to do that.

Si: No, no. Yeah, at that point I would need a model release in order to do it. But then, you know, but also then the copyright of that image, even the one that I took in your house with your permission, the copyright of that image belongs to me, not to you. So I can do with it what I like.

Desi: Well, until I sue you and I get the rights for it.

Si: No, well no, the copyright goes to the creator of the image, not the person in it.

Desi: At the point of time. But if you took it in my house without my permission.

Si: Ah, without your permission, yeah.

Desi: Yeah, and then it went… So that’s why I’m saying, like, the photo of the public domain is different.

Si: But then the thing is that all of these people’s…if they’ve taken this material that they’ve used to train their machine learning from sources that are in the public domain, the people who would be able to sue for the copyright infringement wouldn’t be the individual, it would be the creator. So, if you’ve ripped off (I’m gonna say…I’ve got Idris Elba here as an example because he’s on the screen), if you’d ripped him off from (he says, trying to think of a film he’s been in) Star Trek (whichever one it was, the new ones with Chris Pine.

Anyway, he’s in one of those), if you’d ripped out his image from that, the copyright belongs to, well, I dunno, JJ Abrams or Dreamworks or whichever studio (it’s not Dreamworks I know that), but whichever studio it belongs to and therefore are they able to sue for that? But then you are also allowed to create derivative work. Derivative work is protected under law as well.

Desi: Yeah. And I guess this is why it was, like…I can definitely see arguments for both sides. I think this comedy series would be funny and I don’t think they’re malicious at all, but you can definitely see when it would be used maliciously. And I guess it would even be interesting, I don’t know what ever happened with it, but when they were taking celebrities’ faces and putting them on porn actors.

Si: Yeah. I think that’s actually illegal under UK law. I’d need to check it. I think that that starts to be sort of under offensive images, or things like that. So it’s kind of an interesting aside. It’s a hundred percent illegal if you’re doing it with anything under a legal age, that’s guaranteed to be…

Desi: Quick Google: looks like in the US it’s covered under the revenge porn law.

Si: Yeah, it probably is in the UK as well. I can imagine. Which is interesting because in a lot of cases, I suspect the people doing it are intending it far more complimentarily than revenge. But, you know, it’s an interesting debate, isn’t it?

Desi: I guess again, you run into the thing of like child exploitation material that could be deep faked that way.

Si: Yeah, but that’s already covered under the Indecent Images Act. So the…because you know, it’s…even if you were to draw a cartoon, that’s covered under the act. So anything that depicts…so indecent images covers quite a wide variety of things, although interestingly it seems to a certain extent parody is still protected under that, which is a bit weird. Because I remember a case going through the courts whereby bestiality is also…so, apologies listeners, you know, if you wanna turn off feel free!

Bestiality is still banned under UK law (well, is it still? It should be banned, that’s fine). But there’s some video that was circulated and the guy ended up in court, and essentially it’s a woman and a tiger involved in whatever act. But when you have the soundtrack on, the tiger says, “this sure beats doing Frosties adverts for a living”. And I dunno if you have Frosties in Australia, but it’s the Kellogg’s spokesperson…spokestiger. And the judge ruled that because it was a parody, you know, it wasn’t prosecutable under the indecent images thing.

Desi: Interesting.

Si: So, you know, the law is an ass without a shadow of a doubt! But yeah, a fascinating field and one which is only going to grow. Just on a different note, how do you feel about the use of deep faked celebrities who have passed on?

Desi: Oh, like the Tupac hologram that they had?

Si: Oh, I mean, that’s one of them. I mean, I believe that, you know, I think John Wayne has turned up in some advertising or something like that. You know, Marilyn Monroe has been deep faked a hundred times to do new things I’m sure. You know, that kind of artistic intent.

Desi: Yeah. I don’t know. I feel like that…so to take an example, right: the indigenous Australians…whenever you watch their shows and they have someone in them who has passed away who is indigenous, and they’re then showing it later on, or it’s an episode, there’s always a warning at the start that says, “be warned, indigenous people who have passed away are in this,” because of a cultural belief (and I may be a little bit incorrect here) that it’s like capturing the soul within the image, or something similar. Like, I know that the Japanese had a concept similar to that as well.

So, they always put that warning and that warning’s been there since, like, as far back as I can remember, like when I was a kid, shows would always have that warning at the start. Now that’s a very deeply religious and philosophical debate about the individual who is watching and the individual who you’re, like, recording or potentially deep faking, right?

So, like, personally, I don’t care. Like, if it happened to me, I’m dead. Like, my philosophical and religious beliefs…as long as that image isn’t affecting any of my immediate family in a negative way, like if it was used to harm them in some way, then I don’t have an issue with it. But I can definitely see that there are huge issues, both, like, religiously and philosophically, for some cultures and some people. So, it’s not just, like, a legal thing at that point. Well, all of this isn’t just a legal thing. Like, it’s the moral and the ethics of it all as well, which is sometimes not considered in a legal sense.

Si: That’s a very interesting philosophical argument to go down. Yeah, definitely. It’s interesting because, you know, I did a photography degree during lockdown because I couldn’t go anywhere and it seemed like a good idea at the time. But for one of the pieces of work that I did in order to get my degree, one of my final projects, I actually shot on film rather than shooting using a digital camera. I actually shoot a fair amount of film.

And one of the arguments…you had to write something called an artist’s statement. And the artist’s statement is a lot of, as a technical person (I say this with a deep, deep love) a lot of guff that you write to justify your artistic intention behind it. And one of the reasons I wanted to shoot on film was because I actually wanted to create a direct relationship between the place and the finished product.

So, I was doing landscape work and I wanted to create…because, you know, actually the film has been touched by a photon that has bounced off an object in that landscape and then…it hit the film and it has created that effect. So I can understand, in those philosophical terms, you know, if I have a photo, a proper photo, a real photo of a person, that image itself has been touched, in a way, by something that has touched that other person. So, I can grasp that and I can see where that belief comes from.

It changes a bit with digital, because at the end of the day, when I display, you know, a digital photo on my screen, and the video that people will be watching when they’re (if they’re not just listening to this on Spotify)…the video that they’ll be watching of us talking, those electrons that they’re seeing now have nothing to do with the photons that are bouncing off me and going into this camera here. And by the time you’ve created the deep fake of someone, you are so far removed from the original, I wonder whether it really would be anything to do with them anymore. But you know, again, this is a massive philosophical argument rather than a real one!

Desi: I think from a technical standpoint, it goes down to layers of that abstraction and both an analog photo and a digital photo are the same to a human looking at it. Like it’s still providing that same information no matter what form or how it came to be.

Because we can be ignorant of that fact, because our eyes just perceive what we’re looking at. And I will…so I did look it up quickly. I wanted to make sure that I got this correct, and I got this from AIATSIS, and this is the cultural sensitivity section of the website (and I’ll link this in the show notes). Under “deceased persons” it says: “Aboriginal and Torres Strait Islander people should be aware that this website contains images, voices and names of deceased persons.”

So that’s kind of, like, the warning that I always saw on shows. And then under that, it’s got: “in some Aboriginal and Torres Strait Islander communities, hearing recordings, seeing images or names of deceased persons may cause sadness or distress and in some cases offend against strongly held cultural prohibitions.” So, I just wanted to make sure…it doesn’t really say what those cultural prohibitions are. I could be wrong on that, so…but I will link this so if people want to go read it, they can.

Si: I…Australian broadcasting…I mean, funnily enough, we watch a lot of Australian programs in this house. I mean, not like Neighbours and Home and Away, which, to be fair, in my years I have done, not for a long time, but I have done. I was really sad to see Neighbours end, actually. Anyway…

Desi: It’s coming back.

Si: Is it really?

Desi: Yes. Yeah, I’m pretty sure it’s coming back this year.

Si: Any of the original cast?

Desi: Ah, fucking Tony’s still there. So, yes, I’m sure a few more.

Si: Excellent. Oh, I’ll tell…

Desi: I’ll spoil the twist for the end of the season already, because it’s probably already happened like eight times. But someone will be getting married, they’ll be on their way to the wedding, they’ll be in a car crash, they’ll die. And then there’s like a huge thing about it. They potentially may go into a coma or just be off the show forever.

Si: If it ain’t broke, don’t fix it! You know? It’s worked for the last 20 years.

Desi: Should we start making a TV show?

Si: Yeah, Forensic Focus: the series. Yeah, sorry, total diversion! Yeah, we should. No, we watch a lot of Australian TV actually. Australian Lego Masters, fucking brilliant, great program.

Desi: Ah, that’s good.

Si: We can’t wait for the next season of that. But actually it’s MasterChef, Australian MasterChef, which makes ours look like a nursery equivalent. You know, you guys go, “oh, we’ve got a fully stocked larder”, and the amount of things that are in there is ridiculous. And then in the UK it’s like, “we’ve got a fully stocked larder for you.”

And there’s like two small tables with a couple of things on there. It’s like, “really, is that the best you could do?” But it’s when they go out into some of the filming locations and they put forward the (and I again, I apologize to any Australian listeners, I’m gonna get this completely wrong), but they say thank you to the Aboriginal people of the land and thank them for the ability to film on there and, you know, give respect. Yeah.

Desi: That’s very recent.

Si: That’s so nice.

Desi: Yeah, that’s probably…that hasn’t been around as long as the warnings at the start of TV shows. But yeah, thanking the original owners of the land has been in use for probably, I would say, maybe about a decade. And it’s definitely more widely used in the last five. So, JJJ, which is, like, our national, kind of like, radio station that’s part of the ABC as well, every time they have callers, they’ll always be like, “oh, we’ve got this caller from…”, and then they’ll say the original, indigenous land name and then, like, I guess the westernized or the colonial name of what it is now.

So, it’s good. Like, it’s slowly making…it helps me, it makes me more aware of our heritage and, like, who was here before us and, kind of, the history of the land, which is always really interesting. Which we, like, we studied all through school, probably not enough of it, but it was always pretty interesting to hear.

Si: No, that’s cool. Yeah, that’s really cool.

Desi: Coming back to the weird deep fake images stuff, which we started on, I had this other article, which was creepy as hell when I saw it. Nvidia is creating an AI to go with its next series of graphics cards, maybe, that they’re releasing…? Or it could be out already? But essentially what it does is, while we’re talking now, if you turn the feature on, your eyes will just look at the camera.

Si: Ah, now I’ve heard…I saw this, and I saw a vague demo of it. There’s a demo of it somewhere.

Desi: Yeah, some guy on YouTube had the feature (like, in my seven hours of YouTube while you were reading), and he was demoing it and it was pretty good. Like, it kind of failed when he turned his head too far to one side and the eye was, like, clipping back and forth because it was trying to catch up. And then he also tried this thing where he would, like, put his hand over and the eyeball would just move.

Like, it would try and avoid the hand even though his eyeball was here. And that looks really weird. But yeah, if anyone hasn’t seen it, I’ll find the video and I’ll post it in the show notes. But, it’s like a small thing, I guess, that’s kind of like deep faking your own eyes.

Si: Oh, hang on, hang on, hang on. Let me…

Desi: Yeah, super creepy.

Si: Right, bear with me. (Can I expand that? Yes, I can. Can I share that while I’ve got it expanded? Probably not. No, click to exit full screen. Let’s just do that in small. But…share window. That one.)

Desi: Yeah, it was a 1:13 video. What’s his name? BSPCN, I think.

Si: So, Juras–. You post the reasonable one, I’ll post…I’ll do the unreasonable one, because I haven’t seen this yet!

Desi: See, some of the scenes look normal and then others are just like…this is weird.

Si: That’s so disturbing! Uh, brilliant. Yeah, I’d seen that. I’ve also seen the real time deep fake video as well. (Oops, sorry, that’s the Deep Fake Neighbour Wars. I’m just gonna stop sharing the screen, but I realized closing it doesn’t do that.) I’ve also seen realtime deep fakes being done. So somebody’s basically moving and they’re overlaying it at the same time. Funnily enough, I’d come across the eye feature earlier. For stills photography, apparently, it’s something you can get in Photoshop. Photoshop will do it for you.

Desi: Well, I guess it’s potentially been around for a while, right? Like phones, because there’s some mathematical model, right? If you have n number of people, and you’re trying to take a group photo, there’s a mathematical formula (I’ll also look this up, and try and post it in the show notes), but there’s a formula to predict how many photos you need to take in order to get a photo where everyone is…has their eyes open and isn’t blinking.

But then I guess smartphones and digital cameras these days do away with it by just taking, like, multiple little snapshots of photos and then stitching them all together. So, I guess that technology has always been, like, the stitching photos.

Si: Photo stitching has been around, I mean, you know, you’ve been able to stitch photos with film for a very long time. But photo stitching in Photoshop is very easy. In fact you can do it in Lightroom as well. I did this over Christmas actually. (Let me see if I can find the picture. That’s the right folder, that’s a good start. Yeah, open up.) Okay, this is not the world’s most incredible photograph by any stretch of imagination, okay. So I’ll say that in advance. I was messing around at Christmas. I have a four by five box camera.

Desi: Oh, is this the one that you said you were gonna try?

Si: Yeah, this is it. So this is the box camera I tried. I have an adapter that allows me to put a digital camera onto the back of it. Now, a digital camera’s sensor size is small. Very small! And the four by five sheet of film is what it says it is: it’s four inches by five inches. It’s large, you know, in comparison, definitely. So this adapter, when it fits onto the back of the camera, actually takes six shots across the…three at the top, three at the bottom, or if you’ve got it…that’s five by four landscape, so three at the top, three at the bottom, and then you stitch them together. This is actually just three shots stitched together. But that makes it a…what is it? Yeah, it’s an 8,500 pixel wide image.

Desi: Yeah, right.

Si: As opposed to, you know, the normal width. But it will stitch it beautifully. The important thing in this was to have a scene that wasn’t moving because, you know, when you…it’s a very long and manual process. But in something like a camera phone, those…you can fire off six shots in rapid succession and then restitch them and map them back together without any trouble.

Desi: Yeah. We’ve got those panoramic modes as well, where as long as you turn slowly enough and, you know, there’s no moving stuff in the image, you can get quite a large picture.
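
[Editor’s note: for the curious, the kind of stitching the hosts describe is a few lines with OpenCV’s high-level Stitcher — a minimal sketch, with placeholder filenames.]

```python
# Minimal sketch: stitching overlapping frames into one panorama
# with OpenCV. The filenames are placeholders for illustration.
import cv2

frames = [cv2.imread(name) for name in ("left.jpg", "middle.jpg", "right.jpg")]
stitcher = cv2.Stitcher_create()           # detects features, aligns, blends
status, panorama = stitcher.stitch(frames)
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print("Stitching failed, status code:", status)
```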

Si: Yeah. I’ve seen some quite interesting tricks done whereby if you’ve got somebody with you, you start them on the left of your panoramic shot.

Desi: I’ve definitely done this before!

Si: And then they run around behind you, and appear again on the right, on the other side of the panoramic shot!

Desi: Oh man, it’s so hard to run fast enough, because it’s still only quite a small angle for the…yeah. I was doing it when, I guess, panoramic first came on phones, so the angle for them was quite thin, so it was a very fast runaround to get onto the other side.

Si: But yeah, I’m intrigued…see if you can find that algorithm for how many photos I have to take to guarantee myself a shot.

Desi: (Photos to take…)

Si: It’s one of those…it is probably not as many as you might imagine it is. So, it’s that particular piece of maths…it’s like the birthday problem in statistics: the number of people you need in the room before two of them are likely to have the same birthday is actually not that high. It’s only…I think it’s about 30. So typically, if you’ve got a lecture theater full of students, two of them will have the same birthday.

Desi: So it’s one over (in brackets) one minus xt, all to the power of n. I dunno, I just…I remember when I first saw it that if you had, like, an analog camera, that’s a lot, like, you are using a lot of film. But with a digital camera, it’s fine.
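
[Editor’s note: assuming Desi is quoting the well-known “eyes open” estimate — expected photos ≈ 1/(1 − xt)^n, with x the blink rate, t the blink duration and n the number of people — here’s a minimal sketch; the constants are illustrative assumptions, not figures from the episode.]

```python
# Sketch of the 1/(1 - x*t)**n estimate: x = blinks per second,
# t = blink duration in seconds, n = number of people.
# The default values below are illustrative assumptions.
def photos_needed(n, x=0.16, t=0.25):
    p_all_eyes_open = (1 - x * t) ** n   # chance one frame has no blinkers
    return 1 / p_all_eyes_open           # expected shots for a clean frame

print(photos_needed(20))  # ~2.3 shots for a group of 20
```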

Si: Yeah.

Desi: But I will, yeah…I’ll post this in the show notes.

Si: Yeah, the idea of having a 36 image roll of film is definitely a thing of the past, isn’t it? Although, I don’t think it’s made photography that much better, because everybody now takes a shit ton of lousily composed photos, as opposed to really, you know, concentrating and taking a good photo once!

Desi: Yeah, true.

Si: “23 people are required in order for the probability of two people having the same birthday to exceed 50%.” There you go. Anyway. Same issue for cryptographic hashes, by the way, ladies and gentlemen: there’s our link to forensics for the day! The probability of two files having the same cryptographic hash. Again, I’ll put the links in the footnotes. Yeah, I like that. It’s funny we got onto the whole of this topic because of the terribly edited Boris Johnson photo.
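
[Editor’s note: a quick sketch of the birthday-problem calculation Si cites — the same collision math applies to cryptographic hashes, with 2**bits in place of 365.]

```python
# Probability that at least two of n samples collide when drawn
# uniformly from `buckets` values. buckets=365 is the birthday
# problem; buckets=2**bits gives hash-collision odds.
def collision_probability(n, buckets=365):
    p_unique = 1.0
    for k in range(n):
        p_unique *= (buckets - k) / buckets
    return 1 - p_unique

print(collision_probability(23))  # ~0.507: just past 50% at 23 people
```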

Desi: Do you wanna show people that photo before we jump off?

Si: Yeah, I’ll just pull it up. Yeah, so UK users almost certainly would’ve seen this one, where Boris Johnson, who can’t generally keep out of the press because he is an attention grabbing whatever, found himself unceremoniously removed from a photo because he’s been deemed to be a liability, which I think is a sensible understanding of it. And we were looking at this…actually, as Desi pointed out, it’s not a particularly good edit (you can’t see my mouse…oh, you can see my mouse moving in the video, that’s cool).

Just here you can see the edge of his elbow quite well, and this is not particularly well done. The logo is not particularly well done. But we are back to the days of the removal of people who are no longer deemed to be complimentary to the party. Let’s see if we can find the typical one. I think it’s the beginning of nearly…it is the one where Stalin has had his advisor removed.

Desi: Oh, right.

Si: I know it’s in the other book, but that book’s next door. So, this particular book is written at the time when it’s all about film photography, not…no digital editing here.

Desi: While you’re looking for that…oh, there we go. You got it.

Si: I was gonna say…it’s not, but you can see that, you know, people have been doing photo tricks without Photoshop for a very long time.

Desi: While you’re still looking for that, for our listeners and viewers at home, I just wanted to point out something that got pointed out to me this week, which is that Forensic Focus has an events page. So that kind of lists all the major conferences or hackathons or summits, like the Magnet Virtual Summit that’s coming up, which was part of our last episode; that’s all listed there. So, you can go to forensicfocus.com/events and just sort by the month that you have some potential free time, and you might want to go see something or check out some of the virtual conferences.

Si: Yeah. We will be turning up to some, if not (well, it won’t be all of those)…but at some of the ones in that list there will be a Forensic Focus presence. (I can’t find it in the book, so let me just share the screen for this.)

Desi: Definitely some of the virtual ones, one of us will definitely be across there, and we’ll be talking about that in some of the episodes. And then I was having a look at some of the Australian ones that are happening that seem pretty interesting, that I’ll be trying to get to in person as well. So, any of the Australian listeners, if you’re there, feel free to come and say hi. I’d love to hear some feedback or ideas in person if you guys have any.

Si: Yeah, I’m going to (I dunno who Jeanie is and why she keeps popping up on the screen), yeah I’m going to try…I’ll be at any of the UK ones, or some of the UK ones, some of the online ones. I may well bring an outside broadcast unit recording device with me and may interrogate anybody who comes too close! So, we’ll see how that pans out.

Desi: That does not sound inviting at all.

Si: I was gonna say, I was talking…my youngest daughter is doing podcasting as, sort of, an additional activity at school. And she’s ended up as the podcast host, which I think is hilarious. And we were talking about, sort of, interviewing technique, and I said, “well, I’ve only got two references I can give you: one is sort of like an online course with a company called Domestika (it’s a really good company if you’re interested in some of the more arcane courses, it’s got some fun stuff in there), but there’s one on interviewing people.

The only other material I have is this, which is for carrying out interrogations in police cells!” And I thought, “you know, I can lend it to you, but it probably won’t go down very well.” So yeah, my technique may be somewhat informed by the literature that I have available. So, you know, I’ll try to be nice! Right, let me close that window. And, I have to say, you know, for the viewers that are going to be listening to this or (can you have a viewer that listens? Yes, I suppose they’re listening simultaneously)…

Desi: Listening with their eyes!

Si: It’s…listen with their eyes. It’s nice (we should get subtitles on). It’s nice to be back. It’s great to catch up with you. Glad that you’re looking so well now.

Desi: Great to chat again.

Si: …the house move has gone smoothly. We will be back. I dunno when this will actually go live. We’re gonna say Happy New Year, because for us it’s still January.

Desi: Just!

Si: You are unlikely to get it before February at this rate. But you know, nonetheless, we’re glad to be back on board. We’re very much looking forward to this year and, yeah, we’ll call it a day today and we’ll see you soon.

Desi: See you all.

Si: All right. Cheers.
