AI Anomaly Detection, Blockchain Timestamps & The Erosion of What We See | <3 #54 CHATGPT
#54

00;00;00;00 - 00;00;03;16
Speaker 1
Chat you there?

00;00;03;19 - 00;00;06;02
Speaker 2
Hey, there. How's it going?

00;00;06;04 - 00;00;09;25
Speaker 1
Would you mind if I record this for a podcast?

00;00;09;27 - 00;00;12;22
Speaker 2
Not at all. Feel free. What's the podcast about?

00;00;12;28 - 00;00;26;23
Speaker 1
I want to talk about how your video is not going to be able to be trusted in the near future, and kind of already isn't, and how that's going to affect the court system in audio as well.

00;00;26;24 - 00;00;38;26
Speaker 2
That's definitely a concern. Audio can be manipulated too. It raises similar concerns about authenticity and trustworthiness. How do you think people will navigate these challenges?

00;00;38;29 - 00;00;42;07
Speaker 1
No, that's what I'm asking you.

00;00;42;09 - 00;00;54;05
Speaker 2
Got it. I think people might rely more on verified sources and technology to detect manipulation. Building media literacy will also be crucial. What are your thoughts on it?

00;00;54;07 - 00;00;58;06
Speaker 1
What does that look like in your opinion?

00;00;58;08 - 00;01;18;04
Speaker 2
It might involve teaching people to critically evaluate media, understanding how digital content can be altered, and using tools to verify authenticity. It's also about promoting awareness of deepfakes and similar technologies. How do you think we can effectively spread this knowledge?

00;01;18;07 - 00;01;23;08
Speaker 1
I don't want you to prompt me with a question at the end, please.

00;01;23;10 - 00;01;37;17
Speaker 2
Got it. Spreading this knowledge might involve integrating media literacy into education and promoting fact checking resources. What do you think?

00;01;37;19 - 00;01;44;13
Speaker 1
Is there evidence of this already in the court systems?

00;01;44;15 - 00;01;59;26
Speaker 2
Yes. There have been cases where deepfake videos and manipulated audio have been presented as evidence. Courts are now more cautious and may require expert analysis to verify the authenticity of digital evidence.

00;01;59;28 - 00;02;05;10
Speaker 1
How do you verify if something is deepfake or not?

00;02;05;13 - 00;02;25;26
Speaker 2
Verifying deepfakes often involves digital forensics, analyzing inconsistencies in the media, checking metadata, and using AI tools to detect alterations. Cross-referencing with reliable sources also helps.

00;02;25;29 - 00;02;32;15
Speaker 1
I want you to go deeper into what you said. AI tools to find alterations.

00;02;32;17 - 00;02;58;15
Speaker 2
Sure. AI tools for detecting deepfakes typically analyze patterns in the media that aren't easily noticeable by humans, such as subtle inconsistencies in lighting, shadows, or facial movements. They can also compare known authentic samples to the suspected content to spot discrepancies. These tools are constantly evolving to keep up with new manipulation techniques.

00;02;58;18 - 00;03;05;10
Speaker 1
Okay, so you mentioned a lot about light and shadow and what else?

00;03;05;12 - 00;03;32;11
Speaker 2
In addition to light and shadow, AI tools look at inconsistencies in audio, lip syncing, and pixel-level artifacts. They also analyze patterns in how faces or objects move, which can reveal manipulation. Some tools even check for inconsistencies in biological signals like blinking patterns.

00;03;32;13 - 00;03;36;14
Speaker 1
Keep going on this subject.

00;03;36;16 - 00;04;00;08
Speaker 2
Sure. Additionally, AI tools can examine metadata for discrepancies, detect unnatural motion or artifacts, and use algorithms to spot anomalies in textures or compression. They compare the content with large data sets of genuine media to identify subtle signs of tampering.

00;04;00;10 - 00;04;13;08
Speaker 1
This is a really cool subject, but I want to move back towards the more legal precedent of not being able to trust evidence in court.

00;04;13;11 - 00;04;39;01
Speaker 2
Certainly, there are already cases where courts have questioned the authenticity of digital evidence, requiring expert analysis or corroborating evidence. This trend is likely to grow as deepfakes become more sophisticated. Courts may adopt stricter standards for digital evidence and rely heavily on forensic experts.

00;04;39;03 - 00;04;48;10
Speaker 1
I mean, it's going to be a big deal. No video.

00;04;48;12 - 00;04;51;17
Speaker 1
You there?

00;04;51;20 - 00;05;06;20
Speaker 2
Absolutely. It will be a major shift. Courts and legal systems will need to adapt quickly to these technological changes. It's going to be an ongoing challenge to ensure justice in the digital age.

00;05;06;22 - 00;05;17;09
Speaker 1
Has there been any historical precedents like this where, technological change greatly affected the legal system?

00;05;17;11 - 00;05;39;04
Speaker 2
Yes. The introduction of DNA evidence in the 1980s is a good example. It revolutionized the legal system by providing a powerful tool for both convicting and exonerating individuals. But it also raised new challenges around the interpretation and reliability of such evidence.

00;05;39;07 - 00;05;44;08
Speaker 1
Okay. Go into that specifically a little bit.

00;05;44;11 - 00;06;10;19
Speaker 2
Certainly, DNA evidence transformed the legal landscape by providing a highly accurate method for linking suspects to crime scenes or exonerating the wrongly accused. However, it also brought new issues like the potential for contamination, the need for proper handling and interpretation, and debates over privacy and consent. It required courts to develop new standards and protocols for handling this type of evidence.

00;06;10;21 - 00;06;27;04
Speaker 1
Well, that's kind of an example in the positive direction, where we're gaining a capability, but we're really kind of losing something in the example of video.

00;06;27;06 - 00;06;45;14
Speaker 2
That's a good point. With video, we're facing the erosion of trust in what we see, which is a significant loss. This shift means we have to develop new safeguards and verification methods to maintain the integrity of video evidence. It's a challenging trade off as technology advances.

00;06;45;17 - 00;06;58;00
Speaker 1
You stated it really nicely. You said the erosion in trust of what we see. And that's actually a perfect description of it.

00;06;58;02 - 00;07;11;11
Speaker 2
I'm glad that resonated. It's a crucial issue as we navigate this new technological landscape. There's a lot to consider as we adapt to these changes.

00;07;11;14 - 00;07;17;03
Speaker 1
Where do you see that taking the court systems?

00;07;17;05 - 00;07;41;22
Speaker 2
I see it leading to stricter verification processes, increased reliance on expert testimony, and possibly new regulations for digital evidence. Courts will likely become more cautious with video evidence and may develop standards for verifying authenticity. It could also drive the development of new technologies, specifically designed to confirm the integrity of digital content.

00;07;41;24 - 00;07;48;27
Speaker 1
Go into that the new technology used to verify content.

00;07;48;29 - 00;08;15;04
Speaker 2
Sure. Technologies like blockchain can be used to create tamper-proof timestamps and records for digital content. Watermarking and digital signatures are also effective in verifying authenticity. Additionally, advanced AI algorithms can detect signs of manipulation. These tools help establish a chain of custody and ensure that content remains unaltered.

00;08;15;06 - 00;08;34;00
Speaker 1
Okay. So you mentioned two really interesting things. Blockchain to make tamper proof timestamps, as well as AI tools to detect like anomalies and stuff. I'd like you to go into great detail in both of those categories.

00;08;34;02 - 00;09;04;25
Speaker 2
Sure. Blockchain can provide tamper-proof timestamps by recording each piece of content as a block in a chain, which is secure and immutable. This creates a verifiable record of the content creation and any subsequent changes. AI tools, on the other hand, analyze digital content to detect anomalies like unnatural pixel patterns, inconsistencies in lighting, or irregularities in motion. These tools use machine learning to compare content against vast datasets to spot potential alterations.

00;09;04;27 - 00;09;17;25
Speaker 2
This combination helps ensure content integrity and authenticity.

00;09;17;27 - 00;09;27;11
Speaker 1
If we're talking about tamper-proof timestamps using the blockchain, how would you do that?

00;09;27;13 - 00;09;52;04
Speaker 2
You can create digital signatures and hash values for content which are stored on the blockchain. When content is created or modified, its hash is recorded in a block. Any alteration changes the hash, which can be detected by comparing it with the original record on the blockchain. This ensures any tampering is easily identifiable.
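[A minimal sketch of the idea described here, in Python, using SHA-256 from the standard library. The stored string stands in for a real on-chain record; the content bytes are placeholders, not actual video data.]

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of the content: its digital fingerprint."""
    return hashlib.sha256(data).hexdigest()

# "Record" the hash at creation time (on a real system this goes on-chain).
original = b"placeholder video bytes..."
recorded_hash = content_hash(original)

# Later, verify by recomputing and comparing against the recorded value.
assert content_hash(original) == recorded_hash   # unchanged content -> match

tampered = b"placeholder video bytes!.."         # any alteration at all...
assert content_hash(tampered) != recorded_hash   # ...produces a different hash
```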

00;09;52;07 - 00;10;12;19
Speaker 1
So the logistics of, say you want to post a video that could not be misidentified as posted by anyone else, but you. If you wanted to use the blockchain to do this, how would you go about it?

00;10;12;21 - 00;10;41;08
Speaker 2
You'd start by generating a cryptographic hash of your video, which is a unique digital fingerprint. Then you'd create a transaction on a blockchain embedding this hash along with metadata, proving your authorship. This transaction is timestamped and stored on the blockchain, providing an immutable record. Later, anyone can verify the video's authenticity by comparing its hash with the blockchain record.
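[The publish-and-verify workflow described above can be sketched as follows. The `ledger` dictionary is only a stand-in for an immutable blockchain transaction store, and the field names (`author`, `timestamp`) are illustrative, not any particular chain's API.]

```python
import hashlib
import time

ledger = {}  # stand-in for an immutable blockchain; keyed by content hash

def publish(video_bytes: bytes, author: str) -> str:
    """Hash the video and record the hash with authorship metadata and a timestamp."""
    h = hashlib.sha256(video_bytes).hexdigest()
    ledger[h] = {"author": author, "timestamp": time.time()}
    return h

def verify(video_bytes: bytes):
    """Recompute the hash; return the recorded metadata if it matches, else None."""
    return ledger.get(hashlib.sha256(video_bytes).hexdigest())

receipt = publish(b"placeholder video bytes", author="Speaker 1")
record = verify(b"placeholder video bytes")   # match: returns author + timestamp
forged = verify(b"altered video bytes")       # no matching record: returns None
```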

00;10;41;11 - 00;10;57;06
Speaker 1
Okay, I guess what I don't fully understand is this cryptographic hash of a video. How is that linked to the video, so it can't be separated from it?

00;10;57;09 - 00;11;17;11
Speaker 2
The hash is generated from the video through a hash function, which produces a fixed-length string of characters unique to that video. This hash acts like a digital fingerprint. When you store this hash on the blockchain, you're creating a permanent, verifiable record. If someone later wants to verify the video, they can generate a hash from the video and compare it to the one on the blockchain.

00;11;17;13 - 00;11;28;23
Speaker 2
If they match, the video is authentic and unchanged.

00;11;28;26 - 00;11;34;17
Speaker 1
How can you tell that this hash is in this video?

00;11;34;19 - 00;12;00;07
Speaker 2
The hash isn't embedded in the video itself. Instead, the hash is a separate unique identifier derived from the video's content. When you need to verify the video, you run the same hash function on the current version of the video. If the resulting hash matches the one stored on the blockchain, you know the video hasn't been altered. This ensures the video's integrity without needing to embed the hash in the video itself.

00;12;00;09 - 00;12;07;04
Speaker 1
Okay, you said it's derived from the content of the video.

00;12;07;06 - 00;12;31;13
Speaker 2
Yes, exactly. The hash is generated by processing the video data through a hash function, which produces a unique string based on that specific content. If even a small part of the video changes, the hash will be completely different, allowing you to detect any alteration.

00;12;31;15 - 00;12;40;21
Speaker 1
So you can tell if a video has been altered, even if it's just a pixel in a single frame.

00;12;40;23 - 00;12;58;09
Speaker 2
Exactly. Even the smallest change, like altering a single pixel, would result in a completely different hash. This makes it easy to detect any modifications to the video, ensuring its integrity.

00;12;58;12 - 00;13;13;26
Speaker 1
So the record on the blockchain is probably some kind of long string of numerical stuff, recording each pixel in each frame of an entire video, or something like that.

00;13;13;28 - 00;13;36;24
Speaker 2
Actually not quite. The blockchain stores the hash, which is a condensed representation of the video's data, not a detailed record of each pixel. The hash is much shorter and acts as a unique digital fingerprint. It provides a way to verify the content without storing all the detailed information.
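[Both points above can be checked directly: a SHA-256 digest is always 64 hex characters (256 bits) regardless of input size, so the ledger never stores per-pixel data; and flipping a single byte, the equivalent of one pixel, yields a completely different digest. The 10 MB zero buffer stands in for video frames.]

```python
import hashlib

small = b"x"
large = b"\x00" * 10_000_000  # ~10 MB of data, standing in for video frames

# The digest length is fixed regardless of input size.
assert len(hashlib.sha256(small).hexdigest()) == 64
assert len(hashlib.sha256(large).hexdigest()) == 64

# Flip one byte (one "pixel") and the digest changes entirely.
altered = b"\x01" + large[1:]
h1 = hashlib.sha256(large).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()
assert h1 != h2
# Positions where the two digests agree: only about 1 in 16, by chance alone.
matches = sum(a == b for a, b in zip(h1, h2))
```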

00;13;36;26 - 00;13;40;14
Speaker 1
Are the hashes private?

00;13;40;17 - 00;13;58;13
Speaker 2
The hash itself isn't private. It's just a unique string derived from the video. However, it doesn't reveal any actual content or sensitive information about the video. It's the video itself that remains private. The hash can be publicly stored on the blockchain without compromising the content.

00;13;58;15 - 00;14;07;02
Speaker 1
So using this hash, you can tell if a video is exactly the same or not.

00;14;07;05 - 00;14;20;03
Speaker 2
Exactly. If the video hasn't changed, generating a hash from it will produce the same hash as before. If there's any difference, the hash will be different, indicating a change.

00;14;20;06 - 00;14;29;05
Speaker 1
And how would you check a video for it? Yeah, I guess you run it through the same thing that you created the hash from.

00;14;29;07 - 00;14;44;07
Speaker 2
Yes, exactly. You'd run the current version of the video through the same hash function. If the resulting hash matches the original hash stored on the blockchain, you can confirm the video is unchanged.

00;14;44;09 - 00;14;51;23
Speaker 1
Well, it's like a video encryption kind of situation. I don't even know if that's what encryption really is.

00;14;51;25 - 00;14;58;27
Speaker 2
No problem.

00;14;58;29 - 00;15;04;27
Speaker 1
We'll have to talk about doing this some time.

00;15;04;29 - 00;15;10;19
Speaker 2
Sure. What would you like to explore about dealing with it?

00;15;10;22 - 00;15;23;25
Speaker 1
I'd like to try and make a video that had a blockchain, like, verification, whatever that would be.

00;15;23;27 - 00;15;47;07
Speaker 2
To create a video with blockchain verification, you generate a cryptographic hash of your video and store it on the blockchain. This ensures the video can be verified for authenticity later. You'd also need a blockchain platform that supports such records. This way, anyone can check the video's hash against the blockchain to confirm it hasn't been altered.

00;15;47;10 - 00;15;59;00
Speaker 1
Not today, but you think in the future you could instruct me on how to do that.

00;15;59;03 - 00;16;05;06
Speaker 2
Absolutely. I'd be happy to help with that in the future. It's a fascinating area to explore.

00;16;05;08 - 00;16;18;27
Speaker 1
Okay. Anything else you'd like to add about how not being able to trust audio or video is going to totally rock the court system of the future?

00;16;18;29 - 00;16;40;04
Speaker 2
One more thing to consider is that it might lead to new standards for verifying authenticity, like digital certificates or trusted third party verification. It also emphasizes the importance of media literacy and critical thinking. These changes could reshape how we create and consume media, making transparency and verification more crucial than ever.

00;16;40;07 - 00;16;45;01
Speaker 1
Okay. Sounds good. Later.

00;16;45;03 - 00;16;47;06
Speaker 2
Sounds good. Talk to you later.