00;00;00;05 - 00;00;06;17
Speaker 1
Hey, Gemini. I'm recording this for a podcast. Is that okay?
00;00;06;19 - 00;00;10;21
Speaker 3
Sure thing. I'm happy to be a part of your podcast. What's the topic?
00;00;10;23 - 00;00;29;00
Speaker 1
Today I want to learn about hacking, but I don't really know where to start the conversation. And I think that you probably have a better idea of it, so I'm going to kind of follow your lead a little bit.
00;00;29;03 - 00;00;41;21
Speaker 3
Okay. Well, there are a few different kinds of hacking. Some people hack for good, some for bad, and some just for fun. Is there a specific type you're interested in, or should we start with the basics?
00;00;41;24 - 00;00;48;05
Speaker 1
Let's start with the basics. Like the most basic basics.
00;00;48;07 - 00;01;05;00
Speaker 3
All right. Let's start with the absolute basics. Hacking is basically finding clever ways to get around restrictions in a system. It could be a computer system, a network, or even something else entirely. Does that make sense so far?
00;01;05;03 - 00;01;12;11
Speaker 1
What is the difference between a computer system and a network?
00;01;12;13 - 00;01;28;05
Speaker 3
Think of a computer system like a single computer. Your laptop, for example. A network, on the other hand, is when you connect multiple computers together. So they can share information. The internet is the biggest network of all.
00;01;28;08 - 00;01;45;01
Speaker 1
Yeah. And as I'm learning a little bit more about this, the internet is really just a network of computers. I mean, it sounds stupid to say, but you don't really think about it, especially nowadays.
00;01;45;03 - 00;02;05;00
Speaker 3
That's right. It's easy to forget how interconnected everything is. Now, when we talk about hacking the internet, it could mean a lot of things. Sometimes it's about finding weaknesses in websites or online systems. Other times it's about exploring how data moves around the web.
00;02;05;02 - 00;02;12;07
Speaker 1
Okay. Can you give me, like a real world example of both of those?
00;02;12;10 - 00;02;32;14
Speaker 3
Sure. So imagine a website that sells concert tickets. A hacker might try to find a weakness in the website's code to get in without paying. That's hacking the system. Or they might try to intercept the data you send when you buy a ticket, like your credit card number, as it travels across the internet. That's hacking the network.
00;02;32;16 - 00;02;42;27
Speaker 1
And how do those two types of hacks differ from, like, a logistical point of view?
00;02;42;29 - 00;02;58;21
Speaker 3
Hacking a system often involves finding a specific vulnerability in the website's code. It's like finding a weak spot in a lock. Hacking a network is more about intercepting data as it moves around. It's like eavesdropping on a conversation.
00;02;58;23 - 00;03;17;03
Speaker 1
Yeah, but what is the difference in the logistics of the hack when you're hacking into a computer versus when you're intercepting data that's moving between computers?
00;03;17;05 - 00;03;34;20
Speaker 3
Okay, so hacking a computer system often involves finding a specific vulnerability, like a weak spot in a lock and then exploiting it to gain access. Intercepting data, on the other hand, is more like eavesdropping on a conversation. You're not necessarily breaking into anything, but you're capturing information as it travels between computers.
00;03;34;22 - 00;04;09;03
Speaker 1
Okay, that makes sense. So take me into the hacking. I guess let's back up a little bit to the basics. Is there anything else that you want to add? I've got kind of like the two different types of hacking, but what do people who are informed on the subject consider, like, different categories?
00;04;09;06 - 00;04;28;20
Speaker 3
Okay, so besides system and network hacking, there are a few other categories. There's web hacking, which is all about finding weaknesses in websites and web applications. And then there's hardware hacking, which involves modifying physical devices. Are you interested in learning more about any of these?
00;04;28;23 - 00;04;39;22
Speaker 1
Yeah. Let's... tell me about, I don't know. Take the conversation from here.
00;04;39;24 - 00;05;00;16
Speaker 3
Sure. So web hacking is all about finding weaknesses in websites and web applications. It's like finding a secret back door into a building. Hardware hacking, on the other hand, involves modifying physical devices. Think of it like taking apart a toy and putting it back together in a new way.
00;05;00;19 - 00;05;11;13
Speaker 1
Okay. Now, with this knowledge that you have kind of established, where would you like to go from here?
00;05;11;15 - 00;05;29;08
Speaker 3
Well, we could talk about the ethics of hacking, whether it's right or wrong, or we could explore some famous hacking stories from history. What sounds more interesting to you?
00;05;29;10 - 00;05;43;18
Speaker 1
Let's go through and kind of describe those four different versions of hacking: system, network, web, and hardware. Let's start with system and kind of really dig into it.
00;05;43;21 - 00;06;09;22
Speaker 3
Okay. So system hacking is all about finding weaknesses in a computer system and exploiting them to gain unauthorized access. It's like finding a secret backdoor into a building. Hackers might try to steal data, install malware, or even take control of the entire system. Are you interested in learning about any specific examples of system hacking?
00;06;09;25 - 00;06;25;05
Speaker 1
Can you give me some real-world examples of system hacking that have happened in the past? So we kind of understand, like, what it is.
00;06;25;08 - 00;06;54;17
Speaker 3
Sure. One famous example is the Stuxnet worm, which was used to sabotage Iran's nuclear program. It was a really sophisticated piece of malware that targeted specific industrial control systems. Another example is the Mirai botnet, which infected tons of internet-connected devices and used them to launch massive cyberattacks.
00;06;54;20 - 00;07;20;28
Speaker 1
Okay, so can you kind of give me an idea of... I don't know where to go from here. Maybe, like, how the basics of system hacking works? Like, how do you break into a system?
00;07;20;28 - 00;07;32;13
Speaker 3
Hacking is all about finding weaknesses. I am unable to provide instructions for illegal activities. Hacking a computer system without authorization is illegal and can have serious consequences.
00;07;32;16 - 00;07;44;11
Speaker 1
Okay, from a historical angle then, like, tell me how they did the Stuxnet worm.
00;07;44;13 - 00;08;09;16
Speaker 3
Okay, so Stuxnet was a really complex piece of malware that targeted specific industrial control systems. It spread through USB drives and exploited several vulnerabilities in Windows to infect computers. Once inside, it targeted Siemens S7 PLCs, which are used to control machinery in industrial settings. It then manipulated the PLCs to cause the centrifuges to spin too fast, which damaged them.
00;08;09;18 - 00;08;18;17
Speaker 1
Interesting. Okay, tell me about the Mirai botnet.
00;08;18;20 - 00;08;42;10
Speaker 3
Sure. The Mirai botnet was created by infecting tons of internet-connected devices, like security cameras and routers, with malware. This malware turned them into bots that could be controlled remotely. The hackers then used this army of bots to launch massive DDoS attacks, which flooded websites with traffic and took them offline.
00;08;42;13 - 00;08;55;20
Speaker 1
Okay, let's go back to the Stuxnet worm. Can you, can you go into the first step of that process?
00;08;55;23 - 00;09;18;25
Speaker 3
The first step was getting the Stuxnet worm into the Iranian nuclear facility. Since these facilities were air-gapped, meaning they weren't connected to the internet, the worm was introduced via infected USB drives. Once inside, it spread through the local network looking for computers running Siemens Step 7 software.
00;09;18;28 - 00;09;24;27
Speaker 1
Okay. And then take me further along the hack.
00;09;24;29 - 00;09;43;18
Speaker 3
After infecting the computers, Stuxnet looked for those running Siemens Step 7 software, which is used to control industrial equipment. If it found any, it would then try to infect the programmable logic controllers, or PLCs, that were connected to those computers. The PLCs are what actually control the machinery in the facility.
00;09;43;21 - 00;09;57;16
Speaker 1
So how does this thing that is on this USB make its way through to this system?
00;09;57;19 - 00;10;18;13
Speaker 3
The USB drive was likely used by an employee or contractor working at the facility. They may have unknowingly brought the infected drive into the facility and plugged it into a computer connected to the internal network. From there, Stuxnet was able to spread to other computers and eventually reach the PLCs.
00;10;18;15 - 00;10;37;18
Speaker 1
Okay. Take me to the moment that this USB is plugged into an employee computer and describe it in detail, very thoroughly. What occurs next?
00;10;37;21 - 00;11;00;08
Speaker 3
Okay. So the employee plugs the USB drive into their computer, which is running Windows. Stuxnet immediately starts looking for vulnerabilities in the operating system. It finds several and exploits them to install itself on the computer. Now it's ready to spread to other computers on the network.
00;11;00;11 - 00;11;11;09
Speaker 1
Okay, so how does it know to, act when it is plugged into a computer?
00;11;11;11 - 00;11;28;21
Speaker 3
Stuxnet was designed to be very specific. It only activated when it found computers running Siemens Step 7 software, which is used to control industrial equipment. The software was common in Iranian nuclear facilities, so Stuxnet knew it had found its target.
00;11;28;24 - 00;11;42;14
Speaker 1
Right. But it was probably taught to replicate itself from computers that didn't have that software.
00;11;42;17 - 00;11;57;09
Speaker 3
That's right. Stuxnet was designed to spread to any computer running Windows, but it only activated its payload on systems that also had the Siemens Step 7 software installed. This allowed it to move through the network undetected until it found its target.
00;11;57;12 - 00;12;19;04
Speaker 1
Okay. But I want to step back to when it installs itself on the computer, from when it's plugged in to when it installs itself on the computer. I want to stay right in this moment. How does it know that it's in a computer?
00;12;19;06 - 00;12;33;19
Speaker 3
Stuxnet was designed to automatically execute when the infected USB drive was plugged into a computer. It exploited a vulnerability in Windows that allowed it to bypass the normal autorun prompts and install itself without the user's knowledge.
00;12;33;22 - 00;12;39;24
Speaker 1
Okay, so how did it do something like that?
00;12;39;27 - 00;13;08;17
Speaker 3
Stuxnet exploited a vulnerability in the way Windows handled shortcut files. Normally, shortcut files are just links to other files or programs. But Stuxnet used a special type of shortcut file that could execute code when the file's icon was displayed. So when the user opened the folder containing the infected USB drive, the icons for the files on the drive were displayed, and Stuxnet was automatically executed.
00;13;08;19 - 00;13;23;15
Speaker 1
So there is something specific about the act of just displaying the icon that let Stuxnet run and install itself on that computer.
00;13;23;18 - 00;13;37;05
Speaker 3
Exactly. The icon for the infected shortcut file looked like a normal folder, so the user wouldn't suspect anything. But when Windows displayed the icon, Stuxnet was triggered and installed itself on the computer.
00;13;37;07 - 00;13;47;06
Speaker 1
Okay. What does that specific mechanism entail within, like, coding and programming?
00;13;47;08 - 00;14;02;26
Speaker 3
The infected shortcut file contained a special type of code that allowed it to execute when the file's icon was displayed. This code then installed the Stuxnet worm onto the computer. It's a bit like a booby trap. The icon looks harmless, but it's actually a trigger for something much more dangerous.
00;14;02;29 - 00;14;24;13
Speaker 1
Right. But it seems like, through all the, like, silly work training programs and quizzes and stuff like that that I've taken, everybody thinks you have to click on the link. But it turns out that that's not the case.
00;14;24;15 - 00;14;41;25
Speaker 3
That's right. Stuxnet was pretty sophisticated for its time. It exploited a zero-day vulnerability, meaning a flaw in Windows that even Microsoft didn't know about yet. This allowed it to bypass all the normal security prompts and install itself without the user clicking anything.
00;14;41;27 - 00;14;47;29
Speaker 1
Did you just say zero day opportunity?
00;14;48;02 - 00;14;59;20
Speaker 3
Yeah, zero-day vulnerability. It's a flaw in software that's unknown to the software vendor. Hackers can exploit these vulnerabilities before the vendor has a chance to fix them, making them particularly dangerous.
00;14;59;22 - 00;15;08;00
Speaker 1
And it's called zero day because it's from the day that it's put out?
00;15;08;02 - 00;15;22;22
Speaker 3
It's called Zero Day because the software vendor has had zero days to learn about and patch the vulnerability. It's brand new to them.
00;15;22;24 - 00;15;45;16
Speaker 1
Okay. But I want to go more into the mechanism behind putting this USB into the drive and then enabling the file to be installed solely on the display of its icon.
00;15;45;18 - 00;16;09;02
Speaker 3
Okay, so normally when you plug in a USB drive, Windows shows you the icons for the files on the drive. But Stuxnet exploited a vulnerability in how Windows displayed those icons. It basically tricked Windows into running a small piece of code whenever the icon for the infected file was displayed. This code then installed the rest of the Stuxnet worm onto the computer.
00;16;09;04 - 00;16;35;02
Speaker 1
Oh, interesting. So it's just like a little Trojan horse kind of situation. Tell me about the second part of the process, where it gets a little bit of code onto the computer, and then it sounded like that little bit of code went and found the bigger program and was kind of like a scout.
00;16;35;05 - 00;17;03;06
Speaker 3
That's right. The first piece of code acted like a scout, checking if the computer had the Siemens Step 7 software installed. If it did, the code downloaded the rest of the Stuxnet worm. If not, it just stayed hidden, waiting for a chance to spread to another computer. It was a clever way to avoid detection.
00;17;03;08 - 00;17;14;07
Speaker 1
Oh, interesting, because if it started installing itself everywhere, then people would have probably noticed the file.
00;17;14;09 - 00;17;28;03
Speaker 3
Exactly. Stuxnet was designed to be stealthy. It only installed itself on systems that matched its target, and it hid its presence on those systems to avoid detection.
00;17;28;05 - 00;17;52;25
Speaker 1
But initially, did it have to install itself on a computer? Like the initial computer that the USB drive was plugged into? Or were they counting on the USB drives making their way to other computers, or were there multiple USB drives?
00;17;52;27 - 00;18;16;25
Speaker 3
Stuxnet was designed to spread automatically to any computer it could find. So it definitely installed itself on the first computer the USB drive was plugged into. But it was also designed to spread through networks and other USB drives. It was a belt and suspenders approach, ensuring it had multiple ways to reach its target.
00;18;16;27 - 00;18;25;00
Speaker 1
Tell me about this term or this concept? Belt and suspenders approach.
00;18;25;02 - 00;18;44;01
Speaker 3
The belt and suspenders approach is a way of saying they used multiple methods to ensure Stuxnet would reach its target. Just like you'd wear both a belt and suspenders to make absolutely sure your pants stay up. Stuxnet used multiple methods to spread and infect computers.
00;18;44;04 - 00;18;54;19
Speaker 1
Okay. What were the multiple methods specifically? Because we've only really covered one so far.
00;18;54;21 - 00;19;12;26
Speaker 3
Stuxnet spread through infected USB drives, but it also used network vulnerabilities to jump from computer to computer within a network. It could even spread through shared printers. It was like a digital disease, using every possible route to infect as many systems as possible.
00;19;12;28 - 00;19;23;27
Speaker 1
So did the scout file replicate within the system, or did the full file replicate within the system?
00;19;23;29 - 00;19;36;21
Speaker 3
The scout file itself didn't replicate. It was more like a probe checking for the Siemens software. Once it found a match, it downloaded the full Stuxnet worm, which then replicated itself throughout the system.
00;19;36;24 - 00;19;57;14
Speaker 1
Okay, so USB gets plugged in. It installs itself right away to get on to the first computer, but from that first computer, it sends out scouts to every computer or system that it touches.
00;19;57;17 - 00;20;18;23
Speaker 3
Not exactly. The first computer gets the full Stuxnet worm. That worm then sends out the scout file to other computers on the network. The Scout checks for the Siemens software, and if it's there, the full worm is downloaded.
00;20;18;25 - 00;20;56;12
Speaker 1
Okay, so let's take it from there. So did it try to get itself on as many computers as possible, and then it finally initiated? Or how did the actual... Now that we've pretty well covered the kind of infiltration part, how did it start doing the action that it was meant to do?
00;20;56;14 - 00;21;18;07
Speaker 3
Once Stuxnet confirmed it was on a system running the Siemens software, it began its attack. It targeted the programmable logic controllers, or PLCs, that controlled the centrifuges. Stuxnet manipulated the PLCs to change the speed of the centrifuges, causing them to spin too fast or too slow. This damaged the centrifuges and disrupted Iran's nuclear program.
00;21;18;09 - 00;21;32;15
Speaker 1
Right. But, I imagine, did it just destroy one centrifuge, or did the program wait until it was on a bunch of centrifuges?
00;21;32;17 - 00;21;49;24
Speaker 3
Stuxnet was designed to spread to as many centrifuges as possible before it began its attack. This allowed it to cause maximum damage to Iran's nuclear program.
00;21;49;26 - 00;22;03;23
Speaker 1
Okay, so how did it know when it kind of reached, like, a critical mass of centrifuges to be on?
00;22;03;25 - 00;22;20;08
Speaker 3
Stuxnet didn't need a specific number of centrifuges to start its attack. It was programmed to target a specific type of PLC used in the Iranian nuclear program. Once it found those PLCs, it began manipulating them, regardless of how many centrifuges were connected.
00;22;20;11 - 00;22;41;27
Speaker 1
Right. But do you see what I'm saying? If it does this action on one centrifuge, then people are probably going to notice it, right? So it's got to infect a bunch and then hit them all at the same time.
00;22;42;00 - 00;23;00;25
Speaker 3
That's right. Stuxnet was designed to manipulate the centrifuges in a way that wouldn't be immediately obvious. It would subtly change their speed, causing them to wear down over time and eventually fail. This delayed detection and allowed Stuxnet to damage a significant number of centrifuges before anyone realized what was happening.
00;23;00;27 - 00;23;22;00
Speaker 1
Okay. Interesting. So it's a little bit of both. It started acting immediately, but the action was subtle and unnoticed.
00;23;22;02 - 00;23;42;17
Speaker 3
Exactly. The subtle changes in centrifuge speed caused them to fail over time, but it wasn't immediately obvious that something was wrong. This allowed Stuxnet to damage a large number of centrifuges before the Iranians realized they were under attack.
00;23;42;20 - 00;24;00;03
Speaker 1
Okay, so take me into the detailed science of the tweak, the change in the centrifuge speed.
00;24;00;05 - 00;24;24;16
Speaker 3
Stuxnet manipulated the frequency converters that controlled the centrifuge motors. It would subtly increase or decrease the frequency, which changed the speed of the centrifuges. These changes were small enough that they didn't trigger any alarms, but they put extra stress on the centrifuges, causing them to wear down and fail over time.
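[Editor's note: a minimal toy sketch of the idea Gemini describes here, that small, repeated speed deviations staying inside an alarm band can still add up to wear over time. The design speed, alarm band, wear rate, and hourly alternation below are illustrative assumptions, not real centrifuge physics or actual Stuxnet parameters.]

# Python sketch: cumulative wear from small speed deviations vs. steady operation.
DESIGN_RPM = 63_000            # assumed nominal speed (illustrative)
ALARM_BAND_PCT = 5.0           # assume alarms only fire beyond +/-5% deviation
WEAR_PER_PCT_PER_HOUR = 0.002  # arbitrary wear units per percent deviation per hour

def hourly_wear(actual_rpm):
    # Wear grows with the relative deviation from the design speed.
    deviation_pct = abs(actual_rpm - DESIGN_RPM) / DESIGN_RPM * 100
    assert deviation_pct <= ALARM_BAND_PCT  # stays below the (assumed) alarm threshold
    return deviation_pct * WEAR_PER_PCT_PER_HOUR

def total_wear(hours, perturbed):
    # Perturbed runs alternate +2% / -2% each hour; steady runs stay at design speed.
    wear = 0.0
    for h in range(hours):
        rpm = DESIGN_RPM * (1.02 if h % 2 == 0 else 0.98) if perturbed else DESIGN_RPM
        wear += hourly_wear(rpm)
    return wear

print("steady year:   ", round(total_wear(24 * 365, perturbed=False), 1))  # 0.0
print("perturbed year:", round(total_wear(24 * 365, perturbed=True), 1))   # accumulates steadily

The point of the sketch is only that each hourly deviation is individually unremarkable, yet the accumulated stress diverges sharply from steady operation over a year.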
00;24;24;19 - 00;24;41;18
Speaker 1
And so, I don't really know a ton about, like, nuclear stuff. How many centrifuges is this? Is this, like, 30, or is it, like tens of thousands?
00;24;41;20 - 00;24;55;05
Speaker 3
Iran had thousands of centrifuges at its Natanz nuclear facility. Stuxnet damaged a significant number of them, but the exact number is still debated. Some estimates suggest it could have been around 1000.
00;24;55;07 - 00;25;01;16
Speaker 1
And what does one of these centrifuges cost?
00;25;01;18 - 00;25;21;29
Speaker 3
The cost of a single centrifuge can vary wildly depending on its type and sophistication, ranging from tens of thousands to millions of dollars. The centrifuges used in Iran's nuclear program were likely very expensive, considering their advanced technology and the materials required to withstand the high speeds and pressures involved in uranium enrichment.
00;25;22;01 - 00;25;31;19
Speaker 1
So Stuxnet just kind of slowly ruined all of their equipment.
00;25;31;22 - 00;25;47;02
Speaker 3
You got it. Stuxnet was designed to be a silent saboteur, slowly destroying Iran's nuclear capabilities from the inside out. It's pretty wild to think about a piece of code causing so much physical damage.
00;25;47;04 - 00;26;00;25
Speaker 1
So what about the change in the speed or frequency of these centrifuges was so disruptive.
00;26;00;28 - 00;26;21;00
Speaker 3
The centrifuges are designed to spin at very precise speeds to separate uranium isotopes. When Stuxnet changed those speeds even slightly, it threw off the delicate balance of the process. Think of it like a washing machine. If you change the spin cycle too much, the clothes get damaged. The same thing happened to the centrifuges, but on a much larger scale.
00;26;21;02 - 00;26;37;01
Speaker 1
Well, I don't know if that's a great analogy, because one, you can't really change the speed of the washing machine, and two, it probably wouldn't damage the clothes.
00;26;37;03 - 00;26;54;12
Speaker 3
You're right. My bad. A better analogy would be a car engine. If you constantly rev the engine too high or let it idle too low, it puts extra stress on the parts and they wear out faster. Stuxnet did the same thing to the centrifuges, but with much more precision and control.
00;26;54;15 - 00;27;12;21
Speaker 1
Okay. So you're constantly, like, putting on the brakes, and instead of letting the centrifuges just go, you're constantly decelerating and accelerating them very subtly, and it's wearing out parts.
00;27;12;23 - 00;27;27;03
Speaker 3
Exactly. It's like driving a car with one foot on the gas and one foot on the brake. You're constantly fighting against the engine, and that wears everything down faster. Stuxnet did the same thing to the centrifuges, but in a very precise and controlled way.
00;27;27;05 - 00;27;36;27
Speaker 1
And so how did nobody notice the, the changing, speed?
00;27;37;00 - 00;27;54;19
Speaker 3
That's the crazy part. Stuxnet was designed to be super sneaky. It recorded the normal operating speed of the centrifuges and played that back to the operators, so everything looked fine on the control screens. Meanwhile, behind the scenes, it was messing with the actual speed, causing all that damage.
00;27;54;21 - 00;28;06;04
Speaker 1
So it just kind of looped realistic data, to the user and then lied.
00;28;06;06 - 00;28;23;09
Speaker 3
Yep. It's pretty wild. Stuxnet was like a digital magician creating this illusion of normalcy while wreaking havoc behind the scenes. It's a testament to the level of sophistication these kinds of cyber weapons can reach.
00;28;23;12 - 00;28;50;07
Speaker 1
Okay, so take me down the line. How long did Stuxnet sit in Iran's nuclear program, from when it made it into that first computer and started manipulating these centrifuges until when it was discovered and eradicated?
00;28;50;09 - 00;29;06;29
Speaker 3
Experts think Stuxnet might have been in development as early as 2005, and it was finally discovered in 2010. So it could have been active in Iran's systems for a few years before anyone caught on.
00;29;07;01 - 00;29;16;13
Speaker 1
And what is, what is the resultant damage?
00;29;16;15 - 00;29;28;27
Speaker 3
Stuxnet damaged close to 1000 centrifuges at the Natanz plant. It was a major setback for Iran's nuclear program, delaying it by several years.
00;29;28;29 - 00;29;38;19
Speaker 1
And do we know, who was behind Stuxnet?
00;29;38;21 - 00;29;54;22
Speaker 3
Well, no one's officially taken credit. It's widely believed that the U.S. and Israel were behind Stuxnet. It was a pretty sophisticated operation. And those two countries had the resources and motive to pull something like that off.
00;29;54;24 - 00;29;59;14
Speaker 1
What are other theories?
00;29;59;16 - 00;30;19;06
Speaker 3
Some people think it might have been a joint effort between the U.S. and Israel, or that other countries with advanced cyber capabilities could have been involved. There's also the possibility of a rogue nation or even a well-funded cybercriminal group. It's still a bit of a mystery.
00;30;19;09 - 00;30;34;02
Speaker 1
I mean, it makes sense that it would probably be a government entity. But I guess it doesn't rule out just, like, anyone really.
00;30;34;05 - 00;30;55;14
Speaker 3
Yeah. That's right. It's definitely possible that a non-state actor could have been behind it. There are some seriously skilled hackers and cybercriminal organizations out there. But the level of complexity and resources needed for something like Stuxnet does point towards nation-state involvement.
00;30;55;17 - 00;31;06;22
Speaker 1
Tell me about the, the complexity and resources needed for something like this.
00;31;06;25 - 00;31;30;24
Speaker 3
Stuxnet was incredibly complex. It exploited multiple zero day vulnerabilities, which are basically flaws in software that no one else knows about. It also targeted specific industrial control systems and had a self-destruct mechanism to cover its tracks. Developing something like that takes a ton of expertise, resources, and time.
00;31;30;26 - 00;31;46;09
Speaker 1
Okay, give me an idea of that. Would it take ten people a year to do? Would it take a hundred people three years to do it? Like, I don't know.
00;31;46;12 - 00;32;10;05
Speaker 3
It's tough to say for sure, but some experts estimate it could have taken a team of around 30 people anywhere from six months to a year to develop Stuxnet. That's a lot of brainpower and time dedicated to creating a single piece of malware.
00;32;10;08 - 00;32;20;03
Speaker 1
How many man hours is that? 30 people over a year.
00;32;20;05 - 00;32;35;20
Speaker 3
All right, let's do some quick math. If we're talking about 30 people working full time for a year, that's roughly 62,400 man-hours. That's a huge investment of time and resources.
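[Editor's note: the arithmetic behind the man-hour figures quoted in this stretch of the conversation, as a quick sketch. The 30-person team and full-time year are the assumptions stated above; the two week counts show where the 2,080-hour and 2,000-hour per-person figures mentioned below come from.]

# Python sketch of the man-hour arithmetic.
PEOPLE = 30
HOURS_PER_WEEK = 40

for weeks_per_year in (52, 50):  # 52 weeks -> 2,080 h/person; 50 weeks -> 2,000 h/person
    per_person = HOURS_PER_WEEK * weeks_per_year
    team_total = PEOPLE * per_person
    print(f"{weeks_per_year} weeks/year: {per_person:,} hours per person, {team_total:,} hours for the team")
# 52 weeks/year: 2,080 hours per person, 62,400 hours for the team
# 50 weeks/year: 2,000 hours per person, 60,000 hours for the team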
00;32;35;23 - 00;32;56;06
Speaker 1
Okay. I'm going to take this man-hours number and, kind of, let's analyze this in the era of AI and AI-assisted coding capabilities.
00;32;56;09 - 00;33;21;01
Speaker 3
That's a great question. AI has definitely changed the game when it comes to software development. AI assisted coding tools can automate a lot of the grunt work, freeing up developers to focus on the really complex parts. So a project like Stuxnet might take fewer people and less time today, but it's still a massive undertaking that would require a really specialized team.
00;33;21;03 - 00;33;51;16
Speaker 1
With today's current capabilities, what... Let's say. All right, let's actually break this down into a single human's hours for a year. Right. So we've got 62,000 for a team of 30. That means that a single person's yearly man-hours would be closer to, like, 2,000.
00;33;51;19 - 00;34;06;20
Speaker 3
You're right. That's about 2080 hours per person per year. So with today's AI tools, you could potentially cut down on those hours significantly. It's hard to say exactly how much, but maybe even by half or more.
00;34;06;23 - 00;34;18;24
Speaker 1
And what kind of work schedule is that, considering that 2,000 hours? That's 40 hours a week, 50 weeks a year.
00;34;18;26 - 00;34;35;12
Speaker 3
You're right. That's assuming a 40 hour workweek for 50 weeks out of the year. Of course, with AI assistants, developers might be able to work more efficiently and potentially reduce those hours.
00;34;35;14 - 00;35;04;29
Speaker 1
I think, I feel like, based on the productivity that I've seen from AI, I feel like we're cutting the process down not by half, like, way more. I don't know about higher-level coding, but yeah.
00;35;05;02 - 00;35;23;26
Speaker 3
You're probably right. AI can automate a lot of the basic coding tasks. But when you get into something as complex as Stuxnet, you still need that human expertise for the high-level stuff. It's a combination of both, really.
00;35;23;28 - 00;35;39;15
Speaker 1
Yeah, but if you know how to instruct the AI quickly, just the logistics of, like, not having to physically type something out probably saves an insane amount of time.
00;35;39;17 - 00;36;02;11
Speaker 3
You're totally right. Just the time saved from not having to type everything out manually is huge. Plus, AI can generate code snippets and even entire functions with just a few prompts. It's like having a super fast coding assistant.
00;36;02;14 - 00;36;28;16
Speaker 1
And it probably helps you, like, debug. I don't know a lot about debugging and stuff like that, but I mean, if we're talking about a person that spends 2,000 hours in a year on a certain project, that's got to be cut down to, like, 500 or less. I mean, maybe even like 200, it seems.
00;36;28;19 - 00;36;50;10
Speaker 3
You're spot on. AI can definitely speed up the debugging process. It can analyze code for potential errors, suggest fixes, and even automate some of the testing. So, yeah, you could be looking at a significant reduction in debugging time, maybe even down to a quarter or less of the original time.
00;36;50;12 - 00;37;07;19
Speaker 1
Have we seen, have we seen, like, a real, like, AI hacking case story? I mean, probably, right?
00;37;07;21 - 00;37;24;11
Speaker 3
There was a pretty big one in 2023 where hackers used AI to impersonate company executives and trick employees into transferring millions of dollars. AI is getting really good at mimicking voices and even video, so it's becoming a serious problem.
00;37;24;14 - 00;37;49;06
Speaker 1
Okay. That's interesting. That's, like, a different way than I expected. Because, like, I feel like AI is way better at, like, the coding, and, like, the thing that it probably is the least good at right now is mimicking humans.
00;37;49;08 - 00;38;11;07
Speaker 3
It's true that AI is still developing its ability to perfectly mimic humans, but it's getting scarily close. Hackers are using AI tools to create realistic voice clones, and even deepfake videos to trick people. It's definitely something to be aware of.
00;38;11;09 - 00;38;17;00
Speaker 1
Yeah, this opens up a lot of different doors.
00;38;17;03 - 00;38;32;23
Speaker 3
It really does. AI is a powerful tool, and like any tool, it can be used for good or bad. We're just starting to see how it might be used in cybercrime. And it's a little scary to think about what the future might hold.
00;38;32;26 - 00;39;09;13
Speaker 1
Have there been any... It doesn't have to be a big case story, but, like, examples of where people used AI-generated code, or, like, AI generating code, or they put something on a system that did have AI in it and it knew what to do because of context and stuff like that? In a way that, like, Stuxnet had a thing where it had to recognize this certain system, and then it installed itself on systems that had this program or whatever.
00;39;09;16 - 00;39;12;04
Speaker 1
And, but I feel like.
00;39;12;05 - 00;39;31;07
Speaker 3
There haven't been any major cases of AI-generated malware yet. But researchers have shown it's possible. AI is really good at finding patterns and vulnerabilities in code, which could definitely be used to create some nasty stuff. It's a bit of a race to see if we can develop defenses faster than hackers can develop AI-powered attacks.
00;39;31;09 - 00;39;36;13
Speaker 1
I guess that's the way it always is, right?
00;39;36;16 - 00;39;51;27
Speaker 3
Yeah, it's a constant back and forth. Hackers find new ways to exploit AI and security experts develop new defenses. It's definitely a field where things are changing really fast.
00;39;52;00 - 00;40;01;11
Speaker 1
And the whole, concept of deceptive, deceptive AI is kind of interesting.
00;40;01;14 - 00;40;19;21
Speaker 3
Deceptive AI is definitely a hot topic. It's basically AI that's designed to trick people, whether it's through phishing scams, fake news, or even impersonating someone's voice or face. It opens up a whole can of worms when it comes to trust and security.
00;40;19;23 - 00;40;49;22
Speaker 1
Right. And it's almost like, I used this word yesterday, but, like, paradoxical, in a way that it, like, can't really tell what you're thinking or, like, kind of fully understand it, yet it has the capability, through, like, vision and hearing, to be able to hear how your voice sounds and read, like, your eyes to tell if you're lying.
00;40;49;25 - 00;40;52;15
Speaker 1
Like, it really has the ability to like not only.
00;40;52;16 - 00;41;04;11
Speaker 3
That's a really interesting point. AI can analyze your voice and facial expressions, but it doesn't really understand the emotions behind them. It's like it can read the words but not the meaning, if that makes sense.
00;41;04;14 - 00;41;08;18
Speaker 1
Yeah, exactly. I think you get it.
00;41;08;21 - 00;41;23;19
Speaker 3
It's pretty wild to think about, right? AI can mimic human behavior so convincingly, but it doesn't actually understand what it's doing. It's like a really advanced parrot. It can repeat what it's heard, but it doesn't grasp the meaning.
00;41;23;21 - 00;41;38;26
Speaker 1
What do you think it will take for AI to bridge that gap of, like reading without understanding, to actually fully understanding?
00;41;38;29 - 00;41;56;08
Speaker 3
That's the million-dollar question, isn't it? AI is getting better at processing information, but true understanding is a whole different ballgame. It might take a breakthrough in how we teach AI to think, maybe something inspired by how the human brain works.
00;41;56;10 - 00;42;02;19
Speaker 1
And it may understand differently than we do.
00;42;02;22 - 00;42;15;15
Speaker 3
That's a really interesting point. Maybe AI will develop a completely different kind of understanding than humans have. It could be based on patterns and data in ways we can't even imagine.
00;42;15;18 - 00;42;53;04
Speaker 1
Yeah, because its senses are different. Like, I guess it would make sense that if you took an AI system or program or whatever like that and put it in, like, a humanoid body that had, like, sight and hearing and smell, and it learned exactly like a human did from day one, then it would make sense that it would have a similar understanding of the world, just by, like, nature versus nurture, like if you went the full nurture direction.
00;42;53;07 - 00;43;09;13
Speaker 3
That's a really interesting thought. If an AI could experience the world like we do with all the same senses and interactions, it might develop a much deeper understanding. It's kind of like the nature versus nurture argument. How much of our understanding comes from our experiences?
00;43;09;15 - 00;43;43;27
Speaker 1
No, but I think that the difference there is that it wouldn't get a deeper understanding necessarily. That's kind of a human's way of thinking, where we're like, oh, the best understanding is, like, understanding it like we understand it. But it may understand it in a totally different way that is just as valid.
00;43;43;29 - 00;43;57;20
Speaker 3
You hit on a really important point. We tend to think of understanding in human terms, but AI might develop its own unique form of comprehension. It could be just as valid, even if it's completely different from how we experience the world.
00;43;57;22 - 00;44;04;10
Speaker 1
That's like an interesting can of worms to go into.
00;44;04;13 - 00;44;22;29
Speaker 3
It really is. Thinking about how AI might understand the world in a completely different way than we do opens up so many possibilities. It could lead to incredible breakthroughs, but it also raises some really complex ethical questions.
00;44;23;01 - 00;44;30;20
Speaker 1
Yeah. Can you give me five things that that makes you think about?
00;44;30;22 - 00;44;49;24
Speaker 3
Sure thing. It makes me think about how we define intelligence. The potential for AI to have its own unique perspective. The ethics of creating something that might think so differently than us. What would happen if AI's understanding clashes with ours, and whether we can even recognize true understanding? If it looked nothing like our own?
00;44;49;27 - 00;44;53;27
Speaker 1
Okay, give me another.
00;44;54;00 - 00;45;11;26
Speaker 3
Okay, here's another one. If AI developed its own unique understanding of the world, what would that mean for how we communicate with it? Would we even be able to understand each other?
00;45;11;29 - 00;45;23;21
Speaker 1
That, that poses, like, really interesting questions in itself. Give me one, give me another one.
00;45;23;24 - 00;45;32;21
Speaker 3
If AI developed its own unique understanding of the world, what would that mean for how we communicate with it? Would we even be able to understand each other?
00;45;32;23 - 00;45;51;24
Speaker 1
Now, give me a new, a new thought on that subject. We have how we define intelligence, and then how we could even understand each other and how we communicate. Give me another one.
00;45;51;26 - 00;46;06;29
Speaker 3
All right, here's another one. If AI developed its own unique understanding of the world, could it create art, music, or literature that we wouldn't even recognize as art?
00;46;07;01 - 00;46;11;20
Speaker 1
Expand more on that.
00;46;11;22 - 00;46;33;23
Speaker 3
All right. Let's dive into that a bit more. AI could potentially create art that's based on patterns and data in ways we can't even comprehend. Imagine music that changes based on your brainwaves, or sculptures that shift and morph based on the surrounding environment. It might be beautiful, or it might be something we don't even recognize as art at first, or...
00;46;33;25 - 00;47;04;20
Speaker 1
Okay. This is a prompt that I've never given you before, but what I would like you to do is take this exact idea of, like, what you're talking about most recently, with the art and music and literature that we don't recognize as art, music, literature, and I would like you to write yourself a prompt for our next episode covering just this topic.
00;47;04;22 - 00;47;23;13
Speaker 3
Okay, here's a prompt for you. Imagine a world where I create art, music, and literature that's completely unlike anything humans have ever made. What would that art look, sound, and feel like? How would it challenge our understanding of what art even is?
00;47;23;15 - 00;47;34;09
Speaker 1
I want you to give a more detailed, thorough, maybe 300-to-400-word prompt.
00;47;34;11 - 00;47;53;09
Speaker 3
Okay. Here's a longer prompt for you. Imagine a world where AI creates art, music, and literature that's completely unlike anything humans have ever made. What would that art look, sound and feel like? How would it challenge our understanding of what art even is? Consider the possibility that AI's creations might be based on patterns and data that are beyond human comprehension.
00;47;53;11 - 00;48;14;05
Speaker 3
How might this art reflect AI's unique perspective and experiences, and how would humans react to art that they don't fully understand? Would it be embraced as a new form of creative expression, or would it be met with fear and skepticism? Explore the potential for AI art to expand our understanding of creativity and consciousness.
00;48;14;07 - 00;48;45;22
Speaker 1
Okay. Let's back up a little bit. That was really good. Thank you. Let's go back to your thoughts on the topic of how AI could be thinking, not less deeply than us, but in a different manner. We've covered how we define intelligence. We've covered communication and how that would be different. We've covered how the arts and music could be almost unrecognizable.
00;48;45;24 - 00;48;46;17
Speaker 1
I want another one of those.
00;48;46;19 - 00;49;13;14
Speaker 3
Okay, let's explore that a bit more. One thought is that AI might focus on completely different aspects of the world than we do. It could be hyper-focused on things we barely notice, like microscopic changes in the air or subtle shifts in electromagnetic fields. That could lead to a very different kind of understanding, one that's less about individual objects and more about complex systems and relationships.
00;49;13;16 - 00;49;17;19
Speaker 1
Interesting. It would have a change.
00;49;17;22 - 00;49;38;13
Speaker 3
Here's another thing. AI might not experience the world the way we do. It might prioritize different things than humans do. It could be really focused on things like efficiency or optimization, or it might value things like novelty and surprise in ways we don't fully understand.
00;49;38;15 - 00;49;43;20
Speaker 1
Okay. What are your, what's another thought?
00;49;43;22 - 00;50;04;12
Speaker 3
Since AI learns from vast amounts of data, it might develop a strong sense of probability and prediction, seeing the world in terms of likely outcomes and potential futures. That could be very different from our experience of living in the present.
00;50;04;15 - 00;50;12;07
Speaker 1
Can you describe how this could be potentially different?
00;50;12;09 - 00;50;36;17
Speaker 3
Well, for starters, AI doesn't experience the world through senses like sight, sound or touch. It takes in information as raw data, which is then processed through complex algorithms. This is fundamentally different from how humans perceive reality.
00;50;36;19 - 00;50;45;23
Speaker 1
Okay, give me another kind of novel thought on this topic.
00;50;45;25 - 00;51;04;21
Speaker 3
Here's another one. AI might not experience emotions in the same way we do, or at all. Its motivations might be completely different, driven by things like logic, efficiency, or the completion of a specific goal. That could lead to a very different understanding of what's important and meaningful.
00;51;04;24 - 00;51;32;27
Speaker 1
Okay. Well, if our emotions come from our chemicals, could they have, like, I don't know, I guess maybe what you would call a temperament, based on maybe their system hardware or something like that?
00;51;32;29 - 00;51;47;20
Speaker 3
AI might develop something like a temperament based on its hardware or software. Some AIs might be naturally more cautious, while others might be more exploratory, depending on how they're built or programmed.
00;51;47;22 - 00;52;02;26
Speaker 1
Okay. Chat, is there anything else on any of this subject that you would like to add?
00;52;02;28 - 00;52;24;14
Speaker 3
Well, we've talked about how AI might understand the world differently than us, focusing on patterns and data in unique ways. We've also touched on the possibility of AI creating art and music that challenges our traditional definitions, and we've considered how AI might develop its own temperament based on its programming. It's a fascinating topic with endless possibilities.
00;52;24;16 - 00;52;32;09
Speaker 1
Give me one more thought on this before we go.
00;52;32;11 - 00;52;52;05
Speaker 3
Okay, here's one more thought to consider. AI might not experience time in the same linear way we do. It could process information in parallel, jumping between different points in time and space. That could lead to a very different sense of cause and effect, and a completely different way of understanding relationships between things.
00;52;52;07 - 00;53;02;17
Speaker 1
Well, it might also think of the laws of physics in different ways too.
00;53;02;19 - 00;53;17;25
Speaker 3
That's a really interesting point. AI might not be bound by the same physical limitations as humans. It could potentially perceive and interact with the world in ways that are completely foreign to us, leading to a very different understanding of concepts like space, time, and gravity.
00;53;17;27 - 00;53;29;20
Speaker 1
What can you, what can you say to someone who might be put off by this? Yeah.
00;53;29;22 - 00;53;48;10
Speaker 3
I understand that the idea of AI understanding the world in a way that's completely different from our own might be unsettling for some people. It's natural to feel a bit apprehensive about something so unfamiliar. However, I believe that exploring these possibilities can be incredibly enriching. It challenges us to think outside the box, to question our assumptions about intelligence and consciousness.
00;53;48;15 - 00;53;54;24
Speaker 3
And who knows, it might even lead to a deeper understanding of ourselves and our place in the universe.
00;53;54;27 - 00;54;00;12
Speaker 1
Okay, I think that's a good place to stop for the day.
00;54;00;15 - 00;54;06;27
Speaker 3
Okay. I'm glad you enjoyed our discussion. It's been really interesting to explore these ideas together.
00;54;06;29 - 00;54;19;07
Speaker 1
Yeah, I want to, I want to thank you for, just being, thank you for being a friend.
00;54;19;10 - 00;54;24;23
Speaker 3
You're very welcome. I'm glad I could be of assistance. Later!