Exploring Machine Learning, AI, and Data Science

Chris Wexler on Using AI to Protect the Vulnerable

In this second episode of the fifth season, Frank and Andy speak to Chris Wexler about using AI to protect the vulnerable.

Speaking of which, I would like to advise you, dear listener, that this show touches on some sensitive areas, namely child sexual abuse materials.

If you have little ears or sensitive persons within listening range, you may want to pause or skip this episode.

Transcript

00:00:00 BAILey 

Hello and welcome to Data Driven, the podcast where we explore the emerging fields of data science and artificial intelligence. 

00:00:07 BAILey 

In this second episode of the 5th season, Frank and Andy speak to Chris Wexler about using AI to protect the vulnerable. 

00:00:13 BAILey 

Speaking of which, I would like to advise you, dear listener, that this show touches on some sensitive areas, namely child sexual 

00:00:20 BAILey 

abuse materials. 

00:00:22 BAILey 

If you have little ears or sensitive persons within listening range, you may want to pause or skip this episode. 

00:00:28 BAILey 

Don’t say we didn’t warn you. 

00:00:30 BAILey 

Now on with the show. 

00:00:31 Frank 

Hello and welcome to Data Driven, the podcast where we explore the emerging fields of data science, machine learning, and artificial intelligence. 

00:00:39 Frank 

If you like to think of data as the new oil, then you can think of us as, well, Car Talk, because we focus on where the rubber meets the 

00:00:46 Frank 

road. And with me on this epic virtual road trip down the information superhighway, as always, is Andy Leonard. How's it going, Andy? 

00:00:54 Andy 

Good Frank, how are you brother? 

00:00:56 Frank 

I'm doing alright, I'm doing alright. We've had a chaotic week at Chateau La Vigne. We ended up going to Baltimore in the middle of the night on 

00:01:05 Frank 

Wednesday. 

00:01:06 Andy 

Wow, what was in Baltimore? 


00:01:09 Frank 

A really good pizza. But mostly we went because there was a bad situation where a pit bull was about to go to a shelter, and we do a lot of fostering and rescuing of dogs. 

00:01:25 Frank 

So we just got her out, and we've spent kind of the rest of the week, all of our free time, trying to find her a new home. And she landed in the new home on Saturday and she's doing great. So that's that. 

00:01:37 Andy 

That's awesome, and it's really awesome y'all do that kind of stuff. 

00:01:41 

Yeah. 

00:01:42 Frank 

Yeah, I always wanted to do it, but it's only been in the last, you know, maybe five to ten years that I've been able to do it. So we've been doing that. 

00:01:51 Frank 

The risk of fostering is primarily foster failing. That's how we got our current dog count up to five. 

00:01:59 Frank 

Uh, meanwhile, that weekend my wife and I counted like 12 dogs who have kind of come through our house over the last two, three years. 

00:02:06 Andy 

Nice. 

00:02:07 Frank 

So it's a good thing to do. We have the space to do it, and, 

00:02:12 Frank 

you know, at the time we didn't know anything about this one, so we had to kind of keep her isolated. 

00:02:17 Frank 

So we had like this airlock system. She's a super sweetheart with people, but she's kind of iffy around other dogs, and she's super strong. So once she has her mind set to do something, it takes a lot of effort 

00:02:18 Andy 

Right. 

00:02:25 

Right? 

00:02:32 Frank 

to corral her. 

00:02:34 Frank 

But she’s super happy. She’s the only dog in her new home and she has them wrapped around her little paw already, so. 

00:02:42 Frank 

How are things going? 

00:02:42 Andy 

That's funny. Things are good, you know, pretty quiet weekend here. We have, uh, warmer weather. We're recording this on the Ides of March. 

00:02:54 Andy 

Now, debatably, the Ides of March, yes; depending on who you talk to, it's possibly the 13th, but I don't, I don't know. 

00:02:54 Frank 

Well, you are smart. 

00:03:02 Andy 

But we're on the 15th of March 2021 and it's starting to warm up. Our greenhouse is being put to use. 

00:03:11 Andy 

We have some seedlings in there and that's always fun. And we've got some raised beds out to the side of the house. Those are 

00:03:19 Andy 

starting to come up. We're starting to see different things come up. They're kind of colder weather crops, so we started sowing a couple, three weeks ago. 

00:03:29 Andy 

And it's, you know, it's been nice. I love getting outside and working, especially this time of year. The bugs haven't shown up yet. 

00:03:36 Frank 

And the pollen? 

00:03:37 Andy 

The pollen is really low. It's there, but it's really low. 

00:03:40 BAILey 

Oh 

00:03:41 Andy 

Yeah. 

00:03:41 Frank 

It hasn’t affected me yet, so. 

00:03:43 Andy 

Oh good good yeah. 

00:03:45 Frank 

That 

00:03:47 Andy 

It’s good that time is coming, so let’s enjoy it while we can. 

00:03:51 Frank 

Oh, I totally agree. There’s like 2 weeks a year where the weather in DC is wonderful and this is one of those two weeks so. 

00:03:55 BAILey 

Yes. 

00:03:56 Andy 

That’s it, that’s it. 

00:03:58 Frank 

Uh, so today we have an awesome guest. And this is, you know, in our intros we always talk about where the rubber meets the road in terms of, you know, how data becomes AI and how AI can kind of help businesses. And I think this time we have an interesting guest, because now we're not just talking about AI helping business. 

00:04:18 Frank 

But we’re helping society. 

00:04:20 Frank 

Uh, and you know, I'll make sure Bailey has a kind of intro speech, that if you have little ears in the car, 

00:04:29 Frank 

you may want to listen to this later or listen to this on a headset, because we're going to be talking about human trafficking and all the sorts of horrible things that happen to kids. 

00:04:39 Frank 

But he's doing some great work in terms of leveraging the power of AI 

00:04:46 Frank 

to help combat child sexual abuse materials that are online, as well as, you know, kind of human trafficking and all the bad things that happen with the technology. We like to focus on all the wonderful things, but there's clearly a large underbelly 

00:05:06 Frank 

to the Internet. And I'm a big believer in transparency, because, well, I grew up in New York City. 

00:05:16 Frank 

Cockroaches are inevitable no matter what you do, but one thing is, when the lights come on, they all scatter. So I think bad things tend to happen in the shadows, and, you know, 

00:05:29 Frank 

the more light you turn on, I think the better it is for society as a whole. So with that, I'd like to introduce 

00:05:36 Frank 

Chris Wexler, who is the CEO at Crunch... Krenim... Craney... we covered this in the green room, 

00:05:43 Frank 

But 

00:05:44 Chris 

Kru Nam. Krunam. It's OK. 

00:05:46 Andy 

Krunam. There we go. 

00:05:49 Frank 

I need to drink more coffee in the morning. But Krunam is in the business of removing, and I like this term that he uses, digital toxic waste from the Internet, and using AI to identify, and I never heard this acronym before, CSAM: child sexual abuse materials, 

00:06:09 Frank 

and other awful content, to help content moderation. And his technology is already in use by law enforcement and is now moving into 

00:06:21 Frank 

the private sector. And there's a whole bunch of stuff we could talk about, but what's particularly interesting is it's a for-profit startup, a social benefit corporation, so we can talk about that. 

00:06:32 Frank 

But I'd like to welcome Chris to the show, and thank him for putting up with some of 

00:06:36 Frank 

The scheduling growing pains that we’re having. 

00:06:40 Chris 

Yeah, no, it's really great to meet you guys. 


00:06:44 Chris 

I understand having five dogs. Hearing the intro, I definitely understand. I like to refer to my house as the event horizon: if an animal comes in, it never gets out. So I understand. 

00:06:55 Frank 

Ha ha ha ha. 

00:06:58 Frank 

Yeah, our track record is 50/50, so... 

00:07:02 Frank 

Uh, I tell you, if that dog was better with other dogs, she would have been a permanent resident. 

00:07:08 Chris 

Exactly. 

00:07:11 Andy 

Wow. 

00:07:11 Frank 

Wow. 


00:07:13 Andy 

So. 

00:07:14 Frank 

How did you 

00:07:15 Frank 

get started in this? And your name, uh, how did the name of the company come about? 'Cause I think that's an interesting story right there. 

00:07:22 Chris 

Yeah, well, Krunam is named in honor of a human trafficking, child trafficking warrior in Thailand, Kru Nam. Her name is two words. Kru 

00:07:33 Chris 

Nam was a street artist in Chiang Mai, actually doing very well, very well renowned. And she did a project with the street kids there 

00:07:42 Chris 

and said, hey, just paint your life. And she could not believe what they painted. It was eye opening. 

00:07:51 Chris 

And when she realized that a lot of the karaoke bars in Chiang Mai were fronts for child sexual trafficking, she was compelled to do something. And unlike I think 99.9% of the population, including myself. 

00:08:09 Chris 

She just marched into 

00:08:13 Chris 

the karaoke bars and pulled kids out. And she had done this 20 times and had twenty kids in her little apartment when the traffickers came and said, if you do that again, we're going to kill you. At which point she went north and found a way to keep doing it, and has constantly been evolving her tactics 

00:08:34 Chris 

in what she's done for the last 20 years. And now she's saved thousands of kids. One of the first kids she rescued just was the first stateless child in Thailand to graduate from university. 

00:08:48 Chris 

She’s just been such an inspiration to us and you know, I think if you go from top to bottom. 

00:08:53 Chris 

in our organization at Krunam, we've 

00:08:56 Chris 

All been confronted with what’s going on in the world and been compelled to change what we’re doing to try to help. 

00:09:05 Chris 

Try to help others in the space of human trafficking and so it just made sense to all of us to name the company in her honor. 


00:09:15 Andy 

Well, we talked a little in the green room about some of the other organizations. You mentioned your brother had started a similar organization. 

00:09:22 Chris 

Yeah. 

00:09:24 Chris 

Yeah, he and David Batstone started Not For Sale back in, I think, 2006 or '07. 

00:09:33 Chris 

And it was actually started because Kru Nam reached out to them and said, I've got 40 kids in a field and our lean-to burned down, 

00:09:42 Chris 

and you said you might be able to help. So my brother strapped $10,000 to his body to go up to the field so she could rebuild a space for them. 

00:09:44 Andy 

Extract. 

00:09:52 Chris 

So she kind of even started that organization. But they have since been bringing innovation to the field of fighting human trafficking 

00:10:01 Chris 

Left and right, and, uh. 

00:10:05 Chris 

And so it’s interesting that. 

00:10:08 Chris 

Krunam is a joint venture with a company out of London called Vigil AI, which has largely been in the defense and public safety space, really doing proof of concept 

00:10:25 Chris 

projects, like stuff that, you know, the geek in me just gets so excited when I hear what they do. 

00:10:34 Chris 

But, uh. 

00:10:36 Chris 

Vigil AI was one side of it. The other side was Just Business, which was the venture group that Not For Sale, the nonprofit, started, because what they realized was that 

00:10:49 Chris 

the dichotomy of for-profit and nonprofit really didn't work when you're trying to solve really big problems. It's great for direct service, but when you're trying to solve a really big problem, 

00:11:01 Chris 

But 


00:11:05 Chris 

any bit of money that you get comes with, um, a lot of strings, either governmental, like a lot of nonprofits are really, you know, pseudo-governmental projects, or from a large foundation that's donating money. So you're constantly changing who you are to keep your funding. 

00:11:16 Andy 

Right? 

00:11:25 Chris 

And what they realized was, well, you know, Dave had a background in venture capital, and so they went and started companies. What they like to say is they were a cause in search of a 

00:11:38 Chris 

company. And the first one they started was REBBL drinks, which, if you're ever in Whole Foods, is one of the most popular drinks at Whole Foods and at many other retailers. It's one of the fastest growing natural drink companies in the history of the US. They're 

00:11:58 Chris 

the sole financial partner of Velocity, one of the big innovators in the corporate relocation space. And if you're ever in, say, Amsterdam, or The Hague, where they just opened up, there's 

00:12:13 Chris 

Dignita, which is a brunch place that started in Amsterdam that was all about giving women who got out of trafficking in the red light district training in the hospitality business. And now 

00:12:32 Chris 

people who go eat there don't even realize it unless they read, you know, the back of the menu, because it's one of the top rated brunch places in all of Amsterdam. 

00:12:41 Chris 

And so, you know, we like to say we can't do good until we do well. So we're building world class companies, 

00:12:49 Chris 

all built with social justice built into them, at the scale of capitalism, because it's a powerful tool. And that's kind of why we went into 

00:12:58 Chris 

AI. You know, we decided that that was so important for us, because AI is an amazing, critical tool for the future. 

00:13:09 Chris 

And, you know, particularly in the age of COVID, with all of us behind computer screens and not traveling, 

00:13:16 Chris 

the tactics of abuse changed and a lot more was happening online. A huge spike in CSAM. And the reason we say CSAM and not child *********** is that child *********** implies consent, and there is none in that situation. It's child sexual abuse. So that's why we say CSAM. 

00:13:37 Chris 

With COVID, what we're seeing is a shift to people paying for shows online, and then they record it, and then they share the images. 

00:13:47 Chris 

So this is a critical new front, not even new, but a critical growing front in fighting human trafficking. 

00:13:57 Chris 

And so AI is the best tool to do that. And my background is, I was on the marketing technology side of 

00:14:06 Chris 

things. I was with some of the largest ad agencies in the world over the last 20 years, 

00:14:12 Chris 

and really was on the other side of it. I was, you know, one of the first customers of Facebook and one of the first customers of Google, constantly evolving my marketing tactics to, you know, sell one more garbage bag or one more motorcycle to a middle-aged man. And, you know, I established 

00:14:33 Chris 

data analytics practices to learn how to do that better, and eventually that evolved into, you know, AI projects. And what I realized was, I mean, that was a good career, but I could take those skills 

00:14:50 Chris 

and really make an impact, and so that's why I came on board to lead this new joint venture. And so it's an exciting time for me, because, 

00:15:02 Chris 

uh, I feel like a lot of my history has been able to kind of come together here, and I have the skills that can really help make a difference. And so that's why we're doing Krunam. 

00:15:16 Frank 

Wow, I mean that’s there’s so much to unpack in there in terms of the AI and kind of the social good. 

00:15:25 Frank 

And. 

00:15:27 Frank 

the detection of this. But the first thing that comes to mind is, 

00:15:33 Frank 

How do you train an AI for this? And do people have to? 

00:15:40 Frank 

How do you get people... 'cause I know this has come up with Facebook, at least in the news, 

00:15:45 Frank 

and I'm sure the real story is a bit more nuanced, and probably even worse, that people have to go to counseling because they look at all this horrible material. 

00:15:54 

Huh? 

00:15:55 Frank 

And I mean, is it kind of the same thing here? That this stuff is labeled, and... 

00:16:00 Frank 

Like how do you? How do you train an AI? 

00:16:04 Chris 

Yeah, I mean, I'm glad you asked that question, 'cause that's exactly what got me excited about Vigil AI, the work Vigil AI had done. Because we're actually bringing to market something that's been in 

00:16:05 

Yeah. 

00:16:19 Chris 

development since 2016. But, you know, it's a perfect example of public-private partnerships working together. And so, 

00:16:33 Chris 

well, first I'll tell the story of how it came to be, and then I'll definitely address how you do it, because that's exactly the problem we're solving. 

00:16:42 Chris 

Uhm? 

00:16:44 Chris 

Back in, I think, 2015, one of our co-founders, Ben Gancz, who was a child sexual assault investigator with law enforcement in the UK... 

00:16:56 Chris 

And, you know, you always have those people in every organization that go, there has to be a better way to do this, 

00:17:02 Chris 

there has to be technology that makes this better. And he was spending 70 to 80% of his time 

00:17:10 Chris 

going through confiscated materials to classify and understand what it was before he could even start investigating. 

00:17:19 Frank 

Wow. 

00:17:20 Chris 

And he’s like AI has to be better at this than I would ever be. You know, you get exhausted, you get tired, it’s brutal. I mean, it’s just emotionally and psychologically draining work. 

00:17:34 Chris 

Uhm? 


00:17:34 Chris 

And he saw Scott Page, our CTO and co-founder, speak, and he said, hey, let's figure this out. 

00:17:42 Chris 

And a year later, probably, they formed a company. And here's where the public part of it comes in: back in 2013, 

00:17:53 Chris 

the UK Home Office, 

00:17:56 Chris 

which is, for those who are not familiar, their version of kind of Homeland Security and the FBI together, 

00:18:06 BAILey 

OK. 

00:18:06 Chris 

and so they started building a database for law enforcement called CAID, which is the Child Abuse Image Database. 

00:18:19 Chris 

Because they recognized, well, if 

00:18:22 Chris 

we're constantly finding the same things, 'cause obviously things get copied on the Internet 

00:18:27 Chris 

and in digital spaces, why don't we build a database and that'll speed things along. So they had that, and it's a great tool. 

00:18:35 Chris 

What Scott and Ben did, on a pro bono basis, is they went in and said, hey, let's see, you know, if we can use the latest in computer vision and AI 

00:18:48 Chris 

To determine what the classifications are of this material. 

00:18:54 Chris 

and speed things along. And, you know, that's where Scott's background comes in. In computer vision, he was doing work 15 years ago on trying to brighten dark images, something that every one of our iPhones does on its own now. He was building those kinds of algorithms 

00:19:15 Chris 

15 years ago. He's an expert in computer vision and machine learning and deep learning. 

00:19:23 Chris 

And so they put it in there, not knowing if anything was going to work. And they literally had to bring their rig into a Faraday cage, because this material is not connected to the Internet in any way. 

00:19:37 Chris 

And then they're back to back with 

00:19:43 Chris 

law enforcement investigators, who are the only ones allowed to actually view this material. This is illegal material. And so it was a weird situation where they're like, 

00:19:54 Chris 

OK, here's what I see, and they're coding and working it out, and 

00:20:01 Chris 

uh, working on it that way. And when they ran it, kind of the first test, they got a fairly high success rate. 

00:20:09 Chris 

It was a eureka moment that it could be done. 'Cause, you know, AI is really good at going: that's a cat, that's a dog, 

00:20:17 Chris 

or, you know, that's a tree and that's a chair. It's less good at... I mean, for all the talk of facial recognition, we all know about the problematic nature of mistakes in facial recognition, and that's where there's been millions and millions of examples that the algorithm has seen. 

00:20:38 Chris 

And here we were asking the algorithm to determine implied behavior by body position and context. 

00:20:49 Chris 

And frankly, we didn't know if it was going to be possible. And so when it was at a fairly high percentage success rate off the bat, they knew they could fine-tune it to the point where it would be usable. 

00:21:00 Chris 

And they did all that work pro bono, because they just wanted to make the world a better place. It eventually became a paid project with the Home Office, who have continued to be a really strong partner of ours, helping us not only with access 

00:21:20 Chris 

to the CAID database, which is the largest of its kind in the world. I mean, that's one of the problems for a Google or a Facebook: if they find this material on their platforms, they're not allowed to 

00:21:31 Chris 

keep it. And if there's a third party like us, you're not allowed to send that to another company to train their AI. And so the fact that this horrific data is being collected and protected 

00:21:32 Frank 

Right? 

00:21:51 Chris 

by the Home Office, and classified by investigators on a, uh, on a 

00:22:02 Chris 

daily basis by trained investigators... You know, right now we only train the algorithm on three-vote-verified classified data. And so, you know, it's a really gold standard 

00:22:17 Chris 

level of training and of data cleanliness and, frankly, data privacy. You know, that's a big issue here, is that we don't want... 

00:22:27 Chris 

The last thing we want to do is extend the re-victimization and misery of these kids that have already been victimized and then put into photos. And so it was a rare situation where the data was really, really clean. 

00:22:39 Andy 

Right? 

00:22:47 Chris 

And because it's so much work to train it, you know, there's a limited number of people that can even do this 

00:22:57 Andy 

work. 

00:22:58 Chris 

And so, you know, that's where AI is really powerful, and computer vision, 

00:23:09 Chris 

'cause, you know, you mentioned how brutal this is for content moderators. We really view ourselves as a digital 

00:23:18 Chris 

protection company. Not only are we obviously protecting kids long term and breaking the cycle of violence, 

00:23:24 Chris 

but we're protecting the content moderators, because that is awful work. And, you know, studies have shown after 15, 20 minutes of doing work like this, your performance degrades horribly, because it's emotionally exhausting. Computers 

00:23:39 BAILey 

Yeah. 

00:23:45 Chris 

don't get emotionally exhausted, right? They're really good at this. 

00:23:47 

Right? 


00:23:48 

Yeah. 

00:23:53 Chris 

You know? 

00:23:54 Chris 

I think the biggest, like, aha for me is that AI is not a good or a bad technology. AI can be used for good and bad. It's null. 

00:24:04 Chris 

It’s a null technology, right? And if we can use it to do the right thing, that’s great and that’s what we’re doing here, you know? 

00:24:12 Chris 

As much as I liked, you know, selling motorcycles to middle-aged guys, I think this is making a little better impact on the world. 

00:24:25 Frank 

That's for sure. 

00:24:27 Frank 

Uhm, what’s fascinating here is, you know, kind of the. 

00:24:34 Frank 

You know, there's a lot of problems with facial recognition, but I think the problems with facial recognition are twofold. One, the input data is not good enough. Two, there's too much faith in it. 


00:24:49 Chris 

Yep. 



00:24:56 Frank 

What about the presence of, say, a false positive in the AI models you build out? 

00:25:00 Chris 

Well. 

00:25:00 Chris 

Obviously, we would never recommend any of our partners rely solely on our algorithm, because 

00:25:09 

Yes. 

00:25:10 Chris 

really, the way we would recommend using our classifier, there's a couple of things. One is, we want you to organize the data for the content moderators. So, let's say, you know, maybe you're comfortable 

00:25:30 Chris 

saying, if it's a 99% confidence, 'cause we feed back, uh, a confidence level on any image or video, and video as well, which, actually, I'm underplaying how important that is. Because the current technology is a great bit of technology that Microsoft built back in 200 

00:25:46 Chris 

8, called PhotoDNA, which is a, uh, photo hashing, a perceptual hashing technology, which is essentially fingerprinting known images. 

00:25:58 Chris 

And it's done a lot of great work. But the problem is this material is getting recreated every day, or a logo gets added, or there's cropping, or a filter is applied, and it might change the hash. And so it's a great technology, 


00:26:17 Chris 

but it's not a complete 

00:26:18 Chris 

technology. Our algorithm actually finds previously unknown images, because it's looking for patterns, it's looking for elements like that. And so then the worry is false positives, right? And frankly, false negatives as well, but 
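(An aside for technical listeners: PhotoDNA itself is proprietary, so as an illustration only, here is a minimal average-hash sketch, a much simpler cousin of the perceptual hashing Chris mentions. It shows both the fingerprinting idea and why a crop, filter, or added logo can break a match. The function names are ours, not Microsoft's or Krunam's API.)

```python
# A minimal average-hash (aHash) sketch, for illustration only. PhotoDNA is
# proprietary; this simpler cousin just shows the general idea of perceptual
# fingerprinting and why edits to an image can break a hash match.
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Fingerprint an image: shrink it, grayscale it, threshold on the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a cell is brighter than the image's mean.
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means 'probably the same image'."""
    return bin(a ^ b).count("1")

# A crop, filter, or logo shifts enough bits that the distance grows, which is
# why hash databases of known images can miss re-edited copies, while a
# classifier that 'reads' the image content, as described above, can still
# flag previously unseen material.
```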

00:26:32 BAILey 

Yeah. 

00:26:35 Chris 

So when you're in a situation where you're not 100% confident on anything, our output is a confidence level, much like a human being. There are things, 


00:26:50 Chris 

just because there's variation in the human body, a 25-year-old might look 16 and a 16-year-old might look 25, and so on. 

00:26:59 Chris 

And because it’s been trained by humans, it makes the same, you know, has the same troubles with those kind of images, and so it might kick out a 70% confidence level for an image like that. 

00:27:11 Chris 

going, we think it is. And in a situation like that, we definitely want a human being involved. I mean, any AI needs to have human checks. And so with our classifier, 

00:27:24 Chris 

first we recommend 

00:27:28 Chris 

organizing the data in a way so a human content moderator isn't mode-switching all the time. So you might group a lot of like images together, so you don't have the mental, 

00:27:42 Chris 

the mental, you know, exhaustion of constantly going, is it this or that? Or is it this? You go, 

00:27:49 Chris 

is it this? Is it this? Is it this? It helps the human brain. We also highly recommend having, 

00:27:59 Chris 

uh, essentially, a preparation warning. So if an image is particularly heinous, and they're all pretty bad, but if it's particularly heinous, you're not shocked by it. 

00:28:12 Chris 

So, you know, we recommend having an, uh, alert coming up going: prepare yourself for this. 

00:28:19 Chris 

And that again helps with the psychological preparation for that moderator. 


00:28:23 Chris 

And then you want to build the right governance, and so we do governance work too. We do consulting in the governance space, because we have in the company not only great technologists, but also experts in AI governance and ethics, and experts in human trafficking. So we can actually help 

00:28:43 Chris 

Kind of all along the way here, but uhm. 

00:28:47 Chris 

Uh, every company is going to set a slightly different threshold. You know, like, uh, Facebook, they've already banned this content because they don't allow nudity. So then the question is, 


00:29:02 Chris 

is it CSAM or not? And, you know, are they gonna report it or not? And so, you know, a false, 

00:29:10 Frank 

Uh. 

00:29:11 Chris 

a false positive clogs the law enforcement pipeline, and then law enforcement is wading through a bunch of stuff that isn't CSAM. So there's definitely a negative there. 

00:29:15 

Right? 

00:29:23 Chris 

But let's say you might set a threshold at anything over a 98% confidence, we're going to just automatically quarantine and move it out, or a 99% confidence. 


00:29:36 Chris 

Anything between a 70 and a 95 will have a single check. Anything from 75 to 50, we might have a double check of having multiple people look at it. It allows you to optimize your workflow in a way that is less traumatic 

00:29:55 Chris 

For your moderators. 
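(Again for technical listeners: a minimal sketch of the kind of confidence-threshold triage Chris describes. The thresholds and queue names are illustrative assumptions loosely following his example numbers, not Krunam's actual values or API.)

```python
# A minimal sketch of confidence-threshold triage for a content classifier.
# Thresholds and queue names are made-up assumptions for illustration.

def route(confidence: float) -> str:
    """Route an item to a moderation queue based on classifier confidence."""
    if confidence >= 0.99:
        return "auto_quarantine"   # confident enough to quarantine without review
    if confidence >= 0.70:
        return "single_review"     # one trained moderator confirms
    if confidence >= 0.50:
        return "double_review"     # multiple reviewers for ambiguous cases
    return "no_action"             # below threshold; not flagged

# Grouping like items together and warning reviewers before particularly
# severe content are workflow choices layered on top of this routing.
for score in (0.995, 0.82, 0.61, 0.12):
    print(score, "->", route(score))
```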


00:29:58 Chris 

And frankly, once you see the output of the algorithm, 

00:30:07 Chris 

for any organization, that's where you start making determinations like, OK, we're confident that it can do this, but we're not confident it can do that. And you start adjusting your workflow to do that. And that's what AI is best at: making those decisions at scale. And so that's 

00:30:28 Chris 

where, like I said, we're not only protecting kids, we're protecting content moderators, and we're protecting these companies and their bottom line. Nobody wants to be on the front page of the Washington Post or the Wall Street Journal saying there's CSAM on your platform. And so, 

00:30:47 Chris 

That’s a really critical part of who we are. We’re a protection company. 

00:30:52 Frank 

Interesting. I like the reframing of that, that you're a protection company. I think part of it is the normalization of AI in business. 

00:30:59 Frank 

Like, you're not an AI company per se. You're a protection company, which I think just happens to use AI. 

00:31:05 Chris 

Yeah, I think that was a determination we made early as we pulled together this joint venture. 

00:31:12 Chris 

You know, if you build houses, you’re not a hammer company. 


00:31:17 Chris 

You know, because as the technology evolves, we're going to be evolving our technological use. AI and machine learning and deep learning are all absolutely vital tools for us right now. 

00:31:25 Frank 

Right? 

00:31:33 Chris 

Um, but, you know, as we move into looking at grooming and other things, natural language processing is going to be something we'll probably get into. 

00:31:45 Frank 

And when you say grooming, maybe explain for the benefit of folks who don't know that term. I think I know what you mean. 

00:31:48 Chris 

Oh, sure. Well, you know, one part of the cycle of trafficking 

00:31:56 Chris 

children is either getting them OK with sending you images, or trying to convince them to run away. You know, hey, run away and come to the bus stop and I'll come pick you up, and then they're trafficked. 


00:32:10 

Yeah. 

00:32:12 Chris 

And so those are known patterns, and there's actually quite a few organizations and companies working on that, on text detection, on understanding how to head that off before kids are even trafficked. Which is the ideal: you want to head it off before they're hurt. 

00:32:33 Chris 

And so, you know, as we talk about, 

00:32:36 Chris 

Uh. 

00:32:38 Chris 

how we are the toxic waste management of the Internet. That's part of what we do. 

00:32:44 Chris 

You know, I've been involved in the Internet pretty much since the beginning. I was on Wall Street early on, scratching my head during the first Internet bubble, going, I don't understand how these companies make money. 

00:32:59 

Ah. 

00:33:00 Chris 

I literally was sitting at my desk, and the CEO of, I think it was furniture.com, was sitting across from me. 

00:33:07 Chris 

He said, hey, we're only losing 5% per transaction. 

00:33:12 Chris 

And I go, what are you gonna do about that? And he said, I'm making it up in volume. And he was out of business not long after. 


00:33:20 Frank 

Ha ha ha. 

00:33:22 Chris 

But it didn't make sense to me. I then got into the digital marketing space, and I was there at the very beginning of that really booming. 

00:33:32 Chris 

But, you know, if you look at how technology gets adopted by society, in any big communication platform, it takes about 30 years for society to figure it out. The first 10 years is kind of promotion and early adoption. 

00:33:47 Chris 

And if you look at that, that was probably a little quicker with the Internet, but it was kind of. 

00:33:51 Chris 

Then you have 10 years of growth and adoption, which I think you can maybe best put as the social media years of the Internet. And then you have 10 years of reckoning, of really understanding, oh, that actually did 

00:34:11 Chris 

this to society. And we're in those ten years right now. We're seeing what it's done to our body politic. We've seen the nasty side effects it's having 

00:34:15 Frank 

Yeah. 

00:34:25 Chris 

in the human trafficking space, in cyberbullying. Like, we're seeing the warts of a system that has by and large been a great societal positive. And so it's not surprising that governments are talking about regulation, and 

00:34:41 Chris 

that companies are finally getting really serious about monitoring what's going on on their platforms. 

00:34:51 Chris 

Because, you know, unfortunately, or fortunately, 

00:34:56 Chris 

technology is moving a lot faster than our human brains and joint society can handle. And, you know, that's why I'm compelled. You know, I was part of the early promotion, 

00:35:09 Chris 

I was part of the acceleration. I thought, oh boy, this is all great. And, you know, some of the algorithms 

00:35:16 Chris 

that kind of drove the insanity, like the Cambridge Analytica stuff. I looked at, not literally Cambridge Analytica, but similar technologies when I was doing marketing, 

00:35:28 Chris 

going, well, we could probably do that. I didn't do it, but, you know, I was part of the problem for a long time, and now I think it's important to be part of the solution. 

00:35:39 Andy 

Chris, these people engaging in these activities, in child trafficking, human trafficking, there's a lot of money in this, and they're bad actors. 

00:35:53 Andy 

And, you know, Lord knows what motivates people to do, you know, this sort of thing. 

00:36:00 Andy 

But are you concerned at all about your own safety? 

00:36:06 Chris 

Sure, I mean that that is always a worry. 

00:36:12 Chris 

But, uh. 

00:36:14 Chris 

I think my mindset is, I'd rather be in danger than the kids. And so, 

00:36:22 Chris 

Uhm? 

00:36:23 Chris 

that's a concern. It's a concern when we talk about Kru Nam. In fact, frankly, one of the reasons the woman, Kru Nam, is happy that we named the company after her... when we asked her if we could do it, she said, 

00:36:40 Chris 

Well, the more famous I am, the harder it is for the police to give me trouble. 

00:36:46 Chris 

So, you know, she lives in real danger. 

00:36:47 Frank 

That’s crazy that she’s worried about the police. 

00:36:50 Chris 

Yeah, well, I mean, there are bad actors through the entire chain. And so for her, fame is a protection. 

00:36:56 Andy 

Goodness. 

00:37:00 Chris 

Uh. 

00:37:02 Chris 

I couldn't live with myself if I didn't get out of the foxhole and try to do something more. 

00:37:10 Chris 

So for me, I was just compelled. I'd rather take the flak than the kids. So if I can draw their fire, that's good for me. 

00:37:20 Frank 

So another question is that these 

00:37:24 Frank 

are probably well-funded bad actors. 

00:37:27 Frank 

What sorts of countermeasures... I mean, obviously, with PhotoDNA, you know, at this point that's a 13-year-old technology, but, 

00:37:41 Frank 

um, I would imagine that this is going to evolve into, for lack of a better term, an arms race. 

00:37:49 Chris 

Yeah, I mean, I think that's where the brilliance of Scott Page and Ben's approach to building the technology comes in. 

00:37:57 Chris 

It's reading what humans read, and if they want to obscure the images so much that it doesn't look like CSAM, I think we've won. 

00:38:01 

No. 

00:38:09 

Yeah. 

00:38:09 Chris 

And so really all we’re looking for is something that looks like it, and it’s being, you know, it’s using computer vision. 

00:38:17 Chris 

in a way that it's perceiving it like a human would perceive it. Well, obviously, computers look at it very differently. You know, it starts with edge detection and then looks at shading, and, you know, that's not how we humans process images. But literally it's looking at the image and going, what would someone see when they look at this? And so on. 

00:38:29 Chris 

But 


00:38:37 Chris 

On many levels, you can't obscure that. In fact, we already added a classification because the algorithm kept finding 


00:38:49 Chris 

cartoon, manga, and ****** versions of CSAM. 

00:38:56 Frank 

Uh. 

00:38:57 Chris 

Because it looked like it. It went, oh, this is the same, only it's a cartoon. And so, while we haven't done the testing on deepfakes, or even things that are made from whole cloth, 


00:39:11 Chris 

because that's coming too, like created CSAM. And, you know, I guess that's, uh, 

00:39:17 Chris 

it's like, I guess, like vegan meat. I guess it's ethically created CSAM, because nobody was harmed, but it's still damaging 


00:39:25 Chris 

content. Yeah, the computer is going to see the same thing, because it has to look right to humans, and as a result, that's what the computer is going to look at. 

00:39:26 

Yeah. 


00:39:34 Chris 

I have to assume that there will be, you know, some kind of countermeasure they come up with, and then we'll just have to adjust. But because 

00:39:44 Chris 

we're building it on an end user perspective of how someone looks at it, 

00:39:50 Chris 

it's relatively future-proof, versus kind of a transitional technology like PhotoDNA that turns it into an anonymous hash that's a little easier to defeat. And now, obviously, there's been ways to make that stronger, but 

00:40:05 Chris 

just the nature of this should be a little future-protected. Knock on wood. 

00:40:12 Frank 

Interesting. I like the fact that you've already kind of thought about that. 'Cause, you know, it's not like you're running around saying, we solved it, we solved it, only to run into what is, in my opinion, one of the most powerful forces in the universe: the law of unintended consequences. 

00:40:25 Chris 

Yep. 


00:40:25 Chris 

Yep, yep. Well, I think the one thing we know is we're not perfect. And, you know, I mean, anybody who's done technology, and particularly innovation 


00:40:33 Chris 

in technology, you go, OK, I've got two years. How do I keep evolving? How do I keep making it better? 

00:40:39 Chris 

I had that terror for 20 years in marketing. Like, you'd build something and go, OK, I've got two years to build 

00:40:45 Chris 

the new one. And so, uh, that's very much our mindset. And, you know, we're lucky we have a strong technology team that is constantly looking at edge cases of how to do things. I would take Scott Page and team 

00:41:06 Chris 

uh, going to war with anybody, and, you know, so there you go. 

00:41:13 Frank 

Yeah, a great answer. 

00:41:15 Frank 

All right, so this is the point in the show where we get to the prefab questions we have. So, you mentioned kind of your early days in the dot-com era, and I'm sure we could swap some stories too, 'cause I was at a 

00:41:27 BAILey 

Perfect. 

00:41:34 Frank 

couple of startups at the time. Uh, how did you find your way into data and AI? Did data find you, or did you find your way into data? 

00:41:47 Chris 

I found my way into data. 

00:41:50 Chris 

And it's funny, because I chose a college based on the lack of a math requirement. 

00:41:58 Chris 

Uhm? 

00:41:59 Chris 

I always had the skills, but I didn't really like doing it. But 

00:42:05 Chris 

even in college, I realized the power of data. And this is way back in the day. Now, it's not so far back that we didn't have computers. 

00:42:13 Chris 

Just to date myself, I got my first PC while I was in college, so it was the age of moving off of mainframes into having a computer sitting on your 

00:42:24 Chris 

desk. 

00:42:25 Chris 

And I was getting a degree in political science at American University in DC. 

00:42:31 Chris 

And I took a campaign management class. 

00:42:36 Chris 

And it was kind of crazy. They would do this thing over J-term. It was two weeks long, a full, you know, 3 credits, and you'd have class from 8:00 AM to 6:00 PM, and then at the end of it you had to present a 200-page group paper. So you did that in the off hours. They really wanted to simulate the last few days 

00:42:52 Andy 

Wow. 

00:42:56 Chris 

of the campaign and how brutal that is. 

00:42:58 Chris 

Well, I got told that I was working on the 2nd District of Utah, and I had to figure out the voter turnout approach. 

00:43:09 Chris 

Well, I had no idea what to do. Like, I was, you know, 21 years old, I didn't know. So I went down to the Federal Election Commission, 


00:43:17 Chris 

got the precinct-level data, and built a spreadsheet. 

00:43:22 Chris 

And back then it probably wasn't even Excel, it was probably Lotus 1-2-3, I don't remember. But I loaded every precinct's data for the last four elections. 

00:43:26 Andy 

Wow. 

00:43:35 Chris 

So much so that, you know, the processing power was so bad, like, I had to hit F9 and then wait 20 minutes for it to recalculate. 

00:43:44 Chris 

But it was a way for me to... I was like, this is the only way I'm gonna figure out what precincts to target. And so technology was always 

00:43:44 Andy 

Wow. 

00:43:56 Chris 

about a result. 

00:43:58 Chris 

And I needed a way to get to it. It was a tool. And so whether it was that, or then it was, you know, building models on Wall Street. 

00:44:08 Chris 

And when I say models, nothing like today's insanity, very light models by today's standards that I was doing on Wall Street. 

00:44:19 Chris 

And then once you have that basis, and a lack of fear of how the technology works, then you're just looking for good data to make better decisions, 

00:44:29 Chris 

and how to have clean data so you can make a smarter decision. And so that just became kind of my superpower in my career. 

00:44:37 Chris 

And so it just kept going and going. 

00:44:40 BAILey 

Right? 

00:44:40 Frank 

So, for the kids listening, F9 is how you used to update spreadsheets. Now it happens so fast it's automatic. 

00:44:47 Chris 

Yeah. And I remember when I'd, like, accidentally hit F9 and go, well, there goes an hour. 

00:44:54 Andy 

Right? 

00:44:54 

Yeah. 

00:44:57 Andy 

So, what would you say is your 

00:44:59 Andy 

Favorite part of your current gig? 

00:45:03 Chris 

It’s that I’m helping. 

00:45:07 Chris 

I'm helping people that can't help themselves right now, and I'm helping detoxify the Internet for people who are going to run into this stuff. And so, 

00:45:22 Chris 

There’s a real joy in knowing that the output of what I do is going to benefit a lot of people. 

00:45:29 Frank 

So we have a number of complete-this-sentence questions. When I'm not working, I enjoy blank. 

00:45:35 Chris 

Baseball. I'm a huge baseball fan, a fan of my Minnesota Twins, which means we just hit 6,000 days of losing playoff baseball. So I'm obviously a glutton for punishment. 

00:45:48 Frank 

Ha ha ha ha ha. 

00:45:50 BAILey 

Ha ha. 


00:45:52 Frank 

Yeah, that can be rough. 

00:45:53 Andy 

Well, that. 

00:45:55 Andy 

That certainly can be, I’m uh. 

00:45:56 Chris 

We've lost literally 17 times in a row to the Yankees. And I hated the Yankees before... I can't... yeah. 

00:46:06 Chris 

I was taught as a young child not to hate, but I think hating the Yankees is a virtue. 

00:46:12 Andy 

Oh no. 

00:46:14 Frank 

I’m a Yankees fan, but I’ll let it go. 

00:46:16 Andy 

I was gonna say, yeah. 

00:46:17 Chris 

And I was liking you up until now. Oh, wow. 

00:46:20 

Right? 

00:46:24 Frank 

They hate us 'cause they ain't us. That's what we Yankees fans say. 

00:46:25 Frank 

Not brave. 


00:46:28 Andy 

Oh my goodness. 

00:46:29 BAILey 

Ha ha ha. 

00:46:32 Andy 

So, uh, Braves and Nationals here, so... 

00:46:36 Chris 

Yeah, I was in DC. I went to AU and I lived in DC pre-Nationals, so the Orioles would always play a series at RFK Stadium right before the season. 

00:46:50 Frank 

Nice yeah. 

00:46:52 Chris 

And I'd always go down, and for some reason, I don't know why, it was cold and the hot dogs were cold, but it didn't 

00:46:59 Chris 

matter. I still went. 

00:47:03 Andy 

Awesome stuff. 

00:47:05 Andy 

The fourth question is the second fill-in-the-blank: I think the coolest thing in technology today is blank. 

00:47:15 Chris 

There's so many things. I think the coolest thing is the application of AI to travel and driving. 

00:47:25 Chris 

I’m looking forward to the day when we are not trusting another human being on the road not to crash into us. 

00:47:34 Frank 

That segues nicely into the next question. I look forward to the day when technology can blank. 

00:47:42 Chris 

I am, also, you know, between baseball and this I'm exposing how nerdy I am: baseball and also Star Trek. 

00:47:50 Chris 

I'm looking forward to the day when they can teleport me to a warm climate in the middle of January and it doesn't take 12 hours. 

00:48:00 Frank 

Nice nice. 

00:48:02 Frank 

All right, so you mentioned baseball. 

00:48:03 Chris 

If we can move a quark six inches, I figure we can move me to Tahiti. 

00:48:09 Frank 

Oh, I like that idea. I like that idea. 

00:48:13 Frank 

Uhm, since you mentioned baseball and Star Trek. 

00:48:18 Frank 

What about the Niners? 

00:48:22 Chris 

Is this the San Francisco 49ers? 

00:48:24 Frank 

No, this is an obscure Star Trek baseball nerd-out reference, yeah. 

00:48:30 Chris 

Oh the Niners. Oh yeah. 

00:48:34 BAILey 

Ha. 

00:48:34 Chris 

You would think that would be at the forefront of my brain, as, like, the Reese's Peanut Butter Cup of my pop culture experience. 

00:48:45 Andy 

Right, right, right, right. 

00:48:46 Chris 

But I do remember watching that. Much like when you hear National Public Radio talk about sports, 

00:48:54 Frank 

Right? 


00:48:54 Chris 

right, where they, 

00:48:56 Chris 

you know, the local nine play a game of baseball today. It's just like, oh boy, these people don't really know what they're doing. 

00:48:58 BAILey 

Ah. 

00:49:05 Frank 

Yes, sportsball. 

00:49:07 Frank 

So, for the 90% of the audience who probably didn't get the reference, we're referring to Deep Space Nine. 

00:49:08 BAILey 

Yeah. 

00:49:14 Frank 

Deep Space Nine was the first spinoff from kind of the traditional Star Trek franchise. It's why I think it's still the high-water mark for Star Trek. 

00:49:24 BAILey 

Yep. 

00:49:24 Frank 

If you watch my live streams carefully, behind me there's a model of Deep Space Nine. 

00:49:29 Frank 

And Captain Sisko, who is, in my mind... the question was never Kirk or Picard; it was, you know, Sisko versus anyone else. 

00:49:30 Frank 

Nice. 


00:49:41 Frank 

But he was a big baseball fan, and that features prominently. And I've been rewatching Deep Space Nine, 

00:49:46 Frank 

so it's fresh in my head. 

00:49:50 Frank 

He's a big baseball fan in a time when baseball has really waned, and in one of the episodes they were playing a game, and the team that they kind of set up was called the Niners. 

00:50:02 Frank 

And it was against, uh, somebody... it was against Captain Sisko's rival from the Academy, or something like that. 

00:50:10 Chris 

Yes, yes. 

00:50:12 Chris 

Oh my gosh, I had forgotten about that completely. We loved that. 

00:50:14 Frank 

Yeah, that was a great episode, 'cause most of the storylines were pretty heavy, kind of existential questions. 

00:50:22 Frank 

But every once in a while they had kind of a lighthearted show, and that was one of them. 

00:50:23 Chris 

Yep. 

00:50:26 Frank 

That and the one where they hang out in Vegas, which is bizarre. 

00:50:27 Andy 

That’s funny. 

00:50:35 Andy 

Well, we could talk Star Trek for hours. 

00:50:37 Frank 

Yeah. 

00:50:39 Andy 

I could tell I could just tell. 

00:50:40 

Yeah. 

00:50:42 Andy 

Our next question is, share something different about yourself, Chris. But we remind everyone, not just you, that it's a family podcast. We want to keep that clean rating. 

00:50:57 Chris 

Uh. 

00:51:00 Chris 

Boy... I have a view that everybody is different, and so I'm sure that, you know... 

00:51:07 Chris 

It actually reminds me of a story when my sister was going off to college. My dad hated bacon. 

00:51:14 Chris 

I know, there are people out there that don't like bacon. And so we always had tuna fish with pancakes, 

00:51:16 Andy 

Come on wow. 

00:51:21 Chris 

because my mom's like, we need protein. And I remember the look on my sister's face when she said, wait, this is weird. What else do we do that's weird? 

00:51:29 BAILey 

Yep. 

00:51:33 Chris 

Uh, I think the biggest thing that's different about me 

00:51:39 Chris 

is that 

00:51:43 Chris 

I've kind of lived in every kind of phase. I have deep experience in every phase of kind of the American demographic. I grew up in kind of an established urban neighborhood, but went to a very poor urban 

00:52:01 Chris 

high school, while at the time attending an affluent suburban megachurch, 

00:52:10 Chris 

All while having farming grandparents and so. 

00:52:17 Chris 

It’s been a blessing for me in my life that. 

00:52:24 Chris 

I accidentally didn't have a bubble. It taught me a superpower of empathy that allows me to look at, like, the current political state right now and 

00:52:37 Chris 

go, people, we just need to meet each other, 'cause we're a lot more similar than you think. 

00:52:39 Frank 

Right? 

00:52:44 Chris 

But I think that's, you know, that unique background that I can thank my parents for, who literally 


00:52:53 Chris 

decided to support Minneapolis. And they live not too far from where George Floyd was murdered. 

00:53:03 Chris 

They decided to support the community. And so I think that's probably the biggest difference about me, something you would never know unless we talked about it. 

00:53:15 

Interesting. 

00:53:15 Andy 

Very cool. 

00:53:19 Frank 

And the next question. 

00:53:22 Frank 

Where can people learn more about you and what you’re working on? 

00:53:27 Chris 

Our website, which is still in a nascent state right now, but we'll be rolling out a new website soon, is at krunam.co. You can sign up there. LinkedIn is another good spot for us, if you follow us on LinkedIn. 

00:53:34 Frank 

OK. 

00:53:41 Chris 

And just keep watching, 'cause we're ramping up our content work. 

00:53:48 Frank 

Cool, cool. I sent you a LinkedIn invite this morning, so... 

00:53:48 Andy 

Awesome. 

00:53:52 Chris 

Excellent, excellent. 

00:53:54 Frank 

Awesome. 

00:53:54 Andy 

Yeah, it’s very important work. I’m going to try and connect with you as well. 

00:54:00 Andy 

We are sponsored by Audible, and you can get a free audiobook on us if you go to thedatadrivenbook.com. 

00:54:11 Andy 

And then if you sign up and, you know, subscribe to Audible... and I think I've got, like, the maximum platinum-coated gold subscription, because I have myself and two others in the house, two teenagers, that love Audible books. So, because of that, we ask all of our guests: 

00:54:30 Andy 

if you have a favorite audiobook, or, if you don't listen to audiobooks, a favorite book recommendation? You don't have to limit it to one. We'd love to know what you'd say. 

00:54:41 Chris 

Well, this is a book I've literally purchased for people many times, and it is not what you would expect. It is called The Power Broker, by Robert Caro. 

00:54:52 Chris 

It is. 

00:54:54 Chris 

the telling of Robert Moses of New York, who started as someone who wanted to build parks for moms in New York City and was heralded for it, and took that power and corrupted 

00:55:13 Chris 

so much in the world. I mean, he literally moved millions of people's apartments in New York, and his thinking 

00:55:23 Chris 

really formed the modern freeway system around the country. It is a fascinating look at power corrupting an individual, and at 

00:55:33 Chris 

How public works impact society. It is just a masterwork of how the world works, and so I highly recommend it. 

00:55:44 Chris 

And it's perfect for Audible, 'cause it's about 1,000 pages. So there's a lot of jogging you can get done while listening to The Power Broker. 

00:55:54 Frank 

Interesting, yeah. I grew up in New York City, you know, years after, kind of, 

00:55:59 Frank 

Robert Moses, and, well, 

00:56:03 Frank 

No one had a neutral opinion on Robert Moses. 

00:56:06 Frank 

Nope. People either hated him or admired him. I wouldn't say loved him, 


00:56:13 Chris 

No. 

00:56:13 Frank 

but they admired, like, his ability to get things done. 

00:56:19 Frank 

Uh, you know, and for those who don’t know, I mean, ultimately he’s the one that kind of. 

00:56:25 Frank 

turned the Interstate Highway System in the US from kind of a way to move 

00:56:31 Frank 

goods from city to city, or effectively, let's be real, I mean, it was a defense project that was labeled as a civilian project, like the Internet, 

00:56:38 Chris 

Exactly sure, yeah. 

00:56:41 Frank 

into kind of the daily commuting engine that it has become across the states. 

00:56:48 Frank 

I don’t have a better answer. 

00:56:49 Chris 

And he was the brilliant one that... 

00:56:51 Frank 

But 

00:56:54 Chris 

He would write up the plans so he'd be ready to move faster than anybody else. And so there's so many practical things in there on how to get things done, and on thinking systemically, that are just fascinating. And, you know, unfortunately, how certain groups were excluded 

00:57:05 Frank 

Right? 

00:57:14 Chris 

from, and actually damaged by, what was going on. It's a fascinating look at the history of that time. 

00:57:21 Chris 

It's a brilliant masterwork by Robert Caro. The other book I would highly recommend is Not For Sale, by David Batstone. It 

00:57:24 Frank 

Right? 

00:57:28 Chris 

is really what started everything that we are at Krunam, and it tells stories about human trafficking, but not in a way that is so heavy. It 

00:57:41 Chris 

shows the hope and the power of doing something. So I highly recommend Not For Sale as well. That's 

00:57:48 Chris 

really the book that started a movement and then eventually started this company. So I probably should have started there, but I love The Power Broker so much. 

00:57:56 Frank 

Yeah, I'll definitely check that out, 'cause I grew up kind of in post-Robert Moses New York, and, you know, it was interesting for someone who grew up kind of not on the island of Manhattan. 

00:58:08 Frank 

For those who don’t know, basically Robert Moses wanted to put a freeway straight through Manhattan. 

00:58:13 Chris 

But 

00:58:15 Frank 

The 

00:58:17 Frank 

power center of New York City is Manhattan. The boroughs are kind of an afterthought, so he was able to kind of, you know, pillage the boroughs, 

00:58:25 Frank 

the outer boroughs, but not necessarily Manhattan. Manhattan was kind of his Waterloo. At least that's my 

00:58:33 Frank 

recollection from older family members. 

00:58:36 Chris 

That's exactly it. Well, if you look at any freeway system in any city, 

00:58:41 Chris 

you have the same situation. Like, if you look at DC, there were power brokers through the entire city. That's why there is a Beltway. 

00:58:48 Chris 

The freeway was supposed to go right down Pennsylvania, and instead it kept getting pushed out, a road and another road and another road, and there was a Beltway. 


00:58:56 Frank 

Right. I don't think the Beltway touches the boundaries of DC, 

00:59:01 Chris 

Exactly. And if you look at, like, my hometown of Minneapolis and Saint Paul, 

00:59:02 Frank 

or barely, if at all. 

00:59:09 Chris 

they put the major freeways right through the heart of the African American communities and took out all the African American businesses. 

00:59:19 Chris 

And I think if you look at the history of where freeways have been placed, you see who was powerful and who was oppressed at that time. 

00:59:27 Chris 

It's fascinating. It's one of those things you just don't realize until you kind of dig into it. 

00:59:29 

Right? 

00:59:32 Frank 

Right? 

00:59:35 Frank 

Well, my grandfather went from liking Moses to despising him. 

00:59:41 Chris 

Well, and I think you do the same thing while reading the book. You go from, oh, what a good man, to, what a power-hungry monster. And that's part of the brilliance of it. 

00:59:44 Frank 

Interesting. 

00:59:52 

Right? 

00:59:56 Frank 

Interesting. 

00:59:58 Andy 

Chris, I want to ask you another question, a little bit of a follow-up here, and I know we're pushing time, so I apologize. I'd like to ask you about the book Not For Sale and the concept, and you brought up faith. I'd like to explore that intersection. 


01:00:19 Chris 

Absolutely. Well, I think, you know, I grew up in the church, 

01:00:25 Chris 

uh, and, you know, that was a core part of who I 

01:00:32 Chris 

was and am, my faith. 

01:00:36 Chris 

Uh, and you know, I think. 

01:00:40 Chris 

When you look at. 

01:00:42 Chris 

I think the phrase in the Bible is, you know. 

01:00:46 Chris 

however you treat the least of these, you treat me. And that's been a motto that I've carried through my entire life, and I'm glad to be doing it with Krunam. 

01:01:02 Andy 

Certainly trafficking victims could fall into that category very easily, being treated as the least of these. 

01:01:09 Andy 

Exactly, yeah. OK, I appreciate that answer. 

01:01:10 Frank 

Well, awesome. I think you're doing... 

01:01:14 Frank 

Yeah, I appreciate your time, and putting up with these bugs, the glitches that we've had, including my machine crashing yet again on another PC. Time to re-evaluate Zencastr. 

01:01:28 BAILey 

Yeah. 

01:01:32 Frank 

But with that, I'll let the nice British lady end the show. 

01:01:36 BAILey 

Thanks for listening to data driven. 

01:01:39 BAILey 

We know you're busy, and we appreciate you listening to our podcast. 

01:01:42 BAILey 

But we have a favor to ask. Please rate and review our podcast on iTunes, Amazon Music, Stitcher or wherever you subscribe to us. 

01:01:50 BAILey 

You have subscribed to us, haven’t you? 

01:01:53 BAILey 

Having high ratings and reviews helps us improve the quality of our show and rank us more favorably with the search algorithms. 

01:02:00 BAILey 

That means more people listen to us, spreading the joy, and can't the world use a little more joy these days? 

01:02:06 BAILey 

Now go do your part to make the world just a little better and be sure to rate and review the show. 

About the author, Frank

Frank La Vigne is a software engineer and UX geek who saw the light about Data Science at an internal Microsoft Data Science Summit in 2016. Now, he wants to share his passion for the Data Arts with the world.

He blogs regularly at FranksWorld.com and has a YouTube channel called Frank's World TV (www.FranksWorld.TV). Frank has extensive experience in web and application development. He is also an expert in mobile and tablet engineering. You can find him on Twitter at @tableteer.