2. AI Whistleblower: We Are Being Gaslit By AI Companies
Um, Sebastian, who's the CEO of Klarna, has actually just called me.
Hello, Sebastian. You all right?
Hey, how are you?
I'm good. How are you? It's been a while.
It has been a while since you were on the show. I was just saying we do need to get you back on.
I just had a couple of simple questions, because, you know, I do a lot of interviews, and Klarna always gets mentioned, because I think the media has said that you doubled down on AI and then reversed because it didn't work out. I know I spoke to you a while ago and we exchanged a couple of DMs about it, but that was almost a year ago now.
So I just wanted to get an update on Klarna's business, AI agents, and all of that, if possible.
First and foremost, we were early on in releasing AI to support our customer service, which had that initial benefit of more calls being dealt with by AI, which customers liked, because those calls or chat messages were much, much faster and more qualitative. Since then, that has actually expanded slightly.
Sebastian: What we did try to communicate as well is that we believe that in a world where AI is cheap and available, the value of human interaction will be regarded as higher. So the future of customer service VIP is a human, and we have hence doubled down on providing more of that. But at the same time, the efficiency gains within the company have continued. I mean, we used to be about 6,000 people, and now we are less than 3,000, two to three years since we stopped recruiting, and at the same point in time our revenue has doubled. So you can clearly see that AI has allowed us to do more with fewer people, but we have avoided layoffs and instead relied on natural attrition, when people move on to other jobs. From my perspective, we will continue to not really recruit much. I mean, we recruit a little bit here and there, but we expect that natural attrition of 10 to 15% per year to continue, and to become fewer. I think the big breakthrough was really in November, December last year, where even the most skeptical engineers, who are very well-renowned and appreciated, like the founder of Linux, basically said that coding has now been resolved, and hence you don't need to code anymore; that was a common sentiment. So in coding, and in engineering work, there has definitely been a tremendous shift in the last six months.
Host: What do all these people go do, Sebastian?
Sebastian: I am optimistic. I mean, obviously people will have a lot of opinions about this topic, but I still believe that we are going to move towards a richer society. Now, in the short term, there could be more worry about what happens if people don't get a job, and so forth. But in the longer term, I am optimistic about what it means for society and humanity.
Host:Thank you so much, Seb. I'll chat to you soon. Thank you for taking the time. I appreciate you, mate. Thanks.
All right. All right. Bye-bye. Bye-bye.
You know the little traditional SIM card that goes inside our phones? They haven't changed at all since they were invented in the '90s. You have this physical piece of plastic that means you're locked into one carrier, one network, and the second you cross a border, that carrier can start charging you whatever they want. But there are alternatives, and today's sponsor, Saily, is one of them. It's an eSIM app that gives you a safe and secure data connection in over 200 destinations. All of their eSIMs have built-in cybersecurity, which is great if you're traveling for work and looking at confidential material. I've been using Saily whenever I travel, because the connection is always reliable and it saves me a ton of roaming fees. It also means I don't have to deal with all of the faff that surrounds sorting out a SIM everywhere I go. If you want to give it a try, download the Saily app from the App Store now and scan the QR code on screen. And if you want 15% off your first purchase, use my code D-O-A when you get to checkout. That's D-O-A for 15% off. Keep that to yourself.
This is something that I've made for you. I've realized that the Diary of a CEO audience are strivers: we have big goals we want to accomplish. And one of the things I've learned is that when you aim at the big, big, big goal, it can feel incredibly psychologically uncomfortable, because it's kind of like standing at the foot of Mount Everest and looking upwards. The way to accomplish your goals is by breaking them down into tiny, small steps. We call this in our team the 1%. And actually, this philosophy is highly responsible for much of our success here. So, so that you at home can accomplish any big goal that you have, we've made these 1% Diaries. We released these last year and they all sold out. So I asked my team over and over again to bring the diaries back, but also to introduce some new colors and make some minor tweaks to the diary. So now we have a better range for you. If you have a big goal in mind and you need a framework and a process and some motivation, then I highly recommend you get one of these diaries before they all sell out once again. You can get yours at thediary.com. And if you want the link, the link is in the description below.
Host: Any thoughts?
Karen Hao: Well, I actually had thoughts on something that you said before he called,
which is, you were saying that Gen Zers, there's this trend that they're actually disconnecting from technology, so they're becoming more in-person. And then there's this other class of workers that are actually leaning into the technology, but then becoming more human, because they're leaning into the technology and realizing that they should actually just be spending more time on in-person interactions rather than staring at a spreadsheet, so they're no longer doing the typing. I really want to go back to this New York Magazine piece that just came out
because what you're describing is true for a very specific category of people, which is often the business owners and leadership within companies, who can actually make these decisions about how they spend their time and what they ultimately do with it. But what the piece talks about is the working class: people who are not business owners, who are having to experience being laid off and then working for the data annotation industry, which is now one of the top jobs on LinkedIn, by the way. LinkedIn had a report that showed the top 10 jobs with the highest growth in the last year, and data annotation is on that list.
Host: And for anyone that doesn't know what data annotation is?
Karen Hao: Yeah. So data annotation is the process of teaching these chatbots, or any AI system, to do what they ultimately are able to do. The fact that ChatGPT can chat is because there were tens of thousands or hundreds of thousands of people who were literally typing into a large language model and showing it: this is how you're supposed to respond when a user types in a prompt like this. Before they did that work, ChatGPT didn't exist. You would prompt the model and the model would generate some text that was not in dialogue with the person; it would generate something that was adjacently related.
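The workflow she describes, annotators writing demonstration responses that a model is then trained to imitate, can be sketched in miniature. This is illustrative toy code only: real systems fine-tune large transformer models on millions of demonstrations, not a count-based bigram model, and the example prompts below are invented.

```python
# Toy sketch (illustrative only): how annotated (prompt, response) pairs
# become training targets. Real systems fine-tune large transformer models;
# here a tiny count-based bigram model stands in for the learner.
from collections import defaultdict

# Annotators write the response they want the model to imitate.
demonstrations = [
    ("hello", "hi there"),
    ("how are you", "i am well thanks"),
    ("hello", "hi how can i help"),
]

# "Train": count which token follows which, across prompt + response.
counts = defaultdict(lambda: defaultdict(int))
for prompt, response in demonstrations:
    tokens = (prompt + " " + response).split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1

def next_token(prev):
    """Return the continuation the annotators demonstrated most often."""
    options = counts[prev]
    return max(options, key=options.get) if options else None

print(next_token("hello"))  # prints: hi  (imitating the annotated replies)
```

The only point of the sketch is the data flow: whatever the annotators demonstrate is exactly what the model learns to reproduce, which is why the work she describes sits upstream of everything a chatbot can do.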
Host: Is this what they call reinforcement learning, where you kind of give it like a
Karen Hao: It's a part of the process of reinforcement learning. So you do data annotation, which is literally showing lots of different examples of things that you want the model to know, and then reinforcement learning is getting the model to train on those examples iteratively in a way that then gives the model some of those capabilities. And what the New York Magazine piece highlighted is that many, many of the people who are getting laid off now, or are struggling to find work, are highly educated people. They're college graduates, PhD graduates, law degree graduates, doctors, and, again, award-winning directors, who are struggling to find employment because the economy has been very much restructured by AI. They are then finding themselves serving this industry, and the industry is designed in a way that is extremely inhumane. There are these third-party providers that are data annotation firms, and an OpenAI, a Grok, a Google will hire these firms to find the workers to perform the data annotation tasks that they need. These third-party firms are incentivized to pit workers against each other, because they want this data annotation to happen at speed and as cheaply as possible, so that they can also compete with one another in this middle layer to get the bid, the contract, from the client. And all of the workers who were interviewed for this New York Magazine story talk about how they no longer have the ability to be human, because they are waiting at their laptop to be pinged on Slack for when a project is going to open up for data annotation. They've tried job hunting; they literally can't find anything else. This is the thing that's going to help them put food on the table for their kids. And there was this one woman who said, "I have so much anxiety about when the project is going to come and when it's going to leave, that when the project came, it was right when my kid was coming home from school."
"And I just started tasking furiously, because I don't know when it's going to go, and I need to earn as much money as possible in this window of opportunity. So when my kid came home and tried to talk to me, I screamed at my child for distracting me." And then she was like, "I've become a monster, and I'm not even allowed to go to the bathroom or take care of my kids, let alone myself." Because this industry, which is absorbing more and more of the workers being laid off, is mechanizing their lives, atomizing their work, devaluing their expertise, and then harvesting it to perpetuate this machine that all of these AI executives are saying is going to come for everyone else's jobs. And so, what you were saying about this class of workers, the business owners who get to become more human because there are all of these AI models now doing the tasks they don't have to do anymore: it comes at the cost of the vast majority of people, who are not business owners, who are struggling to find work and getting absorbed into the work of providing the very technologies that the business owners can use,
and instead of becoming more human, they feel like their humanity has been squeezed and diminished, and they have no ability to have control, agency, and dignity in their lives anymore.
Host: I think this is a big question that pertains to this graph here, which is: all of these people, if we believe Anthropic's prediction of who will be disrupted, these people in these industries, like arts and media, legal, life and social sciences, architecture and engineering, computer and maths, business and finance, management, and also office and admin, would have to retrain at something else. And unlike the Industrial Revolution, where you might get 10 or 20 years to retrain because factories take a long time to build, the distribution layer that AI sits on top of is the open internet. This is why ChatGPT can go and get hundreds of millions of users in no time at all and become the fastest-growing company of all time. One of my fears is that this disruption takes place at a speed where we can't transition.
Karen Hao: And, you know, I think you said that sentence in the passive voice: the transition would happen at a speed. But who is driving that speed?
Host:Um
Karen Hao:it's the companies
and their race with one another.
Host: Yeah. And so they are driving the transition to happen at a speed at which it would be really hard to take care of all of the people that would be bulldozed over by it.
This is one of the crazy questions that no one can answer for me when I sit with these people, the AI CEOs. I go, "So what happens to the people, if you agree that this is going to happen at super speed?" You know, I spoke to the CEO of Uber, Dara, who said very similar things to what you're saying: there'll be data labeling jobs, for example, for the drivers. But they can't all become data labelers. And there's a question around meaning and purpose and fulfillment, and what comes from losing your meaning in life. I also sit here with so many people who talk about how their father lost his job in Iran, or some other country, and came to the United States and had to be a toilet cleaner. One particular case was a doctor in Iran who came to the US and was a toilet cleaner, and had to deal with the sense of shame that person felt, the lack of dignity that caused, how that made that person's self-esteem feel, and the depression and alcoholism that transpired from that. If this happens at a large scale across society, there are going to be a ton of consequences like that.
Karen Hao: I mean, this is like the core theme of my work. And the reason why I'm critical of these companies is that they are creating technologies in a way that creates the haves and have-nots in an extreme form. It's exacerbating the inequality that we already see in the world. The people who have things will have way more riches; they'll have way more free time; they'll be allowed to be more human. But the people who don't have things are being squeezed even more. And it's not just from a work perspective. I talk in my book also about the environmental and public health crises that these companies have created, where they are building these colossal supercomputing facilities in communities all around the world, and they specifically pick some of the most vulnerable communities. We're sitting in Texas right now. One of OpenAI's largest data center projects is being built in Abilene, Texas, as part of the Stargate initiative, which was an effort announced at the beginning of Trump's second administration to spend $500 billion on AI computing infrastructure. This facility, when it's finished, will consume more than a gigawatt of power, which is over 20%, over 20%...
Host: So this is actually a little bit inaccurate now. This was something that circulated online for a while, but there are updated numbers.
Just for someone who can't see, because they're listening on Spotify or something: it's a picture of the size of this facility.
Karen Hao: So this is not the Abilene, Texas one. This is a Meta facility. Yeah. So let's first talk about OpenAI's facility in Texas. That one would be the size of Central Park, and it would run a million computer chips, and it would require the power of more than 20% of New York City.
Host: One of the things which I found confusing, so I'd like to alleviate the dissonance: I thought you were saying earlier that you didn't think the job disruption promises were real.
Karen Hao:No, what I was saying is that when we talk about what these executives predict about the future, we need to understand that they are ultimately trying to influence the public in a way that allows them to continue maintaining control over the technology.
Host:But objectively, do you think that the job disruption that they talk about where
Karen Hao:Yeah. Yeah. I mean I I mentioned
Host:real
Karen Hao:well I
I don't want to comment specifically on this chart, but we've already seen in jobs reports that there is a restructuring of the economy happening right now.
Host: Yeah. But going back to the data center. So this supercomputer facility, it's a Meta supercomputer facility
is being built in Louisiana
and it would be four times the size of the Abilene, Texas one, and use half of the average power demand of New York City. As for its size relative to Manhattan: this makes it seem like almost all of Manhattan, but it would be one-fifth the size of Manhattan.
When these facilities go into these communities, what happens? Power costs go up; grid reliability goes down. The facilities also need fresh water to generate the power for running them, as well as fresh water for cooling. And there have been lots of documented stories of communities that are already really constrained in their freshwater resources, that are under a drought, when a facility comes in, and then the community is actually competing with this facility for fresh water. I talk about one of those communities in my book. And sometimes these facilities, instead of connecting to the grid, have a power plant pop up next to them. So in Memphis, Tennessee, where Musk built Colossus, the supercomputer for training Grok, he used 35 methane gas turbines to power the facility. This is a working-class community, a Black and brown community, a rural community, that was not even told that they would be the hosts of this facility. And they discovered it because they literally smelled what seemed like a gas leak in all of their living rooms. That's when they discovered that these methane gas turbines were taking away their right to clean air. And this is a community that has already been facing a history of environmental racism; they had already had lots of struggles to access their right to clean air. And now there's this huge supercomputer that's landed in their midst, pumping thousands of tons of toxins into their air, exacerbating the asthmatic symptoms of the children and the respiratory illnesses of other people. It's one of the communities that has the highest rates of lung cancer, and so
Host: And that supercomputer's taking their jobs.
Karen Hao: And then they also have supercomputers taking their jobs. So this is what I mean: the haves and have-nots are fundamentally being pulled apart even further. If you, in this version of Silicon Valley's future, are in the unfortunate category of being a have-not, we are talking about you now getting a job that is way worse than what you had, because you might be doing data annotation,
and you might be treated as a machine rather than as a human, to extract the value of your labor for perpetuating this labor-automating machine that these people are building. You might be competing with these facilities for freshwater resources. They're also polluting your air. Your bills have increased, so the affordability crisis is getting worse. How is that making people able to be more human?
Host:What do we do about it?
Karen Hao: Yes. Okay. So one of the analogies that I always use is: AI is like the word transportation. Transportation can refer to everything from a bicycle to a rocket. And we have nuanced conversations about transportation, where we say we need to transition our transportation towards more sustainable options; we need a transition towards public transport, electric vehicles. We don't ever say everyone should get a rocket to serve all of their transportation needs, right? We're in Austin. If you used a rocket to fly from Dallas to Austin, that would just make no sense. It's a disproportionate use of resources to get the benefit of getting from point A to point B. This is how we should think about AI. So all of the models that we've been talking about, I like to think of them as the rockets of AI. They use an extraordinary amount of resources, and they provide some dramatic benefit to some people, but they're also exacting an extraordinary cost on a large swath of people, because of the costs of developing this technology. Why don't we build more bicycles of AI? This is things like DeepMind's AlphaFold, which is a system that predicts how proteins will fold based on amino acid sequences. It's really important for accelerating drug discovery and for understanding human disease, and it won the Nobel Prize in Chemistry in 2024. And the reason why it's a bicycle of AI is because you're using small, curated datasets: you just have data with amino acid sequences and protein structures. That means you need significantly less computational resources to develop the system, which means significantly less energy, which means fewer emissions, and so on and so forth. And you're providing enormous benefit to people.
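The rockets-versus-bicycles point about resources can be made concrete with a back-of-envelope compute comparison. This sketch uses the common C ≈ 6·N·D heuristic for dense-transformer training FLOPs (N parameters, D training tokens); the parameter and token counts below are assumed round numbers for illustration, not figures from the conversation.

```python
# Back-of-envelope compute comparison (a sketch; uses the common
# C ≈ 6·N·D heuristic for dense-transformer training FLOPs, where N is
# parameter count and D is training tokens; all counts below are assumed
# round numbers for illustration, not reported figures).
def train_flops(params: float, tokens: float) -> float:
    """Approximate training compute in floating-point operations."""
    return 6 * params * tokens

rocket = train_flops(1e12, 1e13)   # frontier-scale LLM: ~1T params, ~10T tokens (assumed)
bicycle = train_flops(1e8, 1e9)    # small curated-data model: ~100M params, ~1B tokens (assumed)

print(f"compute ratio: {rocket / bicycle:.0e}x")  # about eight orders of magnitude
```

Under these assumed scales the gap is roughly eight orders of magnitude, which is the shape of her argument: task-specific systems trained on small, curated datasets sit in an entirely different resource class than frontier language models.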
Host: It feels like the horse has left the stable in this regard, because they've already taken people's IP, they've taken media, they train on this podcast. We know they do, because it shows that they do. I think there's a button actually in the back end of YouTube now that allows you to just click it, and it says, we will train on your YouTube channel. So the horse has kind of left.
Karen Hao: Here's the thing. If the horse truly had left the stable, they wouldn't have to train on anything anymore. Why is it that their appetite for data has actually expanded? It's because, in order to build the next generations of their technologies, in order to have the technologies continue to be relevant and continue to update with the pace of new knowledge creation and society's evolution, they need to train again and again and again. And why are they employing more and more data annotation workers over time? It's because they need more and more of that work over time. I've been reporting on data annotation work for over seven years now, and it's not gone down. It's increased.
Host: Do you think there's any chance of it going down? Do you think there's any chance of this sort of brute-force scaling approach, where you take data, you take computational power, energy, you have the data labelers, and you build out more and more parameters for the models... do you think there's any chance it's going to stop, or go in a different direction other than the one it's going in now?
Karen Hao:I would love to reframe the question and say what should we be doing in this moment where it's not going down where we do recognize that actually these companies in this moment need continued resources, inputs and labor to perpetuate what they are doing.
Host: Yeah. Because this sounds like "stop,"
and I just feel like "stop" is a dud. I just think, you know, with the governments in place, they're supporting these companies like crazy. Globally, this is happening. So I'm like, "stop" doesn't feel
Karen Hao: I always say we need to break up the empire and we need to develop alternatives. And we are already seeing a flourishing of incredible grassroots movements that are applying an enormous amount of pressure to the way that the empire is trying to unfold its agenda. 80% of Americans in the most recent poll think that the AI industry needs to be regulated.
Host:Yeah.
Karen Hao:When was the last time that 80% of Americans were on the same side of an issue?
Host: No. Yeah. When I have these conversations on the podcast, the comment sections are clear.
Karen Hao:Yeah.
Host: There's no disagreement. There's no one in there going, "Oh, no, I think they should crack on."
Karen Hao: Yeah. Dozens of protests against data centers have broken out all around this country, the US, and all around the world.
Host:So, what do we do about it?
Karen Hao: So these are people who are doing something about it. They are actually reasserting their agency and exercising democratic contestation against the ways that the empires are going about their business.
Host:What goal should we be aiming at? So, if I said to my audience, Janet at home, because this is kind of what I see in the comments, it's hopelessness. It's like, what can I do? I'm just a
Karen Hao: Yeah. Well, the goal is not that we completely get rid of this technology. The goal is that these companies need to stop being empires. And the way I define a typical business versus an empire is that the empires are predicated on this idea that they do not have to provide a fair exchange of value with the workers who work for them, or the people who use them, or all of the other people involved in the supply chain of producing and deploying these technologies. They can extract and exploit, and extract and exploit, and get more value than what they offer. Whereas with typical businesses, there's a fair exchange: you buy a service and you feel like you got the same amount of value as what you gave. But the data annotation workers, for example, do not feel in any way that they're being paid the same value that they provide to these companies. So that, for me, is the north star: we should be pushing back on and holding accountable these companies when they operate in an imperial way. And that's what we've seen with all of these people who are now literally protesting in the streets against data centers, and having an enormous effect, by the way: actually stalling data center projects, and even completely banning data centers from being developed in their localities. We're seeing that with artists and writers who are suing these companies for intellectual property infringement and creating a huge public conversation about how we actually want to protect our intellectual property. Three weeks ago I met Megan Garcia, the mother of Sewell Setzer III, the 14-year-old who died by suicide after being sexually groomed by a Character.AI chatbot. When that happened, she was obviously incredibly devastated by what had happened to her son. She also decided to do something about it.
She sued the companies, and that lawsuit then sparked many other parents and families who were experiencing similar things to sue these companies as well. That has created an enormous public conversation about what these companies are actually doing when they exploit and they extract, and what the cost is to the lives of people around the world, including children.
Host: So, what do you think my audience should do? If they agree with everything written in your book, Empire of AI: Dreams and Nightmares in Sam Altman's OpenAI, if they agree with everything we've discussed today, they're concerned about their kids, they don't want everyone to become data labelers, they don't think that's a particularly great solution, what can they actually go and do?
Karen Hao: When I was writing the book, the only discourse that was happening was: this is the best thing since sliced bread.
Host:Mhm.
Karen Hao: Because of all of the actions of these people, speaking up when they're not happy with the things that these companies are doing, we now have 80% of Americans who want to regulate this industry. And so I would say to people: think about all of the ways that your life intersects with the resources that the AI industry needs to perpetuate what they do, and also the spaces where they would need to deploy these technologies to continue having broad-based adoption
in their work. So you're a data donor to these companies; you could withhold that data. That's what those artists and writers are doing: they're suing these companies to try and create mechanisms by which that data would be withheld. You probably have a data center popping up around you. If you're in a school environment or a company environment, you're probably having a discussion right now about what the AI adoption policy should be. And these companies... I was talking with some OpenAI employees just the other day, and they were telling me that it's understood internally that the revenue targets for the company are extraordinary, and they need things to go flawlessly for it to all work out. They would need every single person to adopt this, every single space to adopt this. They would need to be able to build their data centers at the speed that they're trying to build them. And so what I would say to every one of your viewers is: let's not make it go flawlessly, if we don't agree with what they are doing.
Host:Ah, okay. I got you.
Karen Hao:And then let's build alternatives. Because the thing is what I'm saying is not that these technologies don't have utility. It's that specifically the political economy that has emerged to support the production of these technologies right now
is exacting a lot of harm on people. But we have research that shows that the very same capabilities could be developed with much more efficient methods with much less resource consumption. And we have a lot of different other AI systems at our disposal that are like the bicycles of AI that we also know provide extraordinary benefit at very little cost. So let's break up the empire and let's forge new paths of AI development that are broadly beneficial to everyone.
Host: It's strange. I think I've trained myself to deal with dichotomies in my head. And this, for me, is such a dichotomy, where I, as a CEO and as a founder, as an entrepreneur and someone that loves technology, think AI is incredible. It's just so amazing, the things it's enabled me to do and create.
Karen Hao:Yeah. Because it's designed to enable people like you.
Host: And my car driving in the morning and being safer: incredible. I think, you know, the billion-odd people that use AI tools, or ChatGPT, or whatever it might be, would probably say that it's added value to their life. But, and this is the part that people find confusing, and I invest in companies that are heavily using AI, but the big but is: is it possible to think that is true and also think that there are significant unintended consequences, which the history of technology should have taught us to take a moment to pause and talk about, because
Karen Hao:I think this is absolutely like you can have both of these things in your head and what I'm saying is that this tension doesn't have to be a tension because we could actually preserve the utility and benefits of these technologies but actually develop and design them in a different way that doesn't have all of these unintended consequences.
Host: Yes. And I think there needs to be a big social conversation, which is why I have so many conversations about AI on the show. There needs to be a big conversation about being intentional about the social and environmental impact, and that conversation is not being had in government. From what I can see, the conversation takes place in the industry, and actually trying to pull it out of the industry and open people's minds to it is hopefully what we've been doing over the last couple of months with this subject, because
Karen Hao: I think it actually has been happening everywhere outside of the industry. For local governments and state-level governments, there have been huge conversations about this everywhere. I've been on book tour; I've been to dozens of cities around the world. People are having these crucial conversations everywhere. I have not gone to a single city...
Host:Yes. Everywhere. Even here in South by.
Karen Hao:Yeah. I haven't gone to a single city where the room is not packed and people are not wrestling with the same exact questions as every other person in every other room that I've been in.
Host: Speaking of packed rooms, I know you've got to go, because you've got a talk today. So we've got a last question, which is the closing tradition on this podcast: how would your advice to a friend with a terminal diagnosis differ from what you would do yourself?
Karen Hao:That's a great question.
Host:Differ from what you would do yourself?
Karen Hao: Oh my god. I would tell them: enjoy, live life for yourself, which I wouldn't do myself,
and take it easy. And yeah, I am not taking it easy.
Host:Well, I think it's a good thing you're not taking it easy because you're leading a conversation which is incredibly important. And I think that's the thing. I think the conversation is the important thing. And so, you know, because of algorithms and echo chambers, it's so rare to have a conversation
these days, especially a long-form one.
Karen Hao: I agree.
Host: Like this. So I think they're so important. And your book is for anyone that's curious about