In 2013, software engineer Laura Nolan gave a talk on “cool jobs” to a group of transition-year students at the Science Gallery in Dublin.
At the time a site reliability engineer at Google, she tells the young audience in a YouTube video of her speech: “I didn’t really know what I wanted to be growing up. In fact, I’m still not 100% sure, but I really like where I am right now.”
For millennials like Nolan, who was studying computer science just as the dot-com bubble burst in the early 2000s, a job at Google was like winning a career lottery. It was a chance to work at the forefront of a new economy with a company whose founding motto was “Don’t be evil.”
Watching the video of her talk today, in which Nolan speaks about her work with insight and wit, there is no foreshadowing of the events that would lead her to leave the company five years later. She describes it as “quite a story arc”. It started with cloud computing and ended with a campaign against killer robots.
Nolan joined Google in 2013, ten years into her career, at a time of transition in the technology sector. Growth was a constant concern for the Silicon Valley giants, which were expanding rapidly but had to find new ways to maintain momentum in a highly competitive market. This drove companies to diversify their offerings, as with Facebook’s acquisition of Instagram and its new focus on virtual reality, or Spotify’s entry into podcasting.
After an ill-fated attempt to launch a social media platform (who remembers Google+?), around 2015 Google began focusing on cloud computing, the business of storing and accessing information. “There was a lot of technical work [to be done] and there was also a big cultural adjustment,” says Nolan.
Some Google employees felt, she says, that the company was turning away from its original mission.
Speaking of founders Larry Page and Sergey Brin, Nolan says: “They talked a lot about how Google isn’t just another company, and that they might choose to forgo certain opportunities because they believed doing so was better for Google and better in the long run in general.”
From this perspective, the shift to cloud computing was something potentially ominous. Google became, as Nolan puts it, “just a monster that wants to eat the world.”
Those fears appeared to be confirmed when it was revealed in 2018 that the company was collaborating with Project Maven, a controversial Pentagon plan to boost the US Department of Defense’s big data and AI capabilities. It was, in the words of journalist Eyal Press, “a surveillance program designed to automate and expedite the process of reviewing the vast amount of footage drones recorded as they hovered over distant war zones.” Nolan feared this would lead to an increase in lethal strikes.
She learned of Google’s involvement during a work trip to San Francisco a few weeks before the story broke in the media. The prospect that her good-faith work could be used, directly or indirectly, in support of military operations alarmed her, and not without reason.
Surveillance systems, such as those used to conduct drone strikes, represent an inexhaustible source of revenue for cloud computing companies, says Nolan. “They’re talking about sucking up huge amounts of data, processing it to try to extract statistical models, and continuously matching that with input data.”
She describes this process as “never-ending,” “intense,” and “perfect for trying to make money from cloud computing.” When she shared her concerns with one of her directors in Dublin, she said she was told, “We have to do this for shareholder value.”
After the Project Maven revelations, about 4,000 Google employees (out of 80,000 worldwide at the time) signed an open letter condemning the company’s involvement. “Dear Sundar,” it begins, addressing Sundar Pichai, chief executive of Alphabet, the conglomerate that owns Google: “We believe that Google should not be in the business of war.”
Some employees, including Nolan, resigned. As a result, Google did not renew its contract with the Pentagon. It announced a set of new AI principles stating that projects falling outside of “internationally acceptable norms” would not be considered. (Nolan points out that no such established “norms” exist for these projects.)
Nolan is one of several tech-industry interviewees featured in Eyal Press’s new book, Dirty Work: Essential Jobs and the Hidden Toll of Inequality. Press examines the impact of “dirty work” on employees’ mental and emotional well-being in contexts where the “dirt” is less literal grime than the shame and judgment that certain types of workers experience.
Unlike Press’s interviewees from slaughterhouses or the prison system, Nolan did not experience the extreme disadvantages associated with “dirty work”: she was in a well-paid, unstigmatized profession that offered her other employment opportunities. Nonetheless, the distress of realizing that her abilities could be used to harm others in ways she never intended affected her in very real ways.
“It shocked me,” she says, recalling when she first heard about Project Maven. “Honestly, I felt physically ill. I got heartburn. It was the first time in my life that I suffered from heartburn. Personally, I found it very, very distressing.
“I felt like work I had been doing innocently over the past few years was being weaponised. I wanted to say ‘warfare’, but warfare isn’t even the right word for drone strikes that will kill [people]. In my opinion it is mechanized slaughter.”
Although the leak of Google’s involvement with Project Maven allowed Nolan to speak about her experience, not everyone was receptive.
“Most people didn’t criticize my actions,” she says. “Some people think I’m an idiot… I don’t agree with those people. You have a right to your opinion and I have a right to mine, but I certainly have no obligation to work on projects that I disagree with.” While Nolan was confident that stepping away from Google was the right thing to do, the decision was by no means easy. “It was quite an emotional thing. On my last day I felt pretty torn. I enjoyed my team, I enjoyed my work, but I no longer had that trust in [the] leadership.”
Regarding Google’s motto “Don’t be evil,” Nolan says: “The problem is, define ‘evil’.”
By voicing her concerns about Big Tech, Nolan joins a growing number of employees who are becoming increasingly vocal about the prospect of their work being used in ways they never intended or over which they have little control. These concerns are also at the heart of Nolan’s recent work with the Stop Killer Robots campaign.
While it may sound like a Black Mirror parody, the campaign’s work is serious and urgent, targeting what it calls “growing digital dehumanization,” exemplified by the development of weapons controlled not by humans but by artificial intelligence. Nolan points out that this lack of human oversight can have disastrous consequences. “It’s impossible to build an automated system that respects the rules of war,” she says.
It’s a chilling prospect, and another example of what Nolan has seen first-hand in her career: the dark side of technological advances.
Source: “Why I Had to Leave Google and Fight Killer Robots”, Independent.ie: https://www.independent.ie/life/why-i-had-to-leave-google-and-fight-against-killer-robots-41407541.html