“The thing you should consider,” she says, “is what happens in your absence. Using your knowledge and skills, you have a chance here to be in the room when critical decisions are being made and have a say in what happens. If you walk away now, who knows how things will end up in Xinjiang? Yang will just bring in another team to head the project and it could eventually all end in famine or genocide, especially given the direction things are currently going. You could prevent that.”
I search my conscience. Do I care that countless millions of total strangers from a province that I’ve never set foot in could possibly die if I did nothing? Indonesia in 1965; the Khmer Rouge in the 1970s; Rwanda in 1994? Am I swayed?
As soon as Shu tries to guilt me along this tack, my brain immediately formulates its own counter-rationalizations: First, it’s not at all clear to me that, with my help, things won’t actually end up worse. It’s perfectly possible that Yang and his puppeteers behind the curtain will take my work and bastardize it, using it for some even greater evil. These massive machine learning models and pipelines that data scientists build are tools. And once you’ve built the actual tool, how one uses it is an entirely different matter. For example, the same image recognition software that helps a mother find her lost child in a crowded mall could help a totalitarian dictator hunt down and assassinate political opponents. To a computer, a human face is just a face. In all those old 007 secret agent movies, there’s always a “head scientist” who works for the Bond supervillain, and if I’m not careful, I could unwittingly become that scientific accessory to evil. A supervillain-enabler. Most definitely not a good look, and a categorically, maximally undesirable outcome.
Second, in all honesty, I feel a sense of detachment. I know that makes me a horrible and heartless human being but, unfortunately and inconveniently, it’s simply true. (At least if I’m being honest with myself.) I’d lived in America all my life and had led an extremely sheltered and privileged existence. For the most part, two very big oceans had separated me from most of the world’s concerns. And thus, for better or worse, if I’m being genuine: in my heart of hearts, I’d grown numb and apathetic to the headlines, especially international ones, that I’d seen on endless repeat, looping again and again over the years and decades. Mass starvation in Darfur or thousands dying from drought in Ethiopia and Somalia. Even when I read about those events, they felt far away, as if in another solar system.
Appealing to my desire to prevent genocide is a losing argument. And, to give her credit, Shu seems to read from my expression that she’s failed to persuade me or move the needle at all. Apparently, this card wasn’t the ace she’d thought she held.
I see her flash a quick glance at Vanessa, who’s still standing with Alan across the room. It’s probably imperceptible to most, but I notice that Vanessa gives the smallest of nods back. Apparently, the Queen Bee has given her underling some kind of greenlight on, well, something.
“Very well then,” Shu says, sighing. “I guess there’s really only one thing left to show you at this point.”
Chopra and I look at each other.
“Oh, come on,” Shu says as she puts her hand on my forearm and bats those long, alluring lashes again. “You’ve come all this way, from so far. Aren’t you at least a little curious? It’ll only take a bit.” She looks at all the others assembled in the room. “You can all come see. I promise it’ll be worth your while. You won’t want to decide on anything yet without seeing this first.”