In the 2000s, an algorithm was developed in the US to identify recipients of donated kidneys. But some people were unhappy with how it had been designed. In 2007, Clive Grawe, a kidney transplant candidate from Los Angeles, told a room full of medical experts that their algorithm was biased against older people like him. The algorithm had been designed to allocate kidneys in a way that maximized the number of life years saved. That approach favored younger, wealthier and whiter patients, Grawe and other patients argued.
This kind of bias in algorithms is common. What is less common is for the designers of those algorithms to agree that there is a problem. After years of consulting with laypeople like Grawe, the designers found a less biased way to maximize the number of life years saved by, among other things, taking general health into account in addition to age. One key change was that kidneys from donors, who are often people who have died young, would no longer be matched only with recipients in the same age range. Some of those kidneys could now go to otherwise healthy older people. As with Scribner’s committee, the algorithm still wouldn’t make decisions that everyone agreed with. But the process by which it was developed is harder to criticize.
Nitschke, too, is asking tough questions.
A former doctor who burned his medical license after a years-long legal dispute with the Medical Board of Australia, Nitschke has the distinction of being the first person to legally administer a voluntary lethal injection to another human being. During the nine months between July 1996, when Australia’s Northern Territory introduced a law legalizing euthanasia, and March 1997, when Australia’s federal government overturned it, Nitschke helped four of his patients to end their lives.
The first, a 66-year-old carpenter named Bob Dent, who had suffered from prostate cancer for five years, explained his decision in an open letter: “If I had to keep a pet in the same condition I am in, I would be prosecuted”.
Nitschke wanted to support his patients’ decisions. Still, he was uncomfortable with the role he was being asked to play. So he made a machine to take his place. “I didn’t want to sit there and give the injection myself,” he says. “If you want it, press the button.”
The machine wasn’t much to look at – it was essentially a laptop connected to a syringe. But it served its purpose. The Sarco is an iteration of that original device, which was later acquired by the Science Museum in London. Nitschke hopes that an algorithm that can perform a psychiatric assessment will be the next step.
But there’s a good chance those hopes will be dashed. Creating a program that can assess someone’s mental health is an unsolved and controversial problem. As Nitschke himself points out, doctors do not agree on what it means for a person of sound mind to decide to die. “You can get a dozen different answers from a dozen different psychiatrists,” he says. In other words, there is no common ground on which to even build an algorithm.