What social responsibility do you have as a software developer?
(Adapted from an editorial by Edmund Berkeley published in January 1958 in the journal “Computers and Automation”.)
One day, a mysterious stranger walked into a locksmith’s shop. The locksmith had never seen this man before, but he could tell from the way he was dressed that the stranger was well-to-do. He came to the locksmith with a proposition.
‘I have a job that needs doing, and it requires someone with your highly specialized skills,’ he said. ‘I’ve done my research and you are one of the smartest and most capable locksmiths I’ve ever heard about.’
The locksmith felt very flattered, and more than a little intrigued. The man continued.
‘I want to hire you to open a certain safe. Never mind whose safe it is, that’s none of your concern. Just do the job I hire you to do, and I will make you rich beyond your wildest dreams.’
The locksmith was excited at the proposition of such a lucrative job, but also a bit nervous about not knowing who the safe belonged to. It seemed suspicious.
The stranger went on. ‘There are certain conditions you will have to agree to. First of all, I will blindfold you and take your phone before bringing you to the safe’s location. Secondly, you can never tell anyone that I hired you.’
This struck the locksmith as very odd, but he thought about what the man had said about making him rich. He felt like he had struggled all his life, but was never properly rewarded for the hard work he put in day after day.
The stranger continued, ‘You can have all the tools you need to do the job. The very best tools. I will spare no expense. If there’s something you need that you don’t have, I will buy it for you.’
‘Take your time. I’ll be back tomorrow for your answer.’
Despite his hesitation about the nature of the job, the locksmith spent all night thinking about his crummy apartment, his shabby furniture, his daughter’s dream of one day going to college. From the beginning, his family had learned to scrimp and save just to get by. ‘Anyway,’ he thought to himself, ‘if I don’t take this job, he’ll just go to another locksmith. The second-best locksmith.’ He smiled to himself.
The next day, when the stranger returned, the locksmith agreed to take the job.
Keep the locksmith in mind. We’ll come back to him.
The Association for Computing Machinery’s silver anniversary
The year is 1972. The Vietnam War is raging. It’s also the 25th anniversary of the Association for Computing Machinery (ACM), the first professional organization of computer engineers and scientists. The ACM planned an anniversary dinner with two prominent speakers, both founders of the organization: Franz Alt and Edmund Berkeley.
Franz Alt was the first to address the attendees. His topic was ‘Reflections’, and he told a compelling story of the evolution of the computer industry: from its nascent beginnings in World War II through to the present day. It was a celebratory talk, a commemoration of just how far the technology and technologists of computing had come in 30 short years.
Edmund Berkeley took the stage next. His topic was ‘Horizons’, and the ACM organizers had intended for him to talk about his vision for the future of computing. Berkeley, however, had his own ideas.
His speech was fiery and confrontational. Berkeley had campaigned for ethical standards in the computer industry for 20 years, but the idea had never gained traction. His frustration (or even anger) at his colleagues came to a head that night.
He told the audience that anyone who was working to further the unethical use of computers, including in the development of weapons technology, should immediately quit their jobs. He called out members of the audience by name. People who worked for the military. People who worked as government consultants. People who worked in research laboratories. It was a shock to the staid and esteemed professionals who were assembled there that night.
Many of his colleagues were so upset by his accusations that they stood up and walked out in the middle of his speech. Admiral Grace Hopper was among those who left. She and Berkeley had worked together 28 years before, during World War II, in Howard Aiken’s revolutionary lab at Harvard. Now she turned her back on him.
Berkeley concluded his speech by saying that it was a ‘gross neglect of responsibility’ that computer scientists were not considering their impact in terms of societal benefit or harm.
Edmund Berkeley saw very early on in our industry’s history that “computer people” carried a heavier-than-average burden, the responsibility for how their work was being used for social good or social ill. This was clear not only in the use of technology for war but also in human rights abuses: after all, the Nazis used early mechanical computers to collect census data that would later be used to organize the Holocaust. Punch cards from these machines were even sent along with the trains transporting people to concentration camps. The technology was sold to the Nazis by IBM.
Following the travesties of two world wars, scientists in other fields were becoming aware of their social responsibilities, too. World War I saw the first large-scale use of chemical weaponry. When the horrors experienced by people exposed to these weapons became more widely known, several international conferences were held to enact policies to limit or abolish these inhumane weapons.
And after the devastation of the atomic bomb at the end of World War II, physicists banded together to mitigate the bomb’s threat to human civilization. The most visible legacy of their work is the Doomsday Clock, maintained continuously since 1947 and used to track how many metaphorical “minutes” we are from world-destroying nuclear war.
Berkeley hoped for similar efforts to arise in the field of computer science and engineering.
He developed his ideas on ethical responsibilities in part through his work with the Committee on the Social Responsibility of Computer Scientists. The work of this committee culminated in a revolutionary report published in 1958.
In section two of the report, the committee laid out the four primary ethical responsibilities of the computer scientist:
- He (sic) cannot rightly ignore his ethical responsibilities.
- He cannot rightly delegate his ethical responsibilities.
- He cannot rightly neglect to think about how his special role as a computer person can benefit or harm society.
- He cannot rightly avoid deciding between conflicting ethical responsibilities. He must choose.
The committee went on to show how the scientist’s credo, ‘knowledge for knowledge’s sake’, easily comes into conflict with our other responsibilities. ‘Given human society in our century, and the ethical value system we are using in our century, it is possible to decide definitely some classes of work which can be labeled as obviously socially undesirable, and other classes of work which can be labeled as obviously socially desirable, even if there is a large middle ground which cannot be clearly classified.’
Unfortunately, the work of this committee and other efforts by Berkeley throughout the 1950s and 60s failed to move his fellow computer scientists to action, which resulted in him calling down righteous fury on his peers that August night in 1972.
This widespread apathy toward our ethical responsibilities as an industry continues to this day, and is nowhere more dramatically reflected than in the practice of open source.
Is there something fundamental about open source that perpetuates this shirking of responsibility? And if so, what needs to change to encourage – or even allow – open source practitioners to live up to their ethical obligations?
We’ll explore the answers to these questions in part two of the series, which will be published on October 9.