Monday’s post inspired some fun discussion in the comments and IRL. At the risk of too much academic navel gazing for one week, let me surface a few of those interesting bits and pieces today.
Academic disciplinary history gets reinvented with each cohort of PhD students. Few scholars want to be disciplinary historians, so for most, “history” ends up being the hodgepodge of inspiring apocryphal tales told in intro classes and related work sections. However, if your discipline matters, it’s vital to seriously engage with its intellectual roots. No academic department arises from the well of intellectual purity. Many have far less noble origins than even computer science. The fathers of statistics all cut their teeth in departments of eugenics. That’s not hyperbole: Ronald Fisher chaired the Department of Eugenics at UCL.
In that spirit of grappling, let’s talk a bit more about the history of computer science departments and their construction of a pure intellectual endeavor. Shreeharsh Kelkar shared an excellently spicy post he wrote a decade ago about Alan Turing’s influence. You should go read his post for all of the historical tidbits. When you read the papers and personal letters from the early computer pioneers in the United States, you almost never see Turing’s name mentioned before 1960. Shreeharsh’s post examines how Turing was canonized by the discipline to carve out a sense of intellectual purity in a very applied, service-oriented field.
Shreeharsh points to an excellent article by Thomas Haigh in CACM. Haigh asks, “Must theoretical breakthroughs precede and guide practical ones?” I might ask a similar question: Can science be a post-hoc rationalization of technology? This might be the best way to make sense of the “science” in “computer science.” And for those who think I’m just being mean to my colleagues, look at the history of the steam engine and thermodynamics to see how this phenomenon is not confined to the interplay between the science and technology of computers.
With its embrace of theory, computer science carefully carved out an identity as a pure intellectual exercise closer to mathematics than to business. Don Knuth seems to be one of the central figures in creating this identity. Knuth credits George Forsythe for envisioning a purer academic version of computer science. Forsythe was instrumental in establishing Stanford’s CS department in 1965, and university politics demands that every department tell a story of its ivory-tower intellectual purity. So you’ll find Forsythe quotes like:
“The learning of mathematics and computer science together has pedagogical advantages, for the basic concepts of each reinforce the learning of the other. The question ‘What can be automated?’ is one of the most inspiring philosophical and practical questions of contemporary civilization.”1
Once you tell this story, everyone in the department wants to believe it. No one wants to fess up to being part of a mercenary training program. And by the mid-1980s, with the theoretical, algorithmic backbones built upon the beatified Turing, some computer scientists even thought that any appeals to industry were misguided cynicism.
Along such lines, Max Raginsky pointed me to notes of Edsger Dijkstra, whose memorial website hosts handwritten screeds grumbling about, among other topics, computer science’s servility to industry. I’ll admit, they are very fun to read. Dijkstra would have been a great blogger. Related to this post, check out 1995’s “Why American Computing Science seems incurable,” where he describes how “anti-intellectualism, amathematicism, and ‘industrial acceptance’ poison computing science.” Woven into the ranting, he makes some compelling points about software’s complexity and inscrutability. Another gem is 1999’s anti-industry screed “How they try to corrupt us.” I raise Dijkstra here not only because handwritten rants from the 90s are fun to read. At a time when personal computers had become ubiquitous and everyone was getting access to the internet, there were faculty in computer science departments who strongly believed that university computer science should be separable from industry.
Those days are over, of course. Trends that Dijkstra lamented, like justifying your research by arguing that some company was using it, are now requirements for CS job talks. Many of my colleagues spend most of their time at their startups or are employed 80% of the time as vice presidents of companies. Some unashamedly run their labs as startup incubators. You could side with Dijkstra and say this is an abomination and spells the end of academic purity. But the record shows the boundary between university and industrial computer science has always been nebulous. Perhaps we should just be arguing about the degree.
What should be the role of computer science at the university? What should be the role of the university in society? In case you hadn’t noticed, the US is in a heated debate about these questions. A discipline as (relatively) new as computer science provides an interesting case study for how this role can evolve and change in short periods. Thinking about what it should be for the next decade is an important political question, as computing companies exert ever more influence over political systems, and computing technology inescapably molds how we interact and make decisions. We may have deluded ourselves into thinking that we can say “I teach computer science, and that is all,” but that option is neither viable nor veridical in our current environment.
Uh, Shamik Dasgupta and I conceived our class on the philosophy and history of automated decision making without having seen this Forsythe quote. I swear!
I just returned from teaching a seminar on Phil of CS and this post pops up - great timing! I made my students read Turing's 1936 paper, which probably no aspiring CS student would ever get subjected to otherwise. Much has been written about that paper, but what stands out is that Turing is not really interested at all in practical computations.
My last note on that paper was:
"Perhaps Turing's notion of computation appeals to us because it rests on a purely theoretical analysis? What about all the people who have _constructed_ computing machines in the centuries before him? Should we place more emphasis on the engineering aspects of CS? (neither Leibniz, nor Babbage, nor Zuse knew about TMs!)"
I think I need to add political aspects to the last question now too.
The possibility that "science be a post-hoc rationalization of technology" is often more real than we are happy to admit!
In the shameless-self-promotion corner: I wrote a short post about Leon Cooper and the commercialization of neural network technology in the early 1990s, which I think offers another historical case study for some of the points discussed here
https://open.substack.com/pub/liorfox/p/for-more-information-call-intels
(Those were the days when you first had to get a physics Nobel Prize and _then_ switch to doing neural network stuff, rather than doing neural networks and then getting a physics Nobel Prize...)