On universities in software engineering – 10 years behind, getting worse

Hi friends!

In a recent talk, Robert C. “Uncle Bob” Martin mentioned as a side note that in the field of software engineering, college education is limping 10 years behind the industry. Having studied at three different German universities and one Irish one, I find that this observation matches my experience exactly. To be clear, I acknowledge that there are exceptions, and I would be happy to hear that you know of many colleges that are different. I still fear that the regular case is as follows:

  • The default programming language college students are expected to learn is Java. Invented in 1995, Java is still highly relevant in the industry, but lectures often still build Java desktop applications instead of web services.
  • If students come into contact with JavaScript at all, it’s mostly through a framework the professor built himself over the years, rather than Angular or React.
  • There are separate classes on programming, databases and (if you’re lucky) HTML-CSS-webdesign-usability-something, but the integration of a standard real-world frontend/backend/database stack is missing completely.
  • Software engineering is mostly taught as the discipline of modelling classes and interactions in UML instead of building running systems. Modelling isn’t a bad thing, but it only makes sense once you have implemented a couple of systems end-to-end, so that you have a clue what kinds of classes and patterns you need.
  • Waterfall is introduced as the default process model, while agile methods are mentioned as some exotic alternative.
  • On the topic of testing, the Java-based classes will introduce the syntax of JUnit for writing tests, without teaching how to write good tests or how to apply testing to a real-world system (see the sketch after this list).
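
To illustrate what “introducing the syntax of JUnit” means, here is roughly the kind of isolated snippet such a lecture tends to stop at – a minimal sketch using JUnit 5 and a hypothetical Calculator class, not taken from any particular course:

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    // Hypothetical class under test, included so the snippet is self-contained.
    class Calculator {
        int add(int a, int b) {
            return a + b;
        }
    }

    class CalculatorTest {

        // Demonstrates the @Test annotation and an assertion, and that is
        // usually where the lecture ends. Nothing about test design, edge
        // cases, test doubles, or testing a system that talks to a database.
        @Test
        void addsTwoNumbers() {
            assertEquals(4, new Calculator().add(2, 2));
        }
    }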

As a result, there are two kinds of students in computer science: the ones who learn software development through industry internships in parallel to their studies, and the ones who leave college after years of education without any skills in developing software. The first kind has hundreds of open job positions available; the second kind will have a hard time getting hired for a first developer position and will probably not even apply, but rather become consultants or join government agencies that require degrees rather than skills.

Why is this the case? Why is college education in software engineering so broken?

First of all, building software is only one area of the field of computer science. I estimate that 75% of the jobs for computer science graduates are in the software industry, with only small shares in hardware, security, research and IT operations, so software should be emphasised in education. But in practice, if you read through the list of chairs at any computer science faculty, you will find that at most ten or fifteen percent of professors declare themselves to be in the field of software. And even those who do, if you look closely, focus on the modelling and verification of software. They try to formalise software development so that one can write and discuss and reason about it without actually having to do it.

What college professors teach about software development is ten years behind, but what they research is not even on the topic!

If you are a professional software developer, you are probably using three to five different programming languages in your current project, ten frameworks (including the testing frameworks), a few dozen libraries, build scripts and tools. You remember how to use them, you are fast because of routine, and still, every month you are swapping a tool or library for a different one, or upgrading something to a new version and learning how that works.

Software development is complex. It’s a lot. It’s a full-time job to learn it, and you can only learn it on the job.

Whoever becomes a professor at a German university has solid credentials in academia from working at a university chair for years and publishing papers in scientific journals. Whoever has worked on becoming a professor did not have time to develop production-grade software for years, but spent his career at a university reading and writing and teaching. Being a professor and being a software developer exclude each other.

For that reason, professors are not even to blame: they have to build their scientific careers by publishing papers, and you cannot publish on how you learn and use existing stuff; you have to publish insights that are new to the scientific community, and since the industry innovates far faster in the fields it focuses on, your only option is to go for fields it does not.

The bad part of this arrangement is that these professors are responsible for educating the next generation of software developers. Software development is a craft, not a science. Craftsmen are not trained at college; to become a craftsman, you want to be the apprentice of a master craftsman. You take part in the daily work, starting with small supportive tasks, observing the master and the more senior fellows to learn their techniques and understand their decisions. Your tasks become more advanced, first under the supervision of the master, then over time with more and more responsibility, until you are a professional in your own right. It’s a good idea to have the apprentice attend some kind of school part of the time to learn theory alongside the practical training, just as it works for opticians and electricians and carpenters – but in the current system, the software developer is trained by scientists only.

Here is the problem: society agrees that intellectually demanding jobs can only be done by people with degrees and a university education. People who are smart enough to become software developers are also smart enough to earn college degrees, and naturally don’t want to miss out on the prestige and salary of a college graduate. The HR department of a hiring company ties the offered salary to the applicant’s college degree, and will not consider candidates without one for important positions.

Although the common college education in computer science is currently a waste of time for software developers, there are strong forces preventing a change to this system. Since the complexity and the amount of practical knowledge required of software developers keep increasing, the gap between required skills and college education will grow even wider over the next decade. I’m curious to see whether future graduates manage to acquire these skills somehow, whether companies will hire fewer graduates, or whether the system finally changes.