The UK tech sector is the fastest-growing sector in the nation, expanding at 2.5 times the rate of the rest of the economy. Research suggests there will be 800,000 unfilled IT jobs in the UK by 2020. And yet, computer science has one of the highest unemployment rates of any degree. So where is the disconnect?
Computer Science degrees are, quite simply, out of touch with what the industry needs.
It’s not that these degrees are bad; there is a lot to be said for understanding the theoretical underpinnings of computer science. But what they teach is not what the industry requires right now. While students are learning about the most efficient search algorithms or designing more performant programming languages, the industry is crying out for developers who can write code in current languages to build programs quickly, cost-effectively and to a high standard. And it needs them now.
Let’s take a look at some of the areas where degrees are not delivering what the industry needs.
Academia over industry
Theory over practice
Ask any Computer Science graduate how much time they spent building applications versus talking about computer theory and you will be amazed at how little programming an average student does in their three years at university. Some of the best Computer Science degrees in the country contain only a handful of projects across the three-year course. Many thousands of hours are spent studying theory and algorithms, yet the answers to many of these problems are already provided by modern-day languages – the same ones not being covered at university. While some specific roles do require this level of theory, those positions are few and far between. The overwhelming shortage is in programming, yet preparation for the problem-solving challenges programmers face in the average workplace is being diluted by an unnecessary level of detail in their formal university education.
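To illustrate the point about modern languages (this sketch is my own example, not from the article): Python's standard library already ships an optimised sort and a binary-search module, so a working developer rarely implements these classic algorithms-course staples from scratch.

```python
# Illustrative sketch: classic algorithms-course problems solved by
# the standard library rather than hand-rolled implementations.
from bisect import bisect_left

# sorted() uses Timsort, an optimised hybrid sorting algorithm built
# into the language runtime.
data = sorted([42, 7, 19, 3, 25])

def contains(sorted_list, value):
    """Binary search via the standard library's bisect module."""
    i = bisect_left(sorted_list, value)
    return i < len(sorted_list) and sorted_list[i] == value

print(contains(data, 19))  # True
print(contains(data, 20))  # False
```

The developer's job here is knowing that these tools exist and when to reach for them, not re-deriving their complexity bounds.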
Maths over results
Along the same lines, the mathematics or theory favoured by academia is often counterproductive to achieving the best solution in real life. Computer Science professors chase the most performant or elegant solution to a problem using complex maths, which can make sense in a deeply academic context but is often difficult to understand, apply and maintain. That complexity makes such solutions more costly in the long run, as well as more time-intensive to build (something any company is reluctant to invest in without a compelling business case). They also often require perfect datasets without anomalies – which, as anyone working outside a classroom will tell you, rarely exist.
Future over now
Big data, AI and machine learning are popular subjects in the tech industry at the moment. With increasing access to large amounts of data and ever more powerful consumer hardware, we are able to build ever more capable programs. These concepts have existed in academic theory for 15 or 20 years, but have only recently started to be applied practically in industry. Similarly, quantum computing is 5–10 years away from being consumer-ready, meaning it cannot yet be used in industry and can only be studied as a somewhat abstract concept. While it is important that universities study these new technologies in order to progress our understanding and prepare them for consumers, should this really be done at undergraduate level?
Overall, Computer Science students are graduating with knowledge of outdated programming languages in industry decline, cutting-edge technologies not yet ready for market, and mathematical theory that distracts from pragmatic solutions to real problems. It is no surprise that many tech companies have switched to hiring physics or maths graduates, or graduates of programming bootcamps, instead of Computer Science graduates: the skill they really need is practical problem-solving, not academic theory.