A brief history of computers
Since the invention of the computer (at first, mechanical), humans have always sought to increase the speed of computation (and, later, storage capacity). Performing more computations in the same amount of time meant not only that more work could be accomplished, but also that more complex problems could be solved.
The twentieth century brought with it, in short succession, a series of inventions that led to the design of the modern computer as we know it today: a CPU, a chip capable of performing hundreds of millions of computations per second, connected to memory chips that hold the data of ongoing computations, to a hard disk for long-term storage of data, and to a device that allows exchanging data with other computers over a network.
What is “super” about a supercomputer?
So much for an introduction to computers. But what are supercomputers, and how are they different from “normal” computers? Put simply, a supercomputer is hundreds or thousands of computers connected together with an ultrafast network and housed in a large hall. Supercomputers allow us to solve the hardest, most complex problems in science and engineering. The trick is that before a problem can be run on a supercomputer, we first have to think about how it can be solved by many computers working at the same time.
Consider the following simple example: we want to find the word “lazy” in the sentence “The quick brown fox jumps over the lazy dog”. With a serial program running on a single computer, we would go word by word through the whole sentence until we find the word “lazy”. Now suppose we have 4 computers that can work on this task together. They might first split the sentence into 4 pieces. Here is what each computer would have to analyze:
- The quick brown
- fox jumps
- over the
- lazy dog
The word we’re looking for can be found in a quarter of the time!
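To make the idea a little more concrete, here is a minimal sketch in Python (not part of the original example) in which four worker processes each search one piece of the sentence; on a real supercomputer, the pieces would be handed to separate machines rather than to processes on a single computer.

```python
# Toy illustration: four workers each search one piece of the sentence.
from multiprocessing import Pool

TARGET = "lazy"

def contains_target(words):
    """Check whether one piece of the sentence contains the target word."""
    return TARGET in words

if __name__ == "__main__":
    # The same split as above: one piece per worker.
    pieces = [
        ["The", "quick", "brown"],
        ["fox", "jumps"],
        ["over", "the"],
        ["lazy", "dog"],
    ]
    # Start 4 worker processes and search all pieces at the same time.
    with Pool(processes=4) as pool:
        results = pool.map(contains_target, pieces)
    print("Found it!" if any(results) else "Not found.")
```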
Supercomputing (or high-performance computing) is concerned with finding ways to solve problems on supercomputers (or clusters of computers). But what are the problems we’re trying to solve on supercomputers today?
Weather forecasting
Weather forecasting deals with both very complex models and enormous sets of data. First, the physics underlying our weather is highly non-trivial: in order to get a precise prediction for the weather tomorrow in Paris, we need to consider the various effects of ocean flows, air flows, the earth’s topography, the sun’s radiation, and so on. Second, weather stations and satellites today provide us with a wealth of measurements that need to be integrated into our calculations. Neither the amount of data nor the required level of detail in the predictions could be handled by a single computer. Instead, meteorologists turn to supercomputers and continuously run their weather simulations to provide us with up-to-date forecasts.
Beating the virus
As the world was firmly in the grip of the coronavirus, scientists feverishly worked on any possible lead for a cure or a vaccine. But to devise a battle plan against the virus, we first needed to understand it better. What does it look like? How does it enter our cells? How does it make us sick? To answer such questions, scientists built large, detailed models of the SARS-CoV-2 spike protein (the master key the coronavirus uses to get into some of our cells) and simulated them for one microsecond. Even on one of today’s largest supercomputers, such a simulation took 50 days to complete!
Climate models
If weather forecasting weren’t hard enough, how about attempting to predict average temperatures over Antarctica in 40 years? Clearly, in order to have any confidence in such predictions, scientists must integrate an incredibly large number of variables modelling various natural and man-made phenomena. Climate scientists have gained a lot of confidence in their models, and today we have a clear (and terrifying) picture of what the future will look like if we do not take action, all thanks to supercomputers. Nevertheless, scientists continue to refine their models, incorporating more physics and more data to arrive at even more accurate predictions, and this naturally requires more computing!
Simulating the universe
Since the earliest days of humanity, we have looked up into the sky and wondered. Today, astrophysicists use supercomputers to help them answer some fundamental questions. How are galaxies formed? What happened in the first seconds, minutes and hours of our universe? And where is our universe ultimately headed? Again, both the complexity of the models and the number of unknowns involved force us to move away from desktop computers and onto supercomputers to handle the amount of computation required.
There is no free lunch
Of course, all this computing comes at a cost. Running a supercomputer requires a lot of energy, both for powering the compute units and for keeping them cool. Reducing the amount of energy needed to run these systems is in fact an active area of research: it saves money and opens the door to even larger computers (heat being one of the limiting factors in chip design). Unsurprisingly, such systems are expensive too: the European Union is investing 700 million euros over the next three years in building the next generation of exascale supercomputers in Europe, machines able to perform 10^18 operations per second (if every man, woman and child in Europe performed one arithmetic operation per second, it would take them more than 42 years to match what such a machine does in a single second!). The benefits to our everyday lives, medicine, engineering and our understanding of the world around us make it a worthwhile investment.
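For the curious, a quick back-of-the-envelope check of that comparison might look like the following sketch (the figure of roughly 750 million people in Europe is an assumption made for this illustration, not stated above):

```python
# Rough check of the "42 years" comparison (illustrative numbers only).
exascale_ops_per_second = 10**18   # one exaflop: 10^18 operations per second
people = 750_000_000               # assumed population of Europe
ops_per_person_per_second = 1      # one arithmetic operation per person per second

# How long would all of Europe need to match one second of an exascale machine?
seconds_needed = exascale_ops_per_second / (people * ops_per_person_per_second)
years_needed = seconds_needed / (365 * 24 * 60 * 60)
print(f"about {years_needed:.0f} years")   # prints: about 42 years
```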