The first image that comes to mind at the mention of radio astronomy is a desert landscape with large dishes pointed towards the sky ‘listening’ for any celestial signals of interest. The new generation of radio telescopes that are currently being built differ somewhat from this, in that they are composed of thousands of small antennas connected together to act as one large instrument.

Data from these receivers are transported to a central station, where they are digitised, combined and processed on standard computers. The focus has shifted from building larger and larger dishes to combining clusters of antennas in software. Modern radio telescopes are thus referred to as software telescopes.

The culmination of current efforts is the Square Kilometre Array (SKA), a telescope so large that it will span multiple continents. It will be composed of hundreds of thousands of antennas and thousands of dishes, all of which have to be connected together. These elements will produce an enormous amount of data, more than 100 times the current global internet traffic.

Transporting the data to a central station is also a significant endeavour, requiring enough fibre optic cable to wrap around the Earth twice. In order to process all this data, a supercomputer equivalent to about 100 million PCs is needed, three times more powerful than the current most powerful supercomputer.

These requirements are not easily achieved. A conventional cluster of CPU-based PCs would take up too much space to house, and would require the equivalent of half the electricity generation of the Maltese Islands. To tackle these issues, alternative technologies are being investigated, one of which is the graphics processing unit (GPU), a device generally used for gaming.

GPUs are powerhouses containing thousands of processing cores, capable of performing the work of approximately 10 high-end PCs; they are also more power efficient and take up less space. They are used as off-chip accelerator cards, where processing-intensive tasks are offloaded from the CPU to the GPU for better performance. The current top 10 green supercomputers all use GPUs to increase their processing capabilities while minimising power use.
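As a rough illustration of the offload model, here is a minimal CUDA sketch (the kernel is deliberately trivial and not taken from any real pipeline): data is copied to the GPU, processed by thousands of threads in parallel, and copied back.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Hypothetical kernel: each GPU thread scales one sample.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                 // one million samples
    const size_t bytes = n * sizeof(float);

    float *host = (float *) malloc(bytes);
    for (int i = 0; i < n; i++) host[i] = 1.0f;

    // Offload: copy to the GPU, launch the kernel, copy back.
    float *device;
    cudaMalloc(&device, bytes);
    cudaMemcpy(device, host, bytes, cudaMemcpyHostToDevice);
    scale<<<(n + 255) / 256, 256>>>(device, 2.0f, n);
    cudaMemcpy(host, device, bytes, cudaMemcpyDeviceToHost);

    printf("first sample: %.1f\n", host[0]);  // prints 2.0
    cudaFree(device);
    free(host);
    return 0;
}
```

A real pipeline replaces this trivial kernel with heavy stages such as dedispersion, and overlaps the memory copies with computation so the GPU never sits idle.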

GPUs are being put to good use for building software telescopes. The Institute of Space Science and Astronomy at the University of Malta is heavily involved in designing and writing GPU-based software pipelines capable of processing the vast amount of data generated by radio antennas.

During the course of my PhD I specialised in writing software for detecting transient phenomena. This is a generic term for all astrophysical objects and events whose electromagnetic emission varies greatly in time, unlike steady sources such as the sun. Such emission generally originates from some of the most extreme objects in the universe, under conditions impossible to reproduce in any laboratory on earth; these objects thus act as cosmic laboratories for extreme physics.

Searching for transient phenomena is not an easy task. As signals travel through interstellar space they are dispersed, with higher frequencies arriving before lower ones. This effect needs to be reversed to recover the original signals. The extent of the dispersion depends, among other factors, on the distance from the source to the telescope.
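The delay follows the well-known cold-plasma dispersion relation, in which the dispersion measure (DM) encodes the free electrons along the line of sight and grows with distance. The snippet below evaluates it for illustrative numbers:

```cuda
#include <cstdio>

// Dispersion delay between two observing frequencies, from the
// standard cold-plasma formula:
//   delay [ms] = 4.15e6 * DM * (f_lo^-2 - f_hi^-2)
// with frequencies in MHz and DM in pc/cm^3.
double dispersion_delay_ms(double dm, double f_lo_mhz, double f_hi_mhz)
{
    return 4.15e6 * dm * (1.0 / (f_lo_mhz * f_lo_mhz)
                        - 1.0 / (f_hi_mhz * f_hi_mhz));
}

int main()
{
    // Illustrative values: a pulse with DM = 100 observed across a
    // 400-800 MHz band arrives about 1.9 seconds later at the bottom
    // of the band than at the top.
    printf("delay: %.0f ms\n", dispersion_delay_ms(100.0, 400.0, 800.0));
    return 0;
}
```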

When searching blindly for new objects, this distance is not known beforehand. Dispersion removal therefore has to be performed for a wide range of possible distances, in real time, so that only signals of interest are saved for later analysis.
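A brute-force way to do this on a GPU is sketched below as a CUDA kernel (names and data layout are illustrative, not the actual pipeline's; host-side setup and the precomputation of per-channel delays are omitted). Each trial distance is expressed as a trial dispersion measure, and every thread sums the appropriately shifted frequency channels for one output time sample:

```cuda
// Brute-force incoherent dedispersion sketch.
// input:  [nchans][nsamples]   frequency-channelised power
// shifts: [ndms][nchans]       per-channel delays in samples,
//                              precomputed on the host per trial DM
// output: [ndms][nsamples]     one de-dispersed time series per DM
__global__ void dedisperse(const float *input, float *output,
                           const int *shifts,
                           int nchans, int nsamples, int ndms)
{
    int t  = blockIdx.x * blockDim.x + threadIdx.x;  // time sample
    int dm = blockIdx.y;                             // trial DM index
    if (t >= nsamples || dm >= ndms)
        return;

    float sum = 0.0f;
    for (int c = 0; c < nchans; c++) {
        int shifted = t + shifts[dm * nchans + c];
        if (shifted < nsamples)
            sum += input[c * nsamples + shifted];
    }
    output[dm * nsamples + t] = sum;
}
```

Because every (trial DM, time sample) pair is independent, the work maps naturally onto thousands of GPU threads, which is what makes real-time blind searches feasible.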

Several processing stages are required to properly identify transient signals: combining the signals from each radio antenna to generate multiple simultaneous fields-of-view on the sky; making sure that no man-made signals contaminate the data; splitting the observing band (the range of frequencies observable by a radio telescope) into multiple smaller channels to enhance dispersed signals; applying dispersion removal for multiple distances; post-processing the data to extract interesting events; and automatically classifying such events for future follow-up observations.
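Expressed as code, the chain might look like the skeleton below. Every name here is hypothetical, and in a real pipeline each stage hides one or more GPU kernels:

```cuda
// Hypothetical skeleton of the transient-detection chain; the stage
// bodies are stubs, shown only to make the order of operations concrete.
struct Block { /* one buffer of telescope data */ };

void beamform(Block &b)          { /* combine antennas into beams    */ }
void excise_rfi(Block &b)        { /* remove man-made interference   */ }
void channelise(Block &b)        { /* split the band into channels   */ }
void dedisperse(Block &b)        { /* trial many dispersion measures */ }
void detect_candidates(Block &b) { /* threshold and cluster events   */ }
void classify(Block &b)          { /* label candidates for follow-up */ }

int main()
{
    Block block;                 // in practice, a continuous stream
    beamform(block);
    excise_rfi(block);
    channelise(block);
    dedisperse(block);
    detect_candidates(block);
    classify(block);
    return 0;
}
```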

The result of my PhD work is a software tool which can perform all of the above in real time on any radio telescope, with lower processing, power and space requirements than a conventional CPU-based cluster. This pipeline is being investigated as a potential transient detection system for the SKA.

The PhD was carried out following the award of a STEPS scholarship, partly financed by the EU – European Social Fund (ESF) under Operational Programme II – Cohesion Policy 2007-2013, Empowering People for More Jobs and a Better Quality of Life.
