How Radar Changed The Second World War

The use of radio waves to detect objects beyond the range of sight was first developed into a practical technology by British scientists and engineers in the 1930s. This new equipment, known as radar (‘radio detection and ranging’), would play a major role during the Second World War and in subsequent conflicts.

Radar detects an object at a distance by transmitting a burst of radio energy and measuring the time it takes for the 'echo' reflected by the object to return to the receiver. Because radio waves travel at a known, constant speed, this time reveals the target's range; the height and bearing (the direction of the target from the station) can also be determined.
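The timing principle above can be sketched in a few lines: the measured delay covers the round trip out to the target and back, so the one-way range is half the delay multiplied by the speed of light. This is an illustrative sketch, not period equipment logic; the function name and the 80-mile worked example are assumptions for demonstration.

```python
# Sketch of the echo-timing calculation described above.
# Assumption: the pulse travels at the speed of light and the measured
# delay is the full round trip, so one-way range = c * delay / 2.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def range_from_echo_delay(delay_s: float) -> float:
    """Return the one-way distance to a target in metres,
    given the round-trip echo delay in seconds."""
    return SPEED_OF_LIGHT_M_S * delay_s / 2

# A target 80 miles away (about 128,748 m) would return an echo
# after roughly 0.86 milliseconds.
delay = 2 * 128_748 / SPEED_OF_LIGHT_M_S
print(f"{range_from_echo_delay(delay):.0f} m")
```

Even at the Chain Home stations' longest detection ranges, the echo returns in under a millisecond, which is why radar could give timely warning of fast-moving aircraft.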

By the outbreak of the Second World War in 1939, a chain of early warning radar stations, called Chain Home (CH) stations, had already been built along the south and east coasts of Britain. Radar could pick up incoming enemy aircraft at a range of 80 miles and played a crucial role in the Battle of Britain by giving air defences early warning of German attacks.

The CH stations were huge, static installations with steel transmitter masts over 100 metres high. But the invention of the cavity magnetron in 1940, which generated much more powerful radio waves at a much shorter wavelength, allowed far more compact, powerful and sensitive radar sets to be produced. This gave the Allies an important technological advantage over the designs used by the Axis forces, and new equipment was rapidly developed for use in aircraft, ships and land warfare.
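The link between shorter wavelength and smaller equipment follows from the relation wavelength = speed of light / frequency, since antenna size scales with wavelength. A brief sketch, using approximate, illustrative frequencies (roughly 25 MHz for the Chain Home band and roughly 3 GHz for early magnetron sets; both figures are assumptions for the sake of the arithmetic):

```python
# Why the magnetron allowed compact radar: wavelength = c / frequency,
# and an antenna must be comparable in size to the wavelength it handles.
# The frequencies below are approximate and for illustration only.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in metres for a given frequency in hertz."""
    return SPEED_OF_LIGHT_M_S / frequency_hz

chain_home = wavelength_m(25e6)  # ~25 MHz band: wavelength on the order of 12 m
magnetron = wavelength_m(3e9)    # ~3 GHz: wavelength on the order of 10 cm
print(f"Chain Home: {chain_home:.1f} m, magnetron: {magnetron * 100:.0f} cm")
```

A hundredfold drop in wavelength, from metres to centimetres, is what made it possible to move radar from fixed coastal towers into the nose of an aircraft.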