Deep Learning

Using 120 billion IoT sensor readings to improve London traffic

Transport for London (TfL) wanted to explore innovative uses of traffic data to find new ways to reduce road congestion. To make this happen, they invited Datatonic and some of London's brightest data teams to a week-long, traffic-beating hackathon.

  • 14,000 road sensors
  • 120 billion data points
  • LSTM prediction model

Designed to gather data from all over London, TfL's Urban Traffic Control (UTC) system collects car activity records via 14,000 individual road sensors located throughout the city. We were given three months' worth of these records to create our model, totalling over 120 billion data points.

Our solution

We began by building a live visualisation of driver activity in the city, converting TfL's raw sensor data into two common traffic engineering metrics, occupancy and flow, and using them to infer the volume, frequency, and location of traffic throughout the city at any given time.
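As a rough illustration of that conversion step, here is a minimal sketch of deriving occupancy and flow from raw loop-sensor records. The record layout, field names, sampling behaviour, and window size are all assumptions for illustration, not TfL's actual UTC schema: each record is taken to be a `(sensor_id, timestamp, occupied)` sample, where `occupied` is 1 while a vehicle sits over the inductive loop.

```python
# Hypothetical sketch: computing occupancy and flow per sensor per
# one-minute window. Record format and window size are assumptions,
# not the actual TfL UTC data schema.
from collections import defaultdict

WINDOW_S = 60  # assumed aggregation window, in seconds


def aggregate(records):
    """Per (sensor, window): occupancy = fraction of samples occupied;
    flow = vehicle count, counted as 0 -> 1 transitions of the loop."""
    windows = defaultdict(list)
    for sensor_id, ts, occupied in records:
        windows[(sensor_id, int(ts // WINDOW_S))].append((ts, occupied))

    out = {}
    for key, samples in windows.items():
        samples.sort()
        bits = [b for _, b in samples]
        occupancy = sum(bits) / len(bits)
        # each 0 -> 1 transition marks a vehicle arriving over the loop
        flow = sum(1 for prev, cur in zip(bits, bits[1:]) if cur and not prev)
        out[key] = (occupancy, flow)
    return out
```

Occupancy correlates with how densely packed vehicles are, while flow counts how many pass; congestion shows up as high occupancy with falling flow, which is what makes the pair useful as model inputs.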

Next, we designed a deep learning model capable of accurately predicting traffic conditions 40 minutes into the future. We did this by identifying the traffic conditions associated with congestion, and training the model to recognise those patterns across the vast dataset.
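The source names the model as an LSTM, a recurrent network suited to sequence data like per-minute sensor readings. As an illustrative sketch only (not the production model), the forward step of a single LSTM cell in NumPy looks like this, consuming a window of past (occupancy, flow) readings; the weights here are random placeholders, and a real model would train them and add an output layer for the 40-minute-ahead prediction.

```python
# Illustrative LSTM cell forward pass in NumPy; weights are random
# placeholders, not a trained traffic model.
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def lstm_step(x, h, c, W, U, b):
    """One LSTM step. x: (n_in,), h and c: (n_hid,),
    W: (4*n_hid, n_in), U: (4*n_hid, n_hid), b: (4*n_hid,)."""
    z = W @ x + U @ h + b
    n = h.size
    i = sigmoid(z[:n])         # input gate: how much new info to admit
    f = sigmoid(z[n:2 * n])    # forget gate: how much old state to keep
    o = sigmoid(z[2 * n:3 * n])  # output gate
    g = np.tanh(z[3 * n:])     # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new


# Unroll over a short sequence of (occupancy, flow) readings:
rng = np.random.default_rng(0)
n_in, n_hid = 2, 8
W = rng.normal(0, 0.1, (4 * n_hid, n_in))
U = rng.normal(0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for x in rng.random((10, n_in)):  # 10 past time steps of sensor metrics
    h, c = lstm_step(x, h, c, W, U, b)
# a final dense layer on h would yield the 40-minute-ahead prediction
```

The gating structure is what lets the model carry information across many time steps, so that, for example, a build-up observed twenty minutes ago can still influence the prediction forty minutes ahead.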

In the end, we built a robust model that predicts road congestion independently of road network layout, enabling TfL to proactively identify (and potentially prevent) congestion, respond quickly to road incidents, and better coordinate the flow of rush-hour traffic on a daily basis, potentially saving countless travel hours for drivers all over London!