Modelling pedestrian movement with SENSE™: Old science, new approach

November 28, 2016

When a client challenged us to model the mass evacuation of a wide tourist area in London with many potential scenarios and the ability to detail the evacuation per building, the question of which tool to employ needed to be addressed.

This article has been written for the fourth edition of the periodical Impact, published by the Operational Research Society (OR Society).

Movement Strategies, a company that has specialised in the analysis of crowd movement for over 12 years, has created its own simulation tool to apply its knowledge of crowd movement, human behaviour and fluid dynamic principles.

In the aftermath of 9/11, the question of how to evacuate multiple buildings safely and quickly was first raised. When planning for this scenario, there are a number of questions that need to be considered, such as the approach to develop, the criteria for defining safe evacuation, which tools to use and so on. Simon Ancliffe, Chairman and founder of Movement Strategies, is a recognised crowd movement specialist, spanning observation, data collection, analysis, modelling and consulting. This knowledge of people’s movement and behaviour, from both a qualitative and a quantitative perspective, is unique and lends itself to a number of other applications beyond large-scale mass evacuation plans. Music festivals, stadiums (through football and rugby clubs or architects), major events and cultural sites, amongst others, became key areas of focus.

In the last decade, there have been many opportunities in the UK and worldwide to apply and further develop these specific skills. The real catalyst for the business was the London 2012 Olympic and Paralympic Games. Movement Strategies helped the London Organising Committee of the Olympic and Paralympic Games (LOCOG) deliver a safe, enjoyable and welcoming Games, accommodating hundreds of thousands of spectators and staff. This work started in 2006 with the analysis of the design of the Olympic venues and continued throughout the Games, analysing the movement of people to and from the Olympic venues in real time. Throughout this project, many questions were answered, e.g.: “There are X people in A, willing to go to B. What routes will they use? How long does it take to walk from A to B? Is the route wide enough to accommodate this demand? Are there any pinch points along the route? Are queues formed? How long are the queues? How long do they take to dissipate? What Level of Service are these pedestrians experiencing?”

In small-scale operations, these questions could be answered with basic assumptions, such as an average walk speed and distance and a simple division. In more complex cases, modelling tools need to be used to understand the various permutations and scenarios. See the box for a discussion of different potential modelling tools.

This brings us back to our client’s challenge: modelling the mass evacuation of a wide tourist area in London, with many potential scenarios and the ability to detail the evacuation per building. The question of which tool to employ needed to be addressed. A dynamic model, SENSE™, had previously been developed at Movement Strategies using VBA for Excel. It was a smart way of computing pedestrian movement across a network and applying specific pedestrian parameters. The weakness of this approach was the calculation time for large networks and the limited editing tools provided by Excel. This was the first option. Other established agent-based modelling tools, such as LEGION, were assessed as unsuitable for this large modelled area.

Description of SENSE™ Modelling

Movement Strategies investigated the use of simulation software designed for vehicle modelling, applying pedestrian characteristics to the modelled network and flows. Whilst this was considered an option, there are significant differences between roads modelled for cars and footpaths modelled for pedestrians. In the latter case, crossing flows share the same capacity (the width of the pavement), and the network capacity in terms of throughput depends on the uni- or bi-directional characteristic of the flow (when density reaches a critical level, people don’t walk at the same speed when following the flow as when facing it). This could be modelled with a simulation tool for which an Application Programming Interface (API) was available, enabling the modeller to dynamically program the characteristics of the network so that they fit pedestrian properties. We chose not to pursue this option.
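The speed-density effect described above can be sketched with a simple linear (Greenshields-style) relation. All parameter values below, including the counter-flow penalty factor, are illustrative assumptions for this sketch, not the relations used in SENSE™:

```python
# Illustrative sketch only: a linear speed-density relation with a
# hypothetical penalty factor applied when the flow is bi-directional.
# All numbers are assumptions for illustration.

FREE_WALK_SPEED = 1.4      # m/s, typical unimpeded walking speed
JAM_DENSITY = 5.0          # people/m^2 at which movement effectively stops
COUNTERFLOW_FACTOR = 0.85  # assumed speed penalty under bi-directional flow

def walk_speed(density, bidirectional=False):
    """Average walk speed (m/s) as a function of crowd density (people/m^2)."""
    speed = FREE_WALK_SPEED * max(0.0, 1.0 - density / JAM_DENSITY)
    if bidirectional:
        speed *= COUNTERFLOW_FACTOR  # opposing flows share the same width
    return speed

def throughput(density, width, bidirectional=False):
    """Flow (people/s) through a footpath of the given width (m)."""
    return density * walk_speed(density, bidirectional) * width
```

The key point the sketch captures is that throughput is not a fixed property of a footpath: it depends on the density and directionality of the flow using it.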

After internal discussion, and considering the skillsets available in our team (urban planners, crowd modellers, software programmers and mathematicians), we decided to build our own tool, designed for our client’s specific needs. Whilst most of the techniques are not new, by combining our knowledge from different fields we felt we were able to create an in-house tool that we could customise according to our needs. More importantly, we would be able to explain the outputs at any level of analysis, without having to manage a ‘black box’ as in some commercial tools.

The requirements were as follows: the tool needed to run a dynamic model that could handle millions of people, across areas of several square miles, with a time resolution of under one minute. It also needed to be suitable for a wide range of environments, and the model outputs had to be of high quality. A Geographical Information System (GIS) framework was identified as an intuitive way to model a network of routes, with properties defining the way people can move through those routes. A GIS tool enables the modelling of a network configuration and demand scenario, and the communication of results graphically; this way, both the modeller and the client can associate, for example, a pinch point in the model with a real physical location they recognise. QGIS (http://www.qgis.org/en/site/), a free, open-source, stable GIS package that provides an API allowing customised plugins, was selected as the core package. The core algorithm that computes the evolution across the network had to be fast and able to handle large tables in memory, so C++ was selected.

The resulting tool, also called SENSE™ like its predecessor, is a network-based simulation tool. Each network link (footpath, road, pavement, stairs, escalators etc.) is modelled as a line with two main properties: a throughput capacity and a storage capacity. To understand this, one can imagine people as a fluid, and the network they walk through as an upside-down bottle. The throughput capacity is the diameter of the neck, whereas the storage capacity is the volume of the bottle itself.

Taking this simple example, the model behaviour can be understood as follows: if water is poured into a pipe of varying diameter, what is the throughput at the end of the pipe? Will the water spill out of its container? These are exactly the questions the SENSE™ tool answers when the water is replaced by pedestrians and the pipe by a network of streets and spaces.
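The bottle analogy can be sketched in a few lines of code. The class, names and numbers below are illustrative assumptions for this article, not the actual SENSE™ implementation:

```python
# Minimal sketch of the bottle analogy: each link has a storage capacity
# (the volume of the bottle) and a throughput capacity (the neck).

class Link:
    def __init__(self, storage_capacity, throughput_capacity):
        self.storage_capacity = storage_capacity        # max people on the link
        self.throughput_capacity = throughput_capacity  # max people out per step
        self.population = 0

    def pour_in(self, people):
        """Admit as many people as spare storage allows; return the overflow."""
        admitted = min(people, self.storage_capacity - self.population)
        self.population += admitted
        return people - admitted  # overflow queues upstream of the link

    def drain(self):
        """Release up to the throughput capacity per time step."""
        released = min(self.population, self.throughput_capacity)
        self.population -= released
        return released

link = Link(storage_capacity=500, throughput_capacity=60)
overflow = link.pour_in(550)  # 50 people cannot fit and wait upstream
out = link.drain()            # 60 people pass the "neck" this step
```

When links like this are chained together, the overflow of one becomes the queue upstream of the next, which is how pinch points emerge in the model.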

The SENSE™ tool is split into two main modules:

  • A route choice module, based on Dijkstra’s algorithm (first conceived in 1956)
  • A dynamic assignment module, based on basic flow conservation principles, which have been applied in fluid mechanics since the 19th century.
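As a sketch of the first module’s core idea, here is a compact Dijkstra implementation over a toy network whose edge costs are walk times; the network, junction names and times are invented for illustration and are not a SENSE™ dataset:

```python
import heapq

def shortest_walk(graph, origin, destination):
    """Return (total walk time, route) minimising cost from origin to destination."""
    queue = [(0.0, origin, [origin])]  # priority queue ordered by cumulative cost
    visited = set()
    while queue:
        cost, node, route = heapq.heappop(queue)
        if node == destination:
            return cost, route
        if node in visited:
            continue
        visited.add(node)
        for neighbour, walk_time in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + walk_time, neighbour, route + [neighbour]))
    return float("inf"), []  # destination unreachable

# Walk times in seconds between junctions A..D
network = {
    "A": {"B": 120, "C": 300},
    "B": {"C": 60, "D": 240},
    "C": {"D": 90},
}
cost, route = shortest_walk(network, "A", "D")  # A -> B -> C -> D in 270 s
```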

The route choice module enables a user to find, for any Origin Destination demand, a set of routes that minimise the cost (usually the walk time). The dynamic assignment module computes the evolution over time of the demand across the routes provided by the route choice module, based on the simple principle described below:

In any section of the network:

Population at time t = Population at time (t-1) + section inflow at time t – section outflow at time t

The core algorithm is based on the Finite Difference Method (a close relative of the Finite Element Method in one dimension), which gives a framework for transforming a continuous problem into a discrete problem in both time and space, and one that is therefore computer friendly.
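The conservation equation above can be sketched as a finite-difference simulation over a chain of sections, with each section’s outflow capped by a throughput capacity. The function, the chain topology and all numbers below are illustrative assumptions, not the SENSE™ core:

```python
# Finite-difference sketch of the dynamic assignment principle:
#   population[t] = population[t-1] + inflow[t] - outflow[t]
# with outflow capped by each section's throughput capacity.

def simulate(demand, capacities, steps):
    """Propagate demand (people entering per step) along a chain of sections."""
    populations = [0] * len(capacities)
    history = []
    for t in range(steps):
        inflow = demand[t] if t < len(demand) else 0
        for i, cap in enumerate(capacities):
            outflow = min(populations[i] + inflow, cap)          # throughput constraint
            populations[i] = populations[i] + inflow - outflow   # flow conservation
            inflow = outflow  # this section's outflow feeds the next section
        history.append(list(populations))
    return history

# 100 people arrive per step for 3 steps; the middle section is a pinch point.
history = simulate(demand=[100, 100, 100], capacities=[80, 40, 80], steps=6)
```

Running this, a queue builds up in the middle section (the pinch point) while demand arrives, then dissipates at the rate of that section’s capacity once the demand stops, which is precisely the behaviour the article describes for queues on a pedestrian network.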

Examples of outputs provided by the tool can be seen in Figures 1-3. Here we analyse the egress phase of an event in London’s Olympic Stadium. Figure 1 indicates that many more people use the northern entrance, as opposed to the southern entrance. Figure 2 indicates that the time taken for the northern entrance of Stratford Station to ‘clear’ is much longer than for Stratford International and the southern entrance. Knowledge of the network, local constraints and so on then enables us to understand the reasons for this imbalanced demand and to work out with our client the best solutions to optimise the system.

Olympic Stadium Pedestrian Modelling: Static assignment of the pedestrian flow across the Olympic Park
Figure 1: Static assignment (route choice module output)
Olympic Stadium Pedestrian Modelling: Dynamic assignment of the pedestrian flow across the Olympic Park
Figure 2: Dynamic evolution of the density on the network (time displayed at the top left of each picture).

Olympic Stadium Pedestrian Modelling: Maximum density experienced by the fans travelling from the Stadium to their transport nodes
Olympic Stadium Pedestrian Modelling: Population profiles at stadium gates
Olympic Stadium Pedestrian Modelling: Population profiles at public transportation stations
Figure 3: Comparison between exit profile from the stadium and arrival profile at the public transportation gatelines.

Figure 3 shows profiles of population versus time at the northern and southern stadium gates as well as at the three different station entrances. One can see that the profile showing the population at Stratford Station – Northern entrance lasts more than an hour and a half, whereas the vicinity of the stadium clears in less than 30 minutes and the other station entrances clear in less than 50 minutes.

To conclude, we have built on the innovations and work of Taylor (mathematics), Navier and Stokes (fluid dynamics), Lighthill and Whitham (traffic models) and Dijkstra (route choice), and combined this with our knowledge of people movement, physics, maths and open-source development frameworks to design a tool capable of answering most of our client’s questions with regard to the movement of crowds. SENSE™ is now a mature software tool, with further developments carried out in partnership with the University of Southampton in the area of route choice optimisation. We are also focusing on the modelling of dynamic events that occur within the network (e.g. stop-and-go situations), and we are undertaking further research on flow-density curves (including the analysis of mobile phone data to better understand how crowd density affects the average walk speed) and implementing these as enhancements in the tool.

Daniel Marin is Managing Consultant at Movement Strategies. Prior to joining Movement Strategies in April 2015, he worked for seven years in Paris on public and private transportation modelling. His background is in applied mathematics, modelling and simulation.