Planning by numbers: can technology help us make better places?

4 Sep 2018

The urban planning industry’s digital transformation has been a long time coming, says Euan Mills, head of Future Cities Catapult’s Future of Planning programme. But data-driven placemaking is now here.

This essay is one of a collection of articles created by Lendlease and the Centre for London, written by a selection of built environment experts. These thought-provoking commentaries explore the great power and great responsibility that come with placemaking – and how to create city spaces that enhance the wellbeing of everyone who experiences them.



In the last century cities have become the most common habitat for humans. But despite the massive growth in people living in urban environments, we are still unable to consistently create places that are conducive to our happiness and wellbeing. 

While we’ve seen significant advancements in our tools and technology across the spectrum of human activities, the way we plan and design our own habitats remains archaic. As a result, we continue to build and grow cities that undermine our physical and mental health and lack the resilience and sustainability needed to change and adapt in the future. 

Jan Gehl famously observed: “We know more about good habitats for mountain gorillas, Siberian tigers, or panda bears than we know about a good urban habitat for Homo sapiens.” Despite being in an age of ‘Big Data’, the quality and relevance of the data we have about our cities is surprisingly poor. Our data lacks granularity and is mostly out of date: yet it is from this that we formulate the rules and policies that guide the quality of the cities we build.
 
However, we are at a point in time where the opportunities presented by new technology can radically change how we build our urban environments. We can now create tools to allow us to monitor and understand what makes good quality places faster and better than we ever could. Our policies can be less ambiguous and can also be designed to be regularly adjusted based on outcomes; decision-making can be fully inclusive and accountable. 

The trend 

In 1965, Gordon Moore observed that the number of transistors that could be fitted onto an integrated circuit was doubling every year – this became known as Moore’s Law. His observation has proved surprisingly accurate to this day. The price and size of our computers have decreased exponentially while their power has increased. Today we carry in our pockets computers with processing power comparable to that of a mouse brain. 

The low price and miniaturisation of processors mean we have deployed sensors in nearly everything: a mesh of interlinked passive sensors in our watches, clothes, shoes, rubbish bins, cars, mobile phones and homes collects and creates a constant stream of data, largely unbeknownst to us. We actively add to this web of data through the 2.9 million emails we send per second, the 20 hours of video we upload to YouTube per minute, the 50 million Tweets per day, and the plethora of social media posts, Google searches and Amazon purchases that are continuously tracked. The combination of these vast quantities of data and increasing processing power means our machines become smarter and learn for themselves. At the start of 2016, DeepMind, a British company acquired by Google, taught a general-purpose software program how to play the ancient Chinese game of Go. 

Despite its simple rules, Go has so many possible positions that brute-force calculation is infeasible, and the best players are known to rely on intuition. This general-purpose software program not only mastered the game: it beat the South Korean professional Lee Sedol, one of the best Go players in the world. 

Through the learning capabilities of machines, we now have computers that mark school essays, provide legal advice, fly planes and drive cars, and that have proven more effective at identifying some forms of cancer than clinical pathologists. Ray Kurzweil predicts that by 2029 we will have machines that can pass the Turing test – a test which seeks to distinguish computers from humans through language – and that by 2045 we will have reached “technological singularity”, the point at which human and machine intelligence merge. 

To put this into perspective, most of our housing targets are based on population demographic projections to 2050, and our Local Development Plans set policies for a period of 25 years. 

Today 

These technological advancements may seem far-fetched and irrelevant to urban planners, but their early manifestations have already changed the ways we use cities. As well as the often-cited Airbnb and Uber, there are subtler but equally impactful innovations, such as the live traffic information provided by Waze, the multi-modal transport integration of Citymapper, and the car clubs, bike sharing and co-working spaces that could not scale without current information and communications technology (ICT). 

While digitisation is well on its way to transforming how we live in and use cities, the way we design and plan them remains firmly in the 20th century. The McKinsey Industry Digitization Index for Europe shows the construction industry at the bottom of the list. One glance at the workflow of a Local Planning Authority, in both its plan-making and development management roles, will reveal archaic ways of working that make the construction industry look relatively advanced. 

Local Development Plans take years to put together and cost hundreds of thousands of pounds; by the time they eventually get through examination, their evidence and policies are out of date and the whole process needs to start again. 

The policies themselves are based on evidence from a motley collection of rarely read research reports and out-of-date surveys, where the original data is hidden and inaccessible. These evidence reports are mostly outsourced to the same consultants that work for developers, who repackage their findings in support of the developers’ planning applications. 

Occasionally evidence is collected in-house: for example, through a ‘call for sites’, where outdated methodologies such as voluntary submissions are used to estimate the total amount of developable land in a city, resulting in huge margins of error. Yet critical policies on housing density, design and building height are based on this. 

The monitoring of our plans and their associated development is done through oversimplified Key Performance Indicators (KPIs). These are formulated around the metrics that happen to be available, rather than what we would like to know. For example, we measure the number of homes built and the quantity of green belt retained; but not whether homes are occupied, whether they provide good-quality living conditions, or whether our open spaces are liked and well used. 

The planning application process also relies on time-consuming analogue processes, ripe for error and obfuscation. Planning submissions still consist of boxes of printed drawings and supporting documents, sometimes accompanied by a CD with hundreds of ambiguously titled, non-machine-readable PDF documents – which under-resourced officers rarely have the time to read. 

Officers trawl through boxes of printed scale drawings, manually measuring the areas of flats or the distances between bedroom windows, in a hopeless attempt to fully understand the complexity of major development proposals. Decisions are then made on a case-by-case basis, with a haphazard understanding of the cumulative impact of the hundreds, if not thousands, of other developments that go through the planning department every year.
 
Finally, there are the communities, who only ever find out about local development if someone happens to stop and read an ambiguously worded laminated A4 notice fixed to a lamp-post. Even after deciphering the technical jargon, the chances of communities understanding these proposals are minute. They often rely on self-proclaimed community activists, some of whom have entrenched anti-development attitudes and will spread misinformation with more ease and traction than formal channels. 

Tomorrow 

There are, however, early seeds of change. Space Syntax has managed to quantify and predict improvements to permeability; State of Place is quantifying quality of place and putting an economic value to it; UrbanPlanAR is exploring the use of augmented reality to visualise proposed new buildings in their actual locations; Land Insight and Urban Intelligence are creating a growing database of planning application and policy data; and Commonplace and Stickyworld are improving community participation. 

These are just some of the many innovators and researchers that are starting to bring new technology to the planning system. Meanwhile, the government continues to build on the success of Building Information Modelling (BIM) Level 2 and is upgrading it to BIM Level 3 – to bring benefits not only to construction but also to asset management and how we plan cities. 

Today it would only take a small tweak to our planning system to ensure all major planning applications are submitted as 3D BIM files, with all the data that planning officers need to assess proposals fully embedded into them. This consolidation of data would allow us to create tools that automatically assess development proposals against planning policies and building controls, to identify issues before the developer enters a costly pre-application process. 

This early assessment could include the microclimatic impact on the surrounding area, such as sunlight and wind; the visual impact of the development on conservation areas and the setting of listed buildings; residential quality, through an assessment of the dimensions and orientation of residential units; and even viability, compared against live data on sales values and construction costs. 
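
To make this concrete, here is a minimal sketch of what one such automated check might look like, assuming residential unit data has already been extracted from the submitted BIM model. The field names, thresholds and example units are illustrative assumptions, not actual policy values:

```python
from dataclasses import dataclass

# Hypothetical policy thresholds -- real values would come from the Local Plan.
MIN_FLOOR_AREA_M2 = {1: 50, 2: 61, 3: 74}  # minimum floor area by bedroom count
MIN_CEILING_HEIGHT_M = 2.5

@dataclass
class Unit:
    ref: str                 # unit reference in the BIM model
    bedrooms: int
    floor_area_m2: float
    ceiling_height_m: float
    aspects: int             # number of external-facing orientations
    orientation: str         # main orientation, e.g. "N", "SE"

def assess(unit: Unit) -> list[str]:
    """Return the policy issues found for one residential unit."""
    issues = []
    min_area = MIN_FLOOR_AREA_M2.get(unit.bedrooms)
    if min_area and unit.floor_area_m2 < min_area:
        issues.append(f"{unit.ref}: {unit.floor_area_m2}m2 is below the {min_area}m2 minimum")
    if unit.ceiling_height_m < MIN_CEILING_HEIGHT_M:
        issues.append(f"{unit.ref}: ceiling height below {MIN_CEILING_HEIGHT_M}m")
    if unit.aspects == 1 and unit.orientation == "N":
        issues.append(f"{unit.ref}: single-aspect, north-facing unit")
    return issues

for unit in [Unit("A-101", 2, 58.0, 2.6, 1, "N"), Unit("A-102", 3, 80.0, 2.4, 2, "SE")]:
    for issue in assess(unit):
        print(issue)
```

In practice, each rule would reference the specific policy clause it enforces, so a failed check would point the applicant directly to the requirement it breaches.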

Developers could then be notified if their scheme significantly diverges from policy requirements; they could modify the scheme accordingly, or the application could be passed to a case officer for negotiation. Such a system would critically cut down the levels of uncertainty for developers and allow planners to focus on more complex decisions. 

Planning departments are already looking to create 3D City Information Models (CIM), which would allow the assessment of planning proposals to be done within their rapidly changing context. Planning officers could visualise each application and compare it to developments still in construction, those yet to be built, and those in the planning system at the time, comparing anything from viability to car parking ratios or building height. This would give officers a more robust understanding of the impacts of every development they assess, and their recommendations would be significantly better informed than they are today.
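
As a sketch of how such a model might be queried – the records and fields below are entirely invented – comparing a new application against the surrounding pipeline could look something like this:

```python
# Toy City Information Model: nearby schemes at every stage of the pipeline.
pipeline = [
    {"ref": "18/0042", "stage": "under construction", "homes": 120, "parking_ratio": 0.3, "height_m": 34},
    {"ref": "18/0108", "stage": "consented",          "homes": 80,  "parking_ratio": 0.5, "height_m": 22},
    {"ref": "18/0215", "stage": "in planning",        "homes": 200, "parking_ratio": 0.2, "height_m": 45},
]

proposal = {"ref": "18/0377", "homes": 150, "parking_ratio": 0.4, "height_m": 40}

mean_parking = sum(s["parking_ratio"] for s in pipeline) / len(pipeline)
tallest = max(s["height_m"] for s in pipeline)

print(f"context mean parking ratio: {mean_parking:.2f} (proposal: {proposal['parking_ratio']})")
print(f"tallest scheme in the pipeline: {tallest}m (proposal: {proposal['height_m']}m)")
print(f"cumulative homes if all schemes proceed: {sum(s['homes'] for s in pipeline) + proposal['homes']}")
```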
 
Every new proposal tested in this model would add to a growing database, providing the Planning Authority with valuable information about commercial activity, developer priorities, trends in building typologies and much more. This database could be further augmented with the increasing amount of data already available from passive and active sources such as telecoms, social media and financial transactions. 

We could also target data collection through passive sensors built into streets and buildings, collecting data on, for example, occupancy and dwell time. This would enable us to better understand the types of places and spaces where citizens spend time – which, combined with sentiment analysis of social media, would allow us to start measuring citizen satisfaction and wellbeing in the public realm.
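
To indicate how even a crude version of that analysis works, here is a toy lexicon-based sentiment scorer for posts about a public space. A real system would use a trained model and proper language handling; the posts and word lists here are invented for illustration:

```python
# Minimal lexicon-based sentiment scoring for geotagged posts about one space.
POSITIVE = {"love", "lovely", "great", "beautiful", "relaxing", "safe"}
NEGATIVE = {"dirty", "noisy", "unsafe", "crowded", "avoid", "litter"}

def score(post: str) -> int:
    """Positive minus negative word count; above zero reads as satisfied."""
    words = [w.strip(",.!?") for w in post.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "Lovely morning run, the park feels safe and relaxing",
    "So much litter by the gates, noisy and crowded again",
]
average = sum(score(p) for p in posts) / len(posts)
print(f"average sentiment for this space: {average:+.2f}")
```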
 
A data-rich, multi-layered live CIM would then become a tool not only for monitoring and visualising things which are easily quantifiable – such as the number of homes, social infrastructure and green space – but also for monitoring everything else we need to know about cities, such as whether homes are occupied, how high streets are being used, and possibly even how citizens feel in our parks and public spaces. 

This level of feedback would allow us to design our plans so that policies can be tweaked in response to the types of places we want, rather than merely the numbers of homes or jobs. For example, if we have policies that stipulate the need for balconies or open spaces, and we identify that these are not being used when they face north or are overshadowed, we can tweak the policy accordingly. 
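
A toy version of that feedback check, with hypothetical usage figures standing in for sensor or survey data, might look like this:

```python
# Share of balconies observed in use, by orientation -- hypothetical figures.
balcony_use = {"N": 0.12, "E": 0.46, "S": 0.71, "W": 0.58}
REVIEW_THRESHOLD = 0.25  # below this, the balcony policy is flagged for review

for orientation, share in balcony_use.items():
    if share < REVIEW_THRESHOLD:
        print(f"flag for policy review: {orientation}-facing balconies used only {share:.0%} of the time")
```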

Policies regarding planning gain, for example, could be fine-tuned to respond to economic cycles, increasing requirements during times of prosperity and reducing them during downturns. Mixed-use policies could be altered in response to the changing nature of businesses or the proliferation of bookmakers and estate agents. This live, data-rich CIM would also allow us to build robust agent-based simulations, in which we could test and prototype policies and developments before implementing them. 
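
By way of illustration only, the skeleton of such a simulation might model each developer as an independent agent deciding whether a scheme remains viable under a given planning-gain rate. The margins and hurdle rate below are invented numbers, and a real model would add interaction between agents, land supply and time steps:

```python
import random

random.seed(1)  # reproducible illustration

def schemes_proceeding(gain_rate: float, n_developers: int = 1000) -> int:
    """Count developer agents whose schemes stay viable at a planning-gain rate."""
    built = 0
    for _ in range(n_developers):
        margin = random.gauss(0.20, 0.08)  # agent's expected profit margin
        if margin - gain_rate > 0.05:      # proceed only above a 5% hurdle rate
            built += 1
    return built

for rate in (0.05, 0.10, 0.15):
    print(f"planning-gain rate {rate:.0%}: {schemes_proceeding(rate)} of 1000 schemes proceed")
```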

With a system where we can live-monitor and test the impact of our policies, we can redesign the format of our Local Plans: rather than fixed, self-contained publications, they could become flexible, agile lists of policies with varied time frames and sell-by dates. Some might be written to achieve yearly or even monthly short-term outcomes, such as improving air quality, while others might be concerned with long-term requirements, such as the need for daylight and sunlight in the public realm. 
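
One way to picture such a plan is as a machine-readable policy register rather than a publication, with every policy carrying its own metric and review date. All of the entries below are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Policy:
    code: str
    objective: str
    metric: str
    target: float
    review_by: date  # the policy's "sell-by date"

register = [
    Policy("AQ1", "Improve air quality", "annual mean NO2 (ug/m3)", 40.0, date(2019, 9, 1)),
    Policy("D5", "Daylight in the public realm", "hours of direct sun at equinox", 2.0, date(2030, 1, 1)),
]

review_horizon = date(2020, 6, 1)  # e.g. looking one year ahead
for policy in register:
    if policy.review_by <= review_horizon:
        print(f"{policy.code} ({policy.objective}) falls due for review by {policy.review_by}")
```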

Finally, having a spatial simulation of the changing city would also allow for much greater engagement with citizens. Giving citizens access to this model would let them visualise everything in three dimensions and develop a greater understanding of the impacts that individual developments and policies are having in the areas they care about. They would also be significantly better informed and able to give good-quality, real-time feedback that genuinely influences how we plan. 

The first step 

Of course, it is unlikely that the planning system of the future will be as described here, but it is inevitable that by 2050 the way we plan cities will be very different to today. The changes will be slow and incremental, with no single product or service creating the fundamental paradigm shift that planning needs. 

Large technology companies will be eager to sell us complete off-the-shelf digital planning services, but we need to resist this. We need to ensure that the data we collect and give away is open and accessible to all; that the tools we use are transparent and interoperable; and that the policies we write are clear and accountable. Achieving this is not just about making our jobs easier, but about democratising how we plan cities, resulting in places that are better designed, better built and better equipped to respond to a changing world. 


This article was written by Euan Mills (“Author”) (independent of Lendlease) and originally published by the Centre for London. The opinions, views and representations expressed in the article are those of the Author. Lendlease has not independently verified any of the information in the article nor sought to confirm any underlying assumptions relied upon therein by the Author. Unless otherwise indicated, information contained in the article is current only as at the date it was originally published. Neither Lendlease, nor any member of the Lendlease Group, accepts any liability for any loss or damage suffered, howsoever arising, as a result of the use of any information in the article.