There are sensors everywhere collecting data, unnoticed. Data on how you move through space – on foot, on bike, in a car, in public transit. How much water and electricity you use, and when; how much garbage you produce and where it goes. Data on things we haven’t yet imagined. And these reams and reams of information will allow us to not only understand the city better but improve it. This is the future of the city.
We are only now at the earliest stages of this transformation. But consider the magnitude of this leap: only in the last decade have we seen the emergence of the quantified self – the idea of understanding and bettering oneself through data, whether it’s the number of calories consumed per day or the hours slept in a week. Now that concept is being applied to whole communities, and it will affect cities just as it has individuals: with decisions based on more empirical information than we’ve ever had before, making choices will be that much more clear cut. For example, you might think you walked a lot today, but a quick check of a pedometer will tell you exactly how far you actually went. Applied to a city, this approach will allow urban planners and politicians not only to see its workings more clearly, but also to tweak them for the better, whether that means using resources more efficiently, creating safer, more walkable neighbourhoods or identifying which forms of transit work best.
This is evidence trumping gut feeling. Proof toppling opinion.
Big data could be the key to bettering life in urban settings, says Dr. Constantine Kontokosta, deputy director of the Center for Urban Science and Progress at New York University.
The centre has partnered with real-estate developers Related Cos. and Canada’s Oxford Property Group on a project under way in New York that bills itself as the world’s first “quantified community.”
The Hudson Yards development in New York is massive in every sense: an 18-million-square-foot, mixed-use development built on 28 acres that will comprise 15 buildings, including office, residential and retail space, as well as restaurants, a hotel, a school, bars and 14 acres of open public space. Already under construction, the first phase of the project is expected to be complete in 2018. Much of it will be outfitted with sensors to measure everything from energy consumption to pedestrian traffic flow.
“Understanding the pulse of life in the neighbourhood could really help, we think, design an urban space that meets the needs of the people who are living and working there much better,” Kontokosta says. “Everything we’re doing is trying to improve quality of life in cities.”
If the big-data-for-cities movement has a motto, that’s it. Inspired by the possibilities of applying big data to cities, policy-makers, data scientists, urban planners and others are grappling with practical hurdles as well as more theoretical issues. The Global Cities Summit, held in Toronto last May, launched a platform for developing open, standardized data for cities around the world. This year’s Big Data Week, an event held in more than 20 cities around the world, examined the “social, political and technological impacts of data,” according to the event’s website.
Cities around the world are already experimenting with big-data projects and seeing significant improvements as a result.
In 2012, IBM partnered with the city of Lyon, France, in a first-of-its-kind program to reduce traffic congestion. Using real-time data and algorithms capable of learning best practices, traffic engineers are better equipped to make decisions on things such as road closures in the event of an accident or which detours to recommend through road signs to help drivers avoid traffic jams.
In New York, officials built an algorithm to help the fire department prioritize inspections of the 330,000 buildings it is responsible for by assigning each a risk score.
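The article doesn’t describe how the fire department’s model works internally, but the general idea of risk-scored prioritization can be sketched in a few lines. Everything below – the field names, the weights and the sample buildings – is invented for illustration, not taken from the actual system:

```python
# Hypothetical sketch of risk-scored inspection prioritization.
# Field names, weights and data are invented for illustration;
# the real model is not described in the article.

def risk_score(building):
    """Combine weighted risk factors into a single score."""
    return (2.0 * building["past_violations"]
            + 1.5 * building["years_since_inspection"]
            + 1.0 * building["occupancy_density"])

buildings = [
    {"id": "A", "past_violations": 3, "years_since_inspection": 5, "occupancy_density": 2},
    {"id": "B", "past_violations": 0, "years_since_inspection": 1, "occupancy_density": 4},
    {"id": "C", "past_violations": 1, "years_since_inspection": 8, "occupancy_density": 1},
]

# Inspect the highest-risk buildings first.
queue = sorted(buildings, key=risk_score, reverse=True)
print([b["id"] for b in queue])  # highest scores come first
```

The point of such a score is simply to turn many inspection-relevant signals into a single sortable number, so limited inspectors can be sent where the data says the risk is greatest.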
Last year, Seattle partnered with Microsoft and Accenture in a big-data project that aims to reduce energy costs by 25 per cent. Using data from hundreds of sensors in four buildings with very different uses – the Sheraton Hotel, the Seattle Municipal Tower, some Boeing offices and a University of Washington School of Medicine building – the system will gather information on everything from tracking where energy goes to how weather conditions affect use.
The police department in Vancouver has been using analytics-led policing since 2008. Using data to identify crime hot spots and patterns, the approach led to a 24-per-cent reduction in property crime and a 9-per-cent drop in violent crime as of last year. And in Toronto, Waterfront Toronto, a government organization in charge of revitalizing the city’s waterfront, has partnered with IBM to create a community portal that promises to use big-data analytics to provide insight on everything from public safety to water conservation in the community.
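Hot-spot identification of the kind Vancouver’s police describe can be illustrated with a toy sketch: bucket incident coordinates into a coarse grid and flag cells with unusually many incidents. The coordinates, grid size and threshold here are all invented for illustration:

```python
# Hypothetical sketch of crime hot-spot detection: bucket incident
# coordinates into a coarse grid and flag cells with many incidents.
# Data, grid resolution and threshold are invented for illustration.
from collections import Counter

incidents = [(49.28, -123.12), (49.28, -123.12), (49.28, -123.13),
             (49.26, -123.10), (49.28, -123.12), (49.25, -123.09)]

CELL = 0.01  # grid resolution in degrees

def cell(lat, lon):
    """Map a coordinate to its grid cell."""
    return (round(lat / CELL), round(lon / CELL))

counts = Counter(cell(lat, lon) for lat, lon in incidents)
hot_spots = [c for c, n in counts.items() if n >= 3]
print(hot_spots)
```

Real systems also look for patterns over time, but the core step is the same: aggregate individual incidents into areas, then direct patrols to the areas that stand out.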
But it’s not as if we can simply flip a switch and instantly enjoy the benefits of big data.
The big-data revolution faces two key challenges, both concerning the collection of information.
The first, as is always the case when it comes to monitoring individuals and collecting details about their lives, is privacy. The norm is that information gathered for big-data projects is anonymized and aggregated, and collected only when a person has opted in to the program. That said, many people are still uncomfortable consenting to having their data tracked and analyzed. But that is likely to be less of an issue for those who have grown up freely sharing their information online.
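The aggregation step mentioned above can be made concrete with a small sketch: individual readings are rolled up to neighbourhood totals before release, so no single household’s record leaves the system. The data and field names below are invented for illustration:

```python
# Hypothetical sketch of aggregation before release: individual meter
# readings are rolled up to neighbourhood totals so no single
# household's usage is exposed. Data and field names are invented.
from collections import defaultdict

readings = [
    {"household": "h1", "neighbourhood": "Downtown", "kwh": 12.4},
    {"household": "h2", "neighbourhood": "Downtown", "kwh": 9.1},
    {"household": "h3", "neighbourhood": "Riverside", "kwh": 15.0},
]

totals = defaultdict(float)
for r in readings:
    totals[r["neighbourhood"]] += r["kwh"]  # the household key is dropped

print(dict(totals))  # only aggregates leave the system
```

Aggregation alone is not a complete privacy guarantee – small groups can still reveal individuals – which is why opt-in consent remains part of the norm described above.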
“Anyone who has never lived in a world without Facebook is going to have far less trouble providing their data to an aggregated source,” says Kristian Roberts, a senior manager at Nordicity, a Toronto-based consultancy that has studied how cities can use big data.