What happens when the sensor-imbued city acquires the ability to see – almost as if it had eyes? Ahead of the 2019 Shenzhen Biennale of Urbanism\Architecture (UABB), titled "Urban Interactions," ArchDaily is working with the curators of the "Eyes of the City" section at the Biennial to stimulate a discussion on how new technologies – and Artificial Intelligence in particular – might impact architecture and urban life. Here you can read the “Eyes of the City” curatorial statement by Carlo Ratti, the Politecnico di Torino and SCUT.
Since it began more than 200 years ago, the Industrial Revolution has never slowed its pace, advancing alongside revolutionary progress in science and technology. Mechanization in the 1760s, electrification in the 1870s, informatization in the 1950s, and intelligentization in the 2000s have each driven, or are still driving, the world's economic development and urbanization, transforming human society from agricultural to industrial and then post-industrial, and from rural to urban. Each revolutionary advance of science and technology has brought about remarkable transformations not only in the way the world produces and the way people live, but also in the spatial layout and functional organization of cities, sometimes positively, sometimes negatively.
This argument is easily illustrated by the development of vehicles and their impact on cities from the mid-19th century to the present day. From horse-drawn carriage to automobile, from collective bus to private car, and from manual to automatic and even driverless operation, the continuous upgrading of vehicles has greatly facilitated the mobility of people and goods by transforming how they travel. At the same time, it has remarkably reshaped the form of cities, promoting the expansion of existing cities and the growth of new ones, while leaving many of them, especially the large ones, struggling with the byproducts: worsening traffic jams, longer commuting times and distances, shortages of parking space, and widespread air pollution.
As an important embodiment of modernization, technology has always been a double-edged sword, as the development of vehicles shows. Today, in the era of intelligentization, the remarkable advancement of information and digital technology confronts human society with this dilemma once again. Artificial Intelligence and the Internet of Things are no longer theoretical ideas on paper but practical creations in reality, and they undeniably help to improve the efficiency of production and the quality of life. Intelligent robots increasingly perform the arduous jobs that once relied heavily on human labor, such as coal mining and steel making, as well as the delicate jobs that were once impossible for any human hand, such as minimally invasive surgery and operations under extreme conditions. AlphaGo defeated the world champion, demonstrating the deep-learning capability of an intelligent machine. Driverless cars are expected to be an effective way to meet the challenge of an aging society by guaranteeing the mobility of the elderly.
In the construction of the physical environment, sensors of various kinds are deployed in both interior and exterior spaces to observe and monitor users' movements and activities, facilitating the management of public security, building performance, and city operations, and serving as the basis for intelligent buildings and intelligent cities. The data collected by these sensors are analyzed to control lighting and air-conditioning indoors, for both energy savings and personal comfort, and to monitor traffic flows and pedestrian movements outdoors, for both functional efficiency and social equity. In a sense, these sensors function as the "eyes" and "ears" of a building or a city, guaranteeing its safety, efficiency, and convenience far better than before.
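To make the idea concrete, a minimal sketch of the kind of rule-based control loop described above might look like the following. The sensor names, thresholds, and in-memory data structures are purely illustrative assumptions, not any real building-management interface.

```python
# A minimal, hypothetical sketch of sensor-driven control for one room.
# Sensor names, thresholds, and the in-memory dictionaries are
# illustrative assumptions, not a particular building-management API.

sensors = {"occupancy_count": 3, "ambient_light_lux": 120.0, "air_temperature_c": 27.5}
actuators = {}

def control_step(readings: dict, commands: dict) -> None:
    occupancy = readings["occupancy_count"]
    lux = readings["ambient_light_lux"]

    # Dim the lights when the room is empty or daylight is already sufficient.
    if occupancy == 0 or lux > 500:
        commands["lighting_level"] = 0.0
    else:
        commands["lighting_level"] = 0.8

    # Relax the cooling setpoint in unoccupied rooms to save energy.
    commands["cooling_setpoint_c"] = 26.0 if occupancy == 0 else 24.0

control_step(sensors, actuators)
print(actuators)  # {'lighting_level': 0.8, 'cooling_setpoint_c': 24.0}
```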
However, it should be noted that the application of intelligent technologies also brings unexpected consequences, in particular intrusions on privacy during data collection and data annotation, which have been frequently reported by the media in recent years. Take Amazon as an example. A report by Niraj Chokshi in The New York Times on May 5, 2018, showed that an Amazon Echo device, running the Alexa voice assistant, recorded a woman's conversation with her husband in Portland and sent it to one of her husband's employees in Seattle. An in-depth investigation by Matt Day, Giles Turner, and Natalia Drozdiak, published by Bloomberg on April 10, 2019, disclosed that thousands of people at Amazon listen to recordings of voice requests sent to Alexa in order to check them for errors (https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio). According to Nick Statt's article published in The Verge on April 10, 2019, this approach is known as "supervised learning" or "semi-supervised learning," which is believed to be "one of the only, and often the best, ways" to improve the Alexa service. Although Amazon discloses some of these practices in its product and service terms, such as having human beings listen to recordings of your voice requests, it has often downplayed the privacy implications of placing cameras and microphones in millions of homes around the world. It even uses automated systems to monitor and supervise its warehouse workers, automatically firing those who fail to meet productivity quotas, according to Colin Lecher's report in The Verge on April 25, 2019.
Intelligent devices like the Echo are always listening and always online, and may share private conversations without consent. Nor is this an isolated incident at Amazon; it is a widespread phenomenon among high-tech companies around the world that invest heavily in AI, as a personal experience at a high-school classmate reunion in Beijing in the summer of 2018 suggests. While we met and chatted in a private room in a restaurant, one of us mentioned a TV program she thought might interest all of us. After returning home, two of the eight of us reported that they had each received a recommendation for that very program from different apps on smartphones of different brands. In other words, without our knowledge, our smartphones had listened to our conversation and generated the recommendation automatically.
All these cases imply that, wherever clear information and strict regulation are absent, there is room for misuse, or even abuse, of the data collected by sensors. At least for now, the supervised learning of intelligent machines requires human eyes and ears, even though the future direction may be semi-supervised, weakly supervised, and ultimately unsupervised learning. Sensors serving as the "eyes" and "ears" of a city are not the "eyes on the street" praised by Jane Jacobs, which directly convey the emotions and judgements of the people doing the observing. Though sensors are more capable of observing through data collection, they are less capable of judging through data analysis, because they have no values of their own; all their values of judgement come from the hidden "eyes" listening or watching behind them.
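For readers unfamiliar with the terminology used in these reports, the distinction can be sketched in a few lines of code. The data below is randomly generated stand-in material, with the labels y playing the role of annotations produced by human reviewers; it is an illustrative assumption, not a description of any system discussed here.

```python
# Illustrative contrast between supervised and unsupervised learning.
# In practice X would be sensor recordings and y would be labels produced
# by human annotators, i.e. the "hidden eyes and ears" behind the sensors.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))       # 100 observations, 4 features
y = (X[:, 0] > 0).astype(int)       # stand-in for human-assigned labels

# Supervised learning: the model cannot be trained without the labels y.
clf = LogisticRegression().fit(X, y)

# Unsupervised learning: the model sees only X and finds structure itself.
clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)
```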
We may imagine that one day, when a city is full of sensors that give it the ability to watch and hear, data could be collected and analyzed as extensively as possible to make the city run more efficiently. Public space would be better managed to prevent offenses and crime, traffic flows better monitored to avoid jams and accidents, public services more evenly distributed to achieve spatial equity, land use more reasonably zoned or rezoned to maximize land value, and so on. The city would function as a giant machine of high efficiency and rationality, treating everyone and everything in it as a component of that machine, under the supervision, and in line with the values, of the hidden "eyes and ears." But the city is not a machine. It is an organism composed, first of all, of numerous people who differ from one another, and then of the physical environment they collectively create and shape. Before the city full of sensors arrives, we need to work out a complete set of regulations on the use of sensors and the data they collect, in order to address the issues of privacy and diversity.
About the Author
Jian LIU received her Bachelor's degree in Architecture and her Master's and Doctoral degrees in Urban Planning & Design from Tsinghua University. She is a Registered City Planner in China, a Tenured Associate Professor at the Tsinghua University School of Architecture, and Managing Chief Editor of China City Planning Review. She has been a visiting scholar at the UBC Centre for Human Settlements and l'Observatoire de l'Architecture de la Chine Contemporaine, and a Fulbright Visiting Scholar at the Harvard University Graduate School of Design. Her research focuses on urban and rural planning, urban regeneration, planning institutions, and international comparison. She has published both domestically and internationally and is active in national and international academic circles.
"Urban Interactions": Bi-City Biennale of Urbanism\Architecture (Shenzhen) - 8th edition. Shenzhen, China
Opening in December, 2019 in Shenzhen, China, "Urban Interactions" is the 8th edition of the Bi-City Biennale of Urbanism\Architecture (UABB). The exhibition consists of two sections, namely “Eyes of the City” and “Ascending City”, which will explore the evolving relationship between urban space and technological innovation from different perspectives. The “Eyes of the City" section features MIT professor and architect Carlo Ratti as Chief Curator and Politecnico di Torino-South China University of Technology as Academic Curator. The "Ascending City" section features Chinese academician Meng Jianmin and Italian art critic Fabio Cavallucci as Chief Curators.
"Eyes of The City" section
Chief Curator: Carlo Ratti.
Academic Curator: South China-Torino Lab (Politecnico di Torino - Michele Bonino; South China University of Technology - Sun Yimin)
Executive Curators: Daniele Belleri [CRA], Edoardo Bruno, Xu Haohao
Curator of the GBA Academy: Politecnico di Milano (Adalberto Del Bo)
"Ascending City" section
Chief Curators: Meng Jianmin, Fabio Cavallucci
Co-Curator: Science and Human Imagination Center of Southern University of Science and Technology (Wu Yan)
Executive Curators: Chen Qiufan, Manuela Lietti, Wang Kuan, Zhang Li