Which is more male: a stadium or a nursery? Hannah Rozenberg, a recent graduate of the Royal College of Art, says that it’s the former—and she has an algorithm to prove it.
Rozenberg’s thesis project, “Building without Bias: An architectural language for the post-binary,” hinges on the notion that architecture can, by design, be gendered. To illustrate this point, she references St. James’s, an exclusive London neighborhood that houses dozens of gentlemen’s social clubs. “Women are either not even allowed in the clubs or have to follow different rules,” Rozenberg told ArchDaily. “One of the gentlemen’s clubs for example, is Boodle’s. Women have to enter through the back entrance there.”
You’d be forgiven for assuming that, since the founding of the St. James’s gentlemen’s clubs in the 18th and 19th centuries, architecture has become less overtly gendered. Rozenberg argues the opposite. In her research, she explains that even as technology becomes ever more central to the way architecture is designed and built, gender-biased architecture will persist. Why? Because gender bias is built into the technology we use every day.
Take Google Translate, a program that Rozenberg says can reveal the gender biases present in technology. Translate “she is a leader” from English to Estonian and back to English, for example, and the program will swap the pronoun to read “he is a leader.” (Estonian’s third-person pronoun, “ta,” is gender-neutral, so the round trip forces the algorithm to reassign a gender based on the word associations in its training data.) The same happens with “he is an assistant”; Google Translate will turn this into “she is an assistant.” “Because architecture is my medium,” Rozenberg said, “I decided to use architecture as a way to highlight the issue and try to revise it.”
With this goal in mind, Rozenberg phoned a coder friend. Together, they developed a website that uses an algorithm similar to Google Translate’s to measure how strongly a word is associated with each gender. The program quantifies these linguistic associations into “gender units” (GU), with positive values indicating female-associated words and negative values male-associated ones. As Rozenberg describes it, by analyzing text sourced from Google News articles, “the machine learns that a man is to a king what a woman is to a queen.” And when applied to architecture and design terms, it learns that what concrete, steel, and wood are to men, lace, glass, and bedroom are to women.
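The mechanics behind a score like this are typically word embeddings: each word becomes a vector, and a word’s “gender” is its projection onto the axis running from “he” to “she.” Here is a minimal sketch of that idea, with tiny toy vectors invented purely for illustration; Rozenberg’s actual tool, its vectors, and its GU values are not published in this article.

```python
import math

# Toy 3-dimensional word vectors, invented for illustration only.
# A real system would use embeddings trained on a large corpus,
# such as Google News articles.
VECTORS = {
    "he":     ( 1.0, 0.2, 0.1),
    "she":    (-1.0, 0.2, 0.1),
    "steel":  ( 0.8, 0.5, 0.0),
    "lace":   (-0.7, 0.4, 0.2),
    "window": ( 0.0, 0.6, 0.3),
}

def _unit(v):
    """Scale a vector to length 1."""
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def gender_units(word):
    """Project a word onto the he->she axis: negative values lean male,
    positive values lean female, near zero reads as neutral
    (mirroring the article's GU scale)."""
    axis = _unit(tuple(s - h for s, h in zip(VECTORS["she"], VECTORS["he"])))
    w = _unit(VECTORS[word])
    return sum(a * b for a, b in zip(axis, w))

for word in ("steel", "lace", "window"):
    print(f"{word:>6}: {gender_units(word):+.2f}")
```

With these made-up vectors, “steel” scores negative (male-leaning), “lace” positive (female-leaning), and “window” lands at zero, which is the kind of association the GU scale is meant to surface.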
To interrupt these linguistic—and in turn, architectural—binaries, Rozenberg designed a series of spaces in and around St. James’s gentlemen’s clubs that disrupt the hyper-maleness of the area. She measured the architectural elements of these spaces on the GU scale and made sure the final tally equaled zero. In one of these spaces, “a bench, a canopy, a ladder, a wall, two windows, a door, a balustrade and a ramp equal zero,” says Rozenberg, “so if a machine were to read that image and label it, then all those features together would equal zero,” meaning the space registers as neither male nor female. Rozenberg is careful, though, to emphasize that because she rendered these spaces herself, they’re not entirely free of bias. “I’m not claiming that they’re gender neutral in their aesthetics,” she says. “Something that might seem very masculine to some might be very female to others.” Rather, the images are meant to be readable as gender neutral by a machine, like IBM Watson’s image recognition software. Rozenberg describes this process as a way of retraining the machine—which we perceive to be neutral—to actually be so in practice.
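The balancing act itself is simple arithmetic: each element’s GU score is summed, and the design is adjusted until the total hits zero. A toy check of that constraint, using made-up GU scores for the elements Rozenberg lists (her actual values are not given in the article):

```python
# Hypothetical GU scores: negative = male-leaning, positive = female-leaning.
# These numbers are invented for illustration, not taken from Rozenberg's tool.
elements = {
    "bench": -1.2, "canopy": 0.5, "ladder": -0.8, "wall": -0.6,
    "window_1": 0.7, "window_2": 0.7, "door": -0.3, "balustrade": 0.4,
    "ramp": 0.6,
}

total = sum(elements.values())
print(f"total GU: {total:+.1f}")  # a balanced design targets a total of 0 GU
```

If the total drifts away from zero, a designer working this way would swap or reshape elements until the male- and female-associated scores cancel out.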
Likewise, the spaces Rozenberg designed are meant to be sites for retraining language. As she puts it, “Biased language leads to biased technology, which in turn results in a biased environment.” To get at problems like gender bias in architectural design, she says, “the first thing we need to change is language.” Renderings show a library, a theatre, a cinema, a series of benches, and a newspaper office. All are meant to represent ways people communicate with one another. The idea of these theoretical spaces, says Rozenberg, “is that people come to the space and re-think the way in which they use language and the biases that are embedded in it.”
Rozenberg’s methodology is complex, but her message, ultimately, is simple: by correcting the machine as it begins to play a larger part in designing the buildings of the future—and by self-correcting our own gender biases—we might design spaces that work better for all.