This article was originally published on Common Edge.
The rise of generative AI has given every design educator sufficient reason to reconsider both what to teach and how to teach it. Training an architect is a long process, and mapping it onto an uncertain future is a daunting task. Researchers at OpenAI, DeepMind, Meta, and similar companies seem constantly surprised by the rapid development and sometimes unforeseen capabilities of their AI creations. If even the creators don’t know how fast the future will arrive, it would be hubristic for any of us to claim that AI will do X or AI won’t be able to do Y in the next decade, which is about how long it takes to really train an architect.
The conversation about what and how to teach is already contentious, and it must necessarily evolve with technology. Parts of it will remain unresolved until the impact of these new technologies is more clearly understood. However, there’s another, easier conversation to have: what not to teach.
This conversation has also been historically contested and is inseparable from the discussion of what to teach. But in my own teaching and conversations with colleagues, there seems to be a consensus among design faculty that certain things should no longer be taught in architecture school. These anachronisms remain fixtures in most schools due to institutional and cultural inertia, and perhaps because schools are still able to produce great architects despite them.
However, AI will change that calculus. It gives us new arguments for purging some of the more ossified practices of design culture. What was a frustrating anachronism yesterday may become a liability tomorrow, compromising our ability to train young architects and their ability to continue the profession. With AI, we finally have the means and the motive to get rid of three things traditionally endemic to the educational process.
Masochism
If generative AI speeds up the process of visualizing and producing designs by a factor of 10, it would be a great tragedy to allow students to use all that increased productivity to indulge their instincts towards all-nighters and self-neglect. Despite the efforts of many design educators to curb these instincts, the culture of self-neglect and exhaustion in design school has proven to be a persistent and difficult problem to solve.
Part of the problem is certainly that there are still valuable lessons to be learned in the more masochistic parts of studio culture. Architecture school taught me to keep iterating, taught me to “kill my darlings,” taught me that another, better solution to any problem might be around the corner, if I could remain open to it. Once you embrace that feverish commitment to improving on your own work, all-nighters seem like a logical expression of that commitment. That was the justification, but it wasn’t the reason. The reason was that testing, proving, and demonstrating an idea takes a lot longer than having the idea in the first place. The human brain can have an inspiration in less than a second. But to test, prove, and demonstrate that idea requires execution in the form of drawings and models—lots of them. So if you care about your ideas, you’d better start a pot of coffee.
This may seem reasonable—at least to anyone who’s been to architecture school—as long as you ignore the downstream effects. As you stay up for several nights in a row to test and prove that one brilliant idea, your creative faculties steadily decline, compromising what might have been that second or third brilliant idea.
The importance of sleep for creativity cannot be overstated. Research consistently shows that a well-rested brain is better able to generate novel ideas, solve complex problems, and think critically. In a competition with machines, sleep deprivation will constitute a tactical disadvantage. If “creativity” is to be the inner keep of the architecture castle, then we must defend it at all costs by defending sleep and retiring the all-nighter.
But how will we get work done?! screams every architecture student everywhere. In an AI future, the day of an architecture student might look a lot like the day of a contemporary writer. For a creative writer, inspiration and production often flow as a single act. Have a thought, type it out, repeat; revise later. Most writers stick to a disciplined schedule designed to maximize creativity, acknowledging that the human brain can only be so creative for so long and needs inputs like sleep, exercise, and food. Haruki Murakami rises at 4:00 a.m. and works for only five or six hours a day. In the afternoons he runs, or reads, or listens to music. Maya Angelou had a similar practice, writing only from 7:00 a.m. to 2:00 p.m., and was so committed to her own focus that she would find a hotel and rent a “tiny, mean room with just a bed, and sometimes, if I can find it, a face basin.” Once she left her desk every day at 2:00, she lived a normal life of running errands, having dinner with her husband, and getting a good night’s sleep. No one could argue that Murakami and Angelou are uncreative or unsuccessful people. And a great novel has no less complex a structure than a great building.
Imagine if an architecture student worked only six hours a day, but dedicated that entire span to pure creation while machines took over the production.
As AI rapidly takes over the rote, mechanical aspects of design, humans must focus their efforts on the things that only a human can do. If you believe that creativity is one of them, let us optimize for it by breaking this ugly tradition.
So tell your students to leave the studio at a reasonable hour and go home. Insist on it. Insist that they do their designs, do their best, and then go home, or go out. Advise them to meet other people their own age, preferably in disciplines other than architecture. Require them to get a hobby, or join a club or sports team. (Even an a cappella group, if they absolutely must.)
Tell them what you already know: Life is what architecture is made of. Love, loss, risk, failure—these are the engines that power any real creativity. And sleep is the motor oil that keeps them working. So get some sleep, and let the machines take over the all-nighter.
Fetishization of the Image
For most of its history, architecture was primarily a spatial experience, appreciated through the inhabitation of its spaces. However, with the advent of mass media, architecture began to drift toward an image-based culture, more so than almost any other professional discipline. This shift can be attributed to the way mass media fragmented different types of professional success: commercial success (making money), professional success (being esteemed by one’s peers), and cultural success (being esteemed by the wider culture).
In most professional disciplines, these three types of success typically follow a sequential path. However, architecture has an alternative route, which I’ll call Path B. This route subverts the conventional sequence and, as far as I can tell, is unique to architecture. Through Path B, an architect can achieve cultural success by earning the esteem of their peers, even if they have limited commercial success or built projects.
With sufficient professional and cultural success, one can then achieve commercial success, because clients will line up to hire the famous architect whom all the other architects admire. (It’s always interesting that some architects can win the Pritzker Prize—a prize that is ostensibly awarded to “architects whose built work demonstrates a combination of those qualities of talent, vision and commitment which has produced consistent and significant contributions to humanity and the built environment”—with a very shallow portfolio of built works, principally on the strength of their publications and theoretical works. Equally interesting is how those same architects then go on to develop an expansive practice brimming with large and expensive built projects.)
To be sure, architects can go the conventional route, and often do. But architects also have this Path B that’s hard to imagine in any other professional discipline. We wouldn’t know who Warren Buffett was if his initial forays into investing had lost all his early clients their money, and we wouldn’t know who Johnnie Cochran was if all of his early clients had gone to jail.
The existence of Path B in architecture enables and encourages a fetishization of image-making. If your objective is commercial success first, you’ll focus on the things that clients care about, like meeting budgets and building buildings. If your objective is professional success first, you’ll focus on the things that other architects care about, like novel ideas, forms, materials, and other forms of experimentation. Anything novel is much more easily made, and conveyed, in images than in wood, brick, concrete, metal, and glass. So architects have come to accept the image of a thing as adequate proof of concept of the thing itself. One can achieve renown through the production and dissemination of images of all the novel buildings one could imagine without ever taking on the burden of their execution. Then, when one becomes famous and in demand, one can take on the task of realizing these ideas in built form.
The rise of AI in architecture fundamentally challenges the viability of pursuing Path B. With AI-powered tools capable of generating stunning, novel renderings based on text prompts, the mere production of impressive architectural images no longer signifies the same level of creativity and innovation that it once did. As a result, achieving early acclaim primarily through image-making will become increasingly difficult. How could it not be? We won’t grant our collective admiration to architects who produce work that can be easily done by a teenager with a ChatGPT subscription.
In an AI-driven world, the primacy of the architectural image must be re-evaluated.
As image production becomes easier, the true test of an architect’s creativity and skill will likely shift toward her or his ability to navigate the complex realities of bringing designs to life and the changing realities of practice, rather than simply producing visually striking representations of potential projects. We can explore this transition by first entertaining the idea that the next great piece of “architecture” may not look like a rendering, or a plan, or an elevation.
It may come in the form of a piece of code that governs a design process, ensuring that a building’s occupants are happier, more productive, and more fulfilled. That piece of code should be deserving of all the praise and merit that we once heaped on the most exacting model. It may come in the form of a business plan for a new kind of architectural firm organization that allows human architects at that firm to leverage AI more fully and, ideally, produce more, better work. It may have forms that we can’t yet imagine.
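It is worth pausing on what “a piece of code that governs a design process” might actually look like. Below is a deliberately minimal sketch, in Python, of one possible form: a scoring rule that ranks candidate floor plans by proxies for occupant well-being. Everything in it (the metrics, the weights, the candidate schemes) is invented for illustration; a real tool would derive such numbers from daylight, circulation, and acoustic simulation.

```python
# A hypothetical sketch, not a real tool: a design-process rule that ranks
# candidate floor plans by crude proxies for occupant well-being. All
# metrics, weights, and schemes below are invented for illustration.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    daylight_hours: float    # avg. daily daylight per workstation (assumed metric)
    walk_to_exit_m: float    # mean walking distance to an exit, in meters
    quiet_area_ratio: float  # share of floor area shielded from noise

def wellbeing_score(c: Candidate) -> float:
    """Higher is better: reward daylight and quiet, penalize long walks."""
    return (2.0 * c.daylight_hours
            + 5.0 * c.quiet_area_ratio
            - 0.05 * c.walk_to_exit_m)

candidates = [
    Candidate("scheme_a", daylight_hours=4.5, walk_to_exit_m=40, quiet_area_ratio=0.30),
    Candidate("scheme_b", daylight_hours=6.0, walk_to_exit_m=65, quiet_area_ratio=0.20),
    Candidate("scheme_c", daylight_hours=5.2, walk_to_exit_m=35, quiet_area_ratio=0.35),
]

# The "governed process" here is simply: generate candidates, keep the best.
best = max(candidates, key=wellbeing_score)
print(f"{best.name}: {wellbeing_score(best):.2f}")
```

The point is not this particular formula, which is crude on purpose, but that the artifact worth critiquing is the rule itself: change the weights, and you have changed the building.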
AI has made making images of our ideas simpler. Let’s adapt to this change by ending our collective fetishization of image-making and focusing instead on the higher-order problems of the future.
Design School as Rite of Passage
Higher education, particularly professional education, has long been viewed as a rite of passage. Students undergo a shared experience with their peers, transitioning from adolescence into professional adulthood and joining a specific professional tribe. However, this notion is relatively recent in the history of education.
In the early days of American universities, students were educated under the old English model, which focused on a broad range of subjects such as arithmetic, geometry, grammar, logic, rhetoric, history, and moral philosophy. It wasn’t designed to train students to do anything in particular, aside from being the landed gentry they already were.
Beginning in the mid-18th century, the first specializations to arise were in medicine and law, as they were understood to be the most technically demanding. Architecture schools wouldn’t follow until a century later, which makes sense. Eighteenth-century architects were busy erecting castles and cathedrals and inventing the modern city, while 18th-century doctors were still treating diarrhea with opium and debating bloodletting. The medical students clearly needed a 100-year head start.
It wasn’t really until the late 19th century that American institutions, influenced by German models, began to adopt the graduate and research models we know today, along with the concept of electives and majors. In contrast to the old English model, this new system was designed to train individuals for a lifetime of continuous work in a specific field, beginning at the age of 18.
As our collective body of human knowledge expanded, so did the degree of specialization required for work. One’s choice of possible majors expanded in lockstep, to the point where one can now major in just about anything, including puppet arts (University of Connecticut), comedy (Humber College), and bagpiping (Carnegie Mellon University). Going through a major program does more than just teach you the technical skills necessary to execute a specific type of work: it inducts you into a tribe. You undergo shared experiences that then bond you with others in your eventual profession.
However, with the rapid advancements in AI, this model may no longer be suitable for the future. As AI accelerates the pace of technological change, it becomes increasingly challenging to predict the skills and capacities that will be necessary for professional work in the coming decades. The idea of fully training someone for a profession by their early 20s, with the expectation that this education will suffice for a lifetime, seems anachronistic in a world where entire professions may become obsolete within a matter of years.
To adapt to this new reality, we must restructure learning to be a continuous, lifelong process rather than a one-time experience. And I don’t mean CEUs, where you pick up credits on new flashing products in order to satisfy the learning requirements on your license. I mean real learning—the kind that opens new worlds, rather than just shedding more light on the one we’re already in.
Remember that time in college when you attended a 90-minute lecture by Professor So-and-So and it lit your hair on fire, and you couldn’t wait to go to office hours so you could ask them a million more questions about it? Remember that time you closed the last page of a book that forever changed the way you saw the world? Remember when you learned that new thing and had to make new friends because none of your friends were interested in that thing, and you just had to talk about it? Well, get ready to get excited again, because AI has brought us into a new era of discovery. And there will be a lot for us all to learn.
This new paradigm could take many forms, such as having professionals return to school for a semester every few years. We could abolish the four-year program and structure design education in a staggered fashion, so that what one learns in the classroom runs in ongoing parallel with what one is learning in practice: a stretch in school, a stretch in an office, then back to school, and so on throughout a career.
Or we could restructure the four-year college experience so that it more closely resembles the 18th-century model: a smattering of geopolitics, international relations, psychology, non-architectural history, rhetoric, etc., all designed to give future architects tools to navigate the world they are designing for, rather than the world of design. The key is to recognize that the current specialization model, which was developed in an era when America still ran on steam power, may no longer be adequate for a future characterized by rapid technological change. In the field of architecture, this means re-evaluating the notion of design school as a singular rite of passage. In the future, design school won’t be something you went through. It will be a process that lasts a lifetime.