
Angela —
We were just talking about current shifts in thinking about data and about how these paradigm shifts are affecting how we approach big societal issues. How do you feel your work fits into that conversation about data, society, and openness?
Theodora —
My current work is strongly rooted in historical scholarship. I'm looking at early encounters between design, mathematics, and computing in the sixties and at the contexts in which architects started reimagining their discipline as having a lot to do with data and computation. From the outset, architectural computing came with a language of urgency and hoped to tackle large societal issues. The stakes were always high. There was talk about the immense “complexity” of design “problems” and the insufficient methods that architects had at their disposal, the notorious “tricks of the trade.”
So, there was a feeling of urgency not unlike the one that motivates work on data and computing today, though the concerns were different. In the early sixties, some key topics of concern in architecture were establishing standards for buildings, mass industrialization, and making buildings more efficient and people’s work more productive. These concerns may sound like they were catering to capital (which they often did), but in many cases these questions had to do with economizing on public money or finding ways to make buildings more responsive to what people actually do, as opposed to what architects imagine them doing. A lot of the early work in architectural computing also had to do with the sudden availability of information, of data from large-scale observational studies. As Christopher Alexander famously said, the architect loses their “innocence” when they become aware of all the information about buildings; when they know all these things, they know they can’t and shouldn’t design the same way anymore. They have to develop methods that take the information into account.

Mathematician Claude Shannon in front of a mainframe computer, MIT Computer Laboratory. Copyright: MIT Museum
When I read writings from that time, some statements sound remarkably familiar. But I’m not making an argument for reductive sameness. Obviously, we are not in the sixties, and things are not the same in contemporary work with data and computing, though many discursive tropes recur. I think it’s important to remember that a lot of the intellectual and technical infrastructures for what is happening today were set out in those early experiments, that computing has a history that is very much relevant today. Historical work is important, especially in situations where the new technologies seem to be unprecedented. I am thinking, for instance, about the so-called “generative AI revolution” taking architects by storm today and the sense of peril and fear surrounding these machine-learning black boxes. We need stories and histories to make sense of what is happening. I have always found history to be a productive counterpoint to whatever is happening in the present. I am not talking about operationalizing the past but about seeking out differences and continuities.
Angela —
Do you see this fascination with data now as just another iteration of the old conversations around computation, or is this something new?
Theodora —
I don't think that the fascination with data is new. I think we're dealing with different amounts of data, and that makes a big difference. We're also dealing with different regimes of who owns it, who produces it, and how the data is captured. I think there are significant shifts in the sense that data is now constantly being produced. It's constantly being culled by corporations. In the sixties, data collection was a process that required effort; it was much more obvious that it was happening. Now, just existing, walking around your house, being in the world, produces data.
Angela —
The data collection of the sixties that you're describing resonates with how social scientists use and collect data, but perhaps now it's become much more technological. Like you said, the data's already on the phone, so we don't need to create a whole collection process anymore.
Theodora —
Yes, in the historical contexts that I'm talking about there were usually actual people collecting the data. One example, which comes from a work I’ve published with my McGill colleague David Theodore, involved people carrying a drawing of a hospital floor plan and following hospital staff around, placing pins whenever they stopped to do something and connecting the pins with thread when they moved. That physical tracking was then transcribed into data that was used to compute optimal hospital floor plans. The data collection was very much a visible process. Of course, again, there were many aspects that were problematic: decisions about who to follow and how to follow, ways of valuing the work of different staff members, etc. But it was evident that data collection was happening, so the tracking was a project in itself, whereas now tracking is a naturalized part of everyday life.
Angela —
Bringing it back also to the idea of empowerment, how can these massive amounts of data impact the built environment? How can they benefit the spaces that we live in and what architects are doing?
Theodora —
Empowerment was a very important issue for me because I came to studying design computation with a very optimistic vision. I liked ideas brought up by radical architects and thinkers in the sixties: dazzling participatory utopias, thinking of architecture in terms of informational flows, and this idea of the individual being an agent in the system. I initially approached these projects with a techno-nostalgic view. They had these incredible, bold visions that I thought would be worth revisiting in the present.

Yona Friedman’s “Ville Spatiale” (1959): A visionary framework for self-planned urbanism, suspended above the landscape. Drawing © Yona Friedman Foundation.
I'm much more skeptical now. In many ways they were not bold enough. Often, the techniques that powered these emancipatory visions were drawn from technocratic contexts. One very clear example is Yona Friedman's work on methods of self-planning in the spatial city, which repurposed techniques from scientific management (things like the work on hospitals I mentioned above). I find this discursive repositioning of computational techniques fascinating: taking something that was made for optimization, for making more efficient buildings, and having the imagination to reposition it as a medium for empowerment. Yet these exciting transpositions are also perilous.
Techniques carry with them assumptions from the contexts of their inception, especially when they are adopted without questioning these assumptions. For example, they come with commitments about what kind of data is relevant and what kind of processing is relevant or possible. These then constrict the way one thinks about things like participation, environment, agency, etc. I mean, really, do you need to be constantly reconfiguring a building to say that you have agency? Isn’t use already full of agency? I'm still very much committed to questions of openness, but I don't think that data-driven approaches and computational models are the main avenue for achieving them. Maybe we need hybrids between these approaches and perhaps alternative understandings of what it means to dwell, or what it means to be an agent in one's space. These understandings might not be amenable or reducible to algorithmic or computational descriptions, and we should be ok with that.

Wireframe perspective for Naiju Community Center and Nursery School, Fukuoka, Japan, 1993. Shoei Yoh fonds, Collection Centre Canadien d'Architecture/Canadian Centre for Architecture, Montréal; Gift of Shoei Yoh. © Shoei Yoh
I think there is often an issue when the link between algorithms, data, openness, and empowerment becomes too strong. As if the one needs to be used to produce the other, or as if the one is a pre-condition of the other. One of the issues with some of the early historical projects in the sixties is that they produced a very specific protocol about what it means to be an agent in one's environment. I've referred to this as the “infrastructure model,” where the architect produces a relatively open system that allows for many (often too many to count) possibilities. The idea is that users or inhabitants then enact some combinatorics on the system of possibilities. We see that in John Habraken’s work, or in Yona Friedman's work, even in Christopher Alexander's work: a stable (infra)structure that delimits choice and possibility. And then what people do is navigate these precomputed alternatives. These architects’ claim was that this predefined set of choices was all the choice you could ever have, for mathematical and theoretical reasons. That's why I became really interested in the technical ideas that undergirded models of openness. This idea that you have a fixed set of states and that it then becomes a question of the ways in which you combine, recombine, and navigate those possibilities is, of course, a profoundly structuralist understanding of openness. But today, maybe because of the limitations of digital computing, I don't think we've moved very far away from that as a protocol. I do think that the exciting possibility is to move from the realm of imagining what those systems might be to actually implementing a small part of them. I think that the implementation allows for unpredictable events that might be outside of the system.
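To make the “infrastructure model” concrete, here is a minimal, hypothetical sketch in Python of how such a protocol works. The slot names and options are invented for illustration, not taken from Habraken, Friedman, or Alexander: the architect fixes the slots and the catalogue of options in advance, and the inhabitant's “choice” reduces to navigating the precomputed combinations.

```python
# A hypothetical toy illustration of the "infrastructure model": the architect
# fixes the slots and the catalogue of options, and "participation" reduces to
# navigating the precomputed combinations. Slot names and options are invented.

from itertools import product

# The fixed (infra)structure: each slot admits only the options listed here.
support = {
    "facade_bay": ["window", "balcony", "blank panel"],
    "partition":  ["open plan", "two rooms", "three rooms"],
    "services":   ["kitchen north", "kitchen south"],
}

# Every possibility the system allows is already implicit in the catalogue.
slots, options = zip(*support.items())
alternatives = [dict(zip(slots, combo)) for combo in product(*options)]

print(len(alternatives))   # 3 * 3 * 2 = 18 precomputed alternatives
print(alternatives[0])     # inhabitants "choose" among these, and nothing more
```

The point of the sketch is structural: nothing an inhabitant does can produce a state that is not already enumerable from the catalogue, which is exactly the closure the interview is questioning.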
Angela —
Technology is improving so fast now, opening the door to new ways of actualizing the idea of openness. How does this perspective reflect recent discourses around social justice and divergent worldviews? I mean, we are getting to the point where technology might enable us to reinvent the system that everything is based on. Or what about an ecological system? Would you say that an ecological model that acknowledges complexity and evolving, dynamic perspectives could embody this new way of thinking about openness?
Theodora —
I think the hope for more holistic systems has been around for a long time. There's a hope that a whole-systems approach would allow for different kinds of engagement and agency. But I still get hung up on the “system” idea and the “complexity” idea, because these are already techno-scientific constructs that carry specific ways of modelling and imagining the world as pre-structured. It’s a very rich tradition, and it has afforded a lot of new visions, but, to go back to what you were saying about knowledge cultures that might be incommensurate with the techno-scientific paradigm, one wonders: what cannot be modelled? How do you model openness that caters to the things that either can't or shouldn’t be modelled? I even wonder if talking about complexity is already part of the problem, because to talk about complexity is to imagine the world as a decipherable system. If only we found the right parts, or if we found the right relationships, then the world would be a mappable thing. And even if it becomes more intricate, it remains a network of sorts. That is a very specific way of thinking about things.
I find it interesting to think about the limits of systems and structures to describe things like dynamic phenomena, temporality, change, history, and time. These are important cultural factors that cannot be written out or reduced to a point, a set, or a construct. I think that by naturalizing some of those terms, we already commit ourselves to a specific historical lineage of thinking about openness and agency and wholeness.
Angela —
Have you thought about new terms that we can use to describe this alternative way of thinking about openness, or is your work more of a critique?
Theodora —
I don't have a full answer yet. I'm interested in process-oriented rather than structure-oriented views, and in the transformative and contingent aspects of processes that are open-ended and unpredictable. These approaches have been influenced by the work of George Stiny and Terry Knight at MIT, with whom I worked during my PhD. Their work on shape grammars places the emphasis on transformation rules and acts rather than on a fixed model that processes data. The main idea is that you don't know what something is before you do something with it, so you can’t commit to the ontology of what it is that you're acting upon, or to what might come out of your doing, your action. But you can commit to a set of acts, or rules, knowing that their applications and their outcomes will be contingent and unpredictable. The structure only emerges after one has acted, not before.
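As a rough, hypothetical illustration of this rule-based view, the following Python sketch applies a rewrite rule to a shape represented as a set of line segments. It is a drastic simplification, not the Stiny–Gips formalism itself: real shape grammars match subshapes under full Euclidean transformations and allow emergent subshapes, whereas this sketch matches under translation only, and all helper names and the example rule are invented.

```python
# Simplified, hypothetical sketch of rule application on shapes.
# A shape is a set of line segments on an integer grid; a rule (lhs -> rhs)
# applies wherever a translated copy of lhs is found inside the shape.

from itertools import product

def seg(p, q):
    """Canonical line segment between two grid points."""
    return tuple(sorted((tuple(p), tuple(q))))

def translate(shape, dx, dy):
    """Translate every segment of a shape by (dx, dy)."""
    return frozenset(
        seg((x1 + dx, y1 + dy), (x2 + dx, y2 + dy))
        for ((x1, y1), (x2, y2)) in shape
    )

def apply_rule(shape, lhs, rhs, search_range=range(-5, 6)):
    """Find a translation that embeds lhs in shape; replace it with rhs.

    Returns the transformed shape, or None if the rule does not apply.
    The outcome depends on where the match happens and on what is already
    there: the structure emerges from the act of applying the rule.
    """
    for dx, dy in product(search_range, repeat=2):
        placed_lhs = translate(lhs, dx, dy)
        if placed_lhs <= shape:                      # lhs found as a subshape
            return (shape - placed_lhs) | translate(rhs, dx, dy)
    return None

# Example rule: a unit square gains a diagonal (rhs = square + diagonal).
square = frozenset({seg((0, 0), (1, 0)), seg((1, 0), (1, 1)),
                    seg((1, 1), (0, 1)), seg((0, 1), (0, 0))})
lhs = square
rhs = square | frozenset({seg((0, 0), (1, 1))})

# Start from a square placed elsewhere on the grid and apply the rule.
initial = translate(square, 3, 2)
result = apply_rule(initial, lhs, rhs)
print(sorted(result))  # the square at (3, 2) now carries a diagonal
```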

Shape grammars and the generative specification of painting and sculpture. Source: Stiny, G. and Gips, J., “Shape Grammars and the Generative Specification of Painting and Sculpture,” in Proceedings of IFIP Congress, 1971
Angela —
So, instead of this predetermined foundation, we have a set of rules and a moment of engagement where something dynamic happens. If we talk about those in social terms, you could say those rules are maybe key values, principles, or ethics. They are the things that determine how you are going to act, and the only thing that exists is your moment of engagement. This engagement then transforms something, producing a new outcome. You could even say that that's an interesting definition of design. It can also be very personal, because it holds you accountable for your beliefs, values, morals, and ethics. These rules could be a part of every design decision, like how to move this wall or how to draw this plan. I haven't really heard that description of shape grammars before, but I like that way of thinking.
Theodora —
It’s been interesting to hear this more philosophical positioning of shape grammars in relation to process-oriented and pragmatist approaches. I like this idea that it is about these moments of engagement or these transformative acts, which can be described computationally, but which resist giving a fixed structure or a description of the thing that you're operating upon. That allows for a multiplicity of enactments in a particular context.
Your point about values and beliefs is also interesting. It reminds me of the famous discussion about wicked problems, or rather of its history. When Horst Rittel and Melvin Webber wrote the article on wicked problems in the early seventies, their criticism was that data and information science don’t allow you to contend with the constantly dynamic and evolving nature of social issues. Their “wicked problem” formulation was a critique of the idea that you can adequately anticipate or plan the future through data and predictive algorithms. It was becoming obvious in the seventies that those sophisticated planning models didn’t account for transformative action. I've always found that the really interesting part is where their paper ended: with a proposal on argumentative processes for problem-setting as opposed to problem-solving. Rittel, at Berkeley, was also working on information systems that would allow groups of people to model their conversations through units called “issues.” The idea was not to model the world but, rather, to model how you talk about it: values, beliefs, and possibilities of action. One can criticize a lot of these approaches, but I think what's interesting is the idea of accountability. By mapping a conversation or by mapping a set of issues and beliefs, you also have a record of a social process in which multiple agents decide to act in the world.
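As a loose, hypothetical sketch of what such a conversation record might look like in code: the node types below follow the common issue/position/argument scheme associated with Rittel's issue-based information systems, but the class names, link labels, and example content are invented for illustration.

```python
# Minimal, hypothetical sketch of an IBIS-style deliberation record:
# the aim is not to model the world but to model how people talk about it.

from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str          # "issue", "position", or "argument"
    text: str
    links: list = field(default_factory=list)   # (relation, Node) pairs

    def link(self, relation, other):
        """Record how this node responds to another (e.g. 'responds-to',
        'supports', 'objects-to'), keeping a trace of the deliberation."""
        self.links.append((relation, other))

# An issue is raised, positions respond to it, arguments support or object.
issue = Node("issue", "Should the community centre have a fixed floor plan?")
pos_a = Node("position", "Keep the plan open and let users reconfigure it.")
pos_b = Node("position", "Fix the plan after a participatory design phase.")
arg_1 = Node("argument", "Use already carries agency; constant reconfiguration is not required.")

pos_a.link("responds-to", issue)
pos_b.link("responds-to", issue)
arg_1.link("supports", pos_b)

# The record itself is the point: who said what, in response to what.
for node in (pos_a, pos_b, arg_1):
    for relation, target in node.links:
        print(f'{node.kind}: "{node.text}" --{relation}--> {target.kind}: "{target.text}"')
```

The accountability the interview points to lives in this trace: the graph is a legible record of a social process, readable by the people who produced it.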
Angela —
I feel that that's what ChatGPT is. When you ask it a question, what you're getting back is the conversation, or the metapsychology of the entire world. Those language models offer whole new opportunities.
Theodora —
The black-boxness of these models is the key difference, though. With some of these more historical examples, there was always an implicit commitment to transparency. The idea was that you map things out in order to bring them to the table and to be able to talk about them. The mapping links to a human who can understand it or a group of people who can then talk about it. But now there's an obscurity that I find harder to contend with.
Angela —
With this critique in mind, what do you think is exciting about the future of open data, computation, and architecture?
Theodora —
I'm really interested to see work that thinks carefully about the specificity of context. I am eager to see fine-grained microstudies of how open data, computation, and architecture play out in very particular settings. This counters tropes of wholesale transformation of the discipline or its methods. I like the idea of thinking locally and taking into account what things like data mean for specific communities. We need more of that. It is not easy, because it would mean resisting rhetorical tendencies that architecture has been engaging in for a long time: the idea of technology as a transformative force is hard to resist.

Lionel March, Serial art sketch, ca. 1980-1990s. Blue, black and red ink, and graphite, 2 pages, recto verso. ARCH289634. Lionel March fonds, CCA. Gift of the Lionel March Estate © Lionel March Estate
Angela —
To me, fine-scale thinking and highly contextual data is really about architecture. In some ways, I think architecture has been removed from the conversation around data and technology. This might be because architects function at a site scale, whereas a lot of technology has been about generalization and standardization, which are maybe more amenable to urban questions. I think this is changing, partly because GPS and other ubiquitous data are now finer in spatial resolution, and also because computation is increasingly able to deal with diverse and larger data sets. Architects have been thinking about specificity and context for a long time, and now they have a lot to bring to the table. This makes architecture research a really cool place to be right now.
Theodora —
That's a great point. Part of the challenge has of course always been this question of generalizable knowledge. I've been thinking a lot about architecture's role in research universities, about architecture as a discipline that is required to speak in terms of research methods and outputs. Architecture has had to negate this contextual specificity for the sake of principles that would organize it as a science or as a basic, fundamental knowledge. It goes back to the question of the protocol between the abstract and the concrete, the specific and the generalizable. I do think that generalizable knowledge could still be possible, but that generalization should not be a priori: a construct where we already have a model of situations, or we already have the algorithm. I think we're at a good moment now with machine learning, which might better fit the idea of an algorithm as something that emerges from a context. But at the same time, I am worried about the growing illegibility of some of these algorithmic abstractions. There is a sense that patterns emerge from the data, but those patterns are not legible in the way that some of the more representational approaches have historically been.
Angela —
Yes, that is important. I guess the only point is that the technology is at a new level, one that was maybe unimaginable before. And I think a model like this could also help groups develop their underlying rules for engagement, as you were describing before with Rittel’s conversation mapping. But we need to bring it back to design. What are architects going to be able to do with that information?
Theodora —
The relationship in architecture between the discipline and the profession is interesting. I think universities are a very exciting place for architecture, but of course what happens in practice is an important question, and the one that comes closest to issues of societal and environmental impact. A lot of the histories that I'm writing about are academic experiments; most of them never made it to the world of professional practice (though of course they had indirect cultural and technical effects). Because of the magnitude and nature of questions about open data, alternative forms of architectural agency, and new informational protocols for design, it is likely that those questions will continue to be asked in research environments and in more speculative contexts that do not mandate immediate applications or direct demonstrations of profit-making potential. It’s very interesting to map the ways in which some of these academic questions may, in perhaps smaller ways, trickle into practice, or to see what new forms of practice might emerge as a response to the contemporary technological landscape.
But I really do not think that DALL-E, Midjourney, or ChatGPT are currently of huge consequence to the way that architects build buildings. More mundane things like building information modeling are more consequential, but we need more ethnographic studies that will actually help us understand these changes. There’s a fixation on the idea that technology is inherently transformative for architecture, or that architects need to funnel, channel, or express technological change through their work. I don’t see this as an exigency, but I do see architecture as a unique opportunity to give critical questions around technology a cultural form.
Angela —
How did you arrive in this area of research? What is your background?
Theodora —
I got my professional training as an architectural engineer in Greece, at the National Technical University of Athens. There was a general resistance at the time against computers and computing. We were doing almost everything by hand. But there was also quite a bit of interest in the history of technology more broadly. After I finished my professional degree, I did an interdisciplinary post-professional degree in design, space, and culture, which focused more on the theory and philosophy of design. I had a wonderful mentor who has now sadly passed away, Dimitris Papalexopoulos, who was interested in post-structuralism, the philosophy of technology, and computing.
Around 2008, Greece was becoming a difficult place to imagine a future, and I was interested in pursuing another master's degree abroad. I applied to many places that focused on history, theory, technology, and media. I got accepted into the Design and Computation Group at MIT, which was the only place I never thought I would be accepted, because I didn't really have a technical background. I chose MIT because I felt it was important to get a hands-on technical understanding of the things I wanted to write and think about: computing technology, algorithms, and design. My degree at MIT was a very formative experience because I took most of my coursework outside architecture, which was a wonderful opportunity to improve my technical skills, but also to hear the way that computer scientists talk about their work. I worked, for instance, with Patrick Winston, who headed MIT's Artificial Intelligence Laboratory for a long time. His courses and our conversations revealed to me that computational techniques always come with larger intellectual projects and with commitments about the position of the human in a technological world, and that they negotiate, or are forged by, major political and economic projects. It became intuitively clear to me that technical fields like computer science are very strongly cultural as well.
Then, as I started pursuing my PhD, my methods drew from the history of science and technology and from science and technology studies, where there is an interest in knowledge cultures and technical languages. And, of course, my anchor was always the Design and Computation Group and the work of George (Stiny), Terry (Knight), and my colleagues, with whom I spent hours and hours debating the philosophical and cultural lineages of computing in design.
Angela —
I am excited to see where this leads your work in the future. As you said, we need thinkers like you to help us build the stories that will let us navigate our current realities. This has been a wonderful conversation. Thanks so much for sharing your ideas with me today.
❍ Notes
Title image: The graph as a hand-drawn notational technique. © Christopher Alexander and the Center for Environmental Structure. Source: Christopher Alexander, “Thick Wall Pattern,” Architectural Design 38 (1968): 324–326.