On Default: A Conversation with Ellie Abrons
What’s the default that you find the most pressing to address or that you’re just most interested in?
I found this to be a difficult question. Questioning the “digital default” is very much part of my teaching and my work, but given current events and ongoing police violence and brutality against black and brown bodies, the most pressing default must be white supremacy. I don’t know what other defaults we could identify right now that would be more urgent than that. Maybe we can talk about the intersection between the dangers of the digital default and questions of racism, discrimination, and white supremacy. There’s another conversation we could have about digital defaults in terms of design, architecture, and digital environments, but that one feels much less urgent. Maybe the third default I would throw on the table is construction technologies, which I think are increasingly problematic and something that my practice T+E+A+M has been trying to work on. So I’m throwing three defaults on the table: whiteness; software, or the invisibility of network technology; and construction technology.
How do we operate with the default?
On the issue of construction technology, traditional wood framing is still the way that most low- and mid-rise buildings get built, and the reason they get built that way isn’t that it’s the cheapest way; it’s that it’s the way we know. There’s no incentive for builders to take risks, so even if it’s not the cheapest or most efficient way, it’s the safest way. In Detroit and many other cities, the cost of construction is rising due to a shortage of skilled labor and expensive material and land costs. This means that most new housing is luxury and market-rate housing. “Affordable” housing is cobbled together through incredibly complex and precarious financial structures. T+E+A+M has been working with a developer and a construction partner who see an opportunity in the middle range to build workforce housing. A significant amount of time in the early phases of the project was spent running down all the different construction systems that are out there and considering ways we could rethink them or apply them to mid-range housing. I think it’s something that architects and the construction industry really need to take on.
There is an important intersection between systemic racism and the pervasive invisibility of network technology. We are steeped in digital network technology. Sometimes we are aware of it, sometimes we are not, but everywhere we go, we’re leaving a trail of data behind. Companies like Google have algorithms that decide all kinds of things about our identities. In John Cheney-Lippold’s book We Are Data, he describes this interesting, puzzling, intriguing, but also scary separation between the identity you would self-determine and the identity that companies assign to you, your algorithmic identity. Not only does your designation change; the very definition of what that identity is also changes. One layer of [the algorithm’s] power is its black box: we don’t know how it makes its determinations, and maybe nobody knows how it works. There’s an invisibility in the sense that a lot of design goes into hiding things from us. Everything is designed to be physically smooth, but also experientially smooth, so that there’s no friction and the interface slips by. You aren’t forced to contend with, or acknowledge, the thing you’re doing. The question of who is made visible becomes problematic when we think about something like facial recognition, because systems trained predominantly on white faces and white bodies misidentify black and brown faces at much higher rates. Recently, a man in Detroit was wrongfully accused of a felony based on a facial recognition algorithm.1 So, it matters. It really matters. It’s not just a theoretical problem.
How should we operate with the default?
There’s an academic side to that question, which is thinking about the tools, the default settings, the software, and the technologies that we all use, and the ways in which they necessarily constrain or guide your work in certain directions, making certain things possible and others not. I don’t think retreat is the answer. I don’t think there’s really any way to fight against that, and in some ways this has always been the case with any tool you choose. What’s important is knowing [the effects of the default] as a simple axiom, thinking about it, keeping it in mind, and looking for opportunities either to use unconventional tools or to use familiar tools in a different way.
I think the other side of it, if we zoom out a little bit, would be to think about how important it is to understand how technology has fundamentally changed the built environment—and it’s changed! It’s really at every scale, from the scale of the object to the scale of the globe. There’s this great drawing called Anatomy of an AI System by Kate Crawford and Vladan Joler. They essentially take an Amazon Echo, break it down, and expand it out into the global network of influence embodied in this object—the mining of the minerals, the shipping and logistics, and the digital networks. You could pick up almost any object in your surroundings, go through that kind of exercise, and understand this intimate relationship between everyday objects, the built environment, and global networks of technology.
How can we operate with the default?
The default is intimately tied to power. The default will tell you who has power and who doesn’t. In a really abstract sense, simply relying on the default perpetuates existing systems of power. What would be useful is to think about what you want the default to be, and then ask: how might you move from what’s currently considered the default to where you want it to be? I think that would be an interesting exercise. My intuition is that you would often find you wish it to be something other than what it is. Thinking about how you might start to work for that change is a super worthwhile thing to do.
- “Wrongfully Accused by an Algorithm,” New York Times, June 24, 2020, https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html