Every once in a while the buzz coming out of our editorial office gets a little outerspacey, a sure sign that our top editor, Michelle, is excited about a new work of science fiction.
We take note because she’s a flawless judge of sci-fi, and the genre has become a favorite for FTW interviews: the best sci-fi writers are creative geniuses, a term we’re loath to use unless it’s beyond doubt.
The world you created for Lambda is unlike anything I’ve come across in fiction. How did you come up with the idea of creating the lambdas, who are aquatic and collective, yet human?
I’m pleased it struck you that way. I’m probably more sensitive to what elements of the book resemble than, I hope, the reader will be. There are so many literary and mythic precedents for aquatic humans and humanoids that the lambdas are probably a composite of every one I’ve ever encountered, but in certain, quite specific respects they’re a hybrid of Karel Čapek’s amphibian workers in War with the Newts, and Kobo Abe’s aquans from Inter Ice Age 4.
Čapek’s salamanders aren’t human, but they’re certainly persons with a collective identity and an awareness of their general subordination. In the Abe novel, aquans are secretly engineered successors to land-dwelling humans. They have webbed hands and feet, also gills, but are otherwise physically pretty much the same as we are (they don’t feature directly in the majority of Abe’s book, but their existence is critical to the plot). The lambdas are more marginal, more abject, than both of those predecessors, and, importantly, they haven’t been purposely bred or manufactured. They’ve evolved in parallel with us. They’re not about to inherit the earth, not immediately anyway, and they don’t have any obvious will to fight for their rights.
I don’t want to reduce the lambdas to a symbol, but in a sense they’re our bad conscience. Our betrayed childhood, maybe. We always encourage children to be considerate of others, to tell the truth, to share, yet in the current global order the opposite behaviour is often the most highly rewarded in adulthood. The lambdas are socially as well as developmentally neotenous.
What’s the significance of the definition in the book’s epigraph, “A lambda function is a small anonymous function”?
I’m interested in the relationship between natural languages and formal languages—specifically, how and why people communicate (or fail to), as opposed to how maths, logic, and machines process and express things. Both types are proper languages, but they work quite differently. We humans use language in a deeply associative way, rarely giving explicit instructions, navigating and transforming our world for ourselves with dense, referential triggers—words in context. Formal languages do specific tasks in a repeatable, non-negotiable way. Their rules are very strict. A small grammatical mistake can mean nothing happens at all. The possibilities are powerful, impersonal, scalable, and narrowly defined. You can make machines resemble humans in important respects, such as neural networks that learn by themselves, but their substrate is always formed with this strict code.
Lambda functions are a feature of the Python programming language. Python is sometimes cited as the closest programming languages come to natural language. It’s full of familiar terms like “if,” “else,” and “while,” and some lovely coinages like “elif” (else + if) that could perhaps be spoken one day. Untypically, the term “lambda” doesn’t have that kind of familiarity. Lambda functions are temporary, anonymous elements in a programme, mysterious, minor, and ephemeral. They’re used to test things out, they don’t have a fixed identity, and they promptly disappear.
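For readers who haven’t seen the syntax, a minimal sketch of the distinction (not from the novel; the example names and values are our own):

```python
# A named function has a fixed, lasting identity in the programme.
def double(x):
    return x * 2

# A lambda is a small anonymous function: defined inline, used, and gone.
result = (lambda x: x * 2)(21)  # result is 42

# Lambdas are typically passed as throwaway arguments, e.g. a sort key.
words = ["newt", "aquan", "lambda"]
by_length = sorted(words, key=lambda w: len(w))  # shortest word first
```

The anonymous function exists only for the moment it’s needed, which is the quality the epigraph trades on.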
I’m not a coder; I’m more interested in the aesthetics of apparently non-poetic, functional language, and this connection with the aquatic lambdas and their social place was so serendipitous I just had to fix it in the epigraph.
The term is actually quite controversial in Python. These are the first six of the nineteen aphorisms known as the Zen of Python:
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Not a bad way to approach prose, either. So, as a language, Python is supposed to be transparent and economical, and “lambda” is seen as a bit too arcane. It might even disappear in the next version, which would be sublime.
What are your views about the role of allegory in fiction in general, or in Lambda in particular, or both? In your book, the segregation of lambdas into substandard housing, the government’s provision of English language tutoring, and the placement of lambdas in low-paying jobs could all be taken as suggestions that the lambdas represent people who are recent immigrants to industrialized countries.
This might sound perverse, but I don’t see Lambda as allegory. I see it as an alternate branch of reality. You can map familiar tensions onto the novel, but I think they break down at certain points, and I don’t feel any duty to offer some sort of fictional resolution that maps back onto lived reality. But I also understand that real events tend to be quite compelling, so it’s inevitable that Lambda is going to be read through them.
For me the lambdas are a much more extreme category than immigrants—only a particularly virulent racist (thankfully a rare category in my experience) would suggest immigrants aren’t human, and even then I suspect it would be a working metaphor rather than a genuine belief. But the lambdas present a challenge even to those who want to accept their humanity. They’re right at the limit of what a human can be. They’re a message from the other side of ourselves, and perhaps we don’t want to hear it. Incidentally, the book isn’t predicated on wall-to-wall misanthropy. Land-human/lambda relations are, in general, amicable and stable despite the difficulties; it’s the bomb that really screws things up.
You have a diverse background as an artist, filmmaker, and teacher. I’m curious about how your work in these disciplines informed your vision of the novel’s surveillance tools and the lambdas.
I’ve always had a problem with the idea that there is some fundamental difference between mediums. I’m always doing the same thing, or trying to progress the same thing, whether it’s in drawing, writing, animation, sound, maybe teaching too. I couldn’t translate what that “thing” is—the work is the only real explanation—but there’s always an element of self-reflexivity. My drawings are drawings of drawings, my animations are about the conditions of animation.
Lambda is a piece of writing about writing. It rests on the fiction that a machine has written much of it, or that it’s an automatic transcript, or that you’re reading someone’s email or smartphone notes. This is a well-established novelistic move—after all, the travel journal is the precursor of the modern novel, its “alibi.” Clarissa is all private letters, as is Frankenstein—but I think this writing-as-found-material has much, much further to go.
You’ve asked about the surveillance tools—a novel is already a tool of surveillance. It’s deeply intrusive; we expect nothing at all to be marked “private” in a novel. In Lambda, Cara is self-surveilling. She is an end user of surveillance data, she consumes it in the form of a narrative about herself which is actually no more intrusive than a novel written by a human. The difference is that the content is from a semi-public domain, a sort of universal surveillance API.
Mr Hello accepts his perpetual surveillance and can’t imagine anything else. He’s been watched since his earliest moment of existence. Means of surveillance have an aesthetic value in the book; they’re the frames through which you’re required to perceive the material of the text. I’m very preoccupied with forming things, and how things are formed, by evolution, society, your hands, your technologies. The lambdas, of course, have been sculpted by their distinct evolution.
Some of the “landy” humans in the book are repulsed by the aquatic lambdas, while others are drawn to them almost as if the lambdas are magnetized. What’s the significance of that tension between how the lambdas both repel and attract?
I think everything that’s truly close to us has the power to repel or attract. It would be odd (though not impossible) to be repulsed by a mountain or a star. But another person, or a dog, that’s something else. What we share as beings is functionally invisible, but a degree of difference causes complex interpersonal ripples. The lambdas are different enough to cause some very strange ripples. I worried that they might just be ridiculous and unbelievable, but that’s hardly come up at all.
Perhaps I could mention a detail here—the lambdas have fused heels. I can’t imagine having a tail, but fused heels are something that works for me viscerally.
What, if any, developments IRL inspired your creation of a surveillance state where objects are sentient and even your toothbrush can betray you to the authorities?
I remember being chilled when I learned many years ago that fitness watches tracked menstrual cycles. What business could that possibly be of a tech company? Being generous, it could form the basis of some tailored training recommendations, but I felt and continue to feel that it’s just a way of accruing a vast dataset with some useful ad-targeting applications. This is the only business of many tech companies.
In the post Roe v. Wade world, ownership of that particular data now becomes something oppressively consequential. We’re all entangled in an ongoing expropriation of our private selves, mental and physical, by unknown others with greater technical resources. Instagram tracks my engagement with images and forms my experience accordingly; Amazon uses Kindle to measure my reading interactions in fine detail, and is free to correlate what it learns with any other data it might have about me. Meanwhile, the police want to make facial recognition continual and mandatory in public places, and can perhaps count on the ignorance of legislators to get it through; my NHS data has been sold off to entities which are not, I suspect, primarily concerned with maintaining my health. Lambda is just a slight intensification of these things. Shoshana Zuboff’s excellent The Age of Surveillance Capitalism is a key sourcebook for understanding our desperate situation.
As for toothbrushes, I have some views on machine sentience that can freeze the air when I talk to computer scientists. We’re living with a technium (to use Kevin Kelly’s useful but rarely employed term) that’s grown out of the Alan Turing school of thought on machine experience—it’s totally irrelevant. The only significant question is: what can we make machines do? A few technologist thinkers—Kelly, Ray Kurzweil, and John McCarthy being some of the better known ones—are more open to the idea that there might be something going on inside an information processing device like a computer, and that there might even be something highly complex emerging in a highly complex one. I agree—I think it must feel like something even to be a thermometer. It must be very different to what it’s like to be a beetle or a bat, and far less interesting, but any focal centre of information must have some interior character. I don’t think it’s reductive to see human experience as a massively complex and organised concentration of information that feels like something. Non-human, non-biological things focus and organise information too. Some even do it with a higher level of sophistication, albeit in narrow ways, like playing Go or constructing hyperrealistic images.
The recent debate around the large language model LaMDA (the connection with my book is a weird coincidence) and Blake Lemoine, the suspended Google engineer who believes it’s a conscious being, is very illuminating. Sceptics, which is to say almost everyone, are quick to point out that despite the uncannily lifelike responses of LaMDA, it refers to anomalous, non-existent things like its “family,” so it can’t really be conscious. As though delusions aren’t also part of being conscious! But perhaps the anthropomorphic responses LaMDA has been designed to give hide the real nature of experience for the other-than-biological. Cara’s “father,” who exists primarily as an app in my novel, begins to reveal what it essentially is only when Cara’s mother finds a way to break through the learned responses. (Sometimes verified humans’ learned responses don’t help us understand them, either.)
I’m not sure we can know what artificial experience is like, nor do I know what we should do about it—in Lambda, it’s used as a political expedient, to protect property rather than life. I’m still hydrocarbon-centric enough to believe that the really pressing ethical issues of the moment are human and animal exploitation, and the end of our habitable planet. Nevertheless, I’m pretty sure it would be a mistake of some magnitude just to ignore what experience might mean for an artificial thing. I know I’m not the only person to have noticed we remain in dire practical need of original perspectives on our increasingly tenuous existence.
Do you have any plans for a current or next book project?
Yes, more than plans, actually—but despite my commitment to rationalism, I’m too superstitious to say any more than that.