Hi Everyone 👋🏽

If you're new here, welcome to Growth Imperatives, an ongoing curation of ideas that deconstruct the current world, and ask how we can build a new one.

This week, I'm pretty much obliged to mention the new AI companion, Friend, and the narrative it encourages 🫠 If you haven't heard of it, you should really watch their launch video/ragebait before jumping into this week’s main article.


👯 Embracing Sub-Optimal Relationships

L.M. Sacasas gives us an excellent dissection of Friend, the vision of the future it promotes, and why we might be tempted to use it out of necessity.

The always-listening Friend, and AI more broadly, mark the point where companies, having already commodified our homes (Airbnb) and cars (Uber), have started to commodify our memories, imagination, and relationships. Material resources are now largely spoken for, so the system is pushing people to mine society’s immaterial elements. As designers, we might even be involved in beautifying and normalizing that mining.

Beyond the social element, Friend makes me think about the design culture that buttresses such absurd ideas. We see, once again, that technology and a vague notion of 'progress' can’t be the value system that drives what we do; technology itself isn’t a value but a carrier of the values we already hold.

What does it say about our understanding of design when we, through our work, proudly and uncritically boost devices like Friend?

From the article:

It is good to be able to relate to the world in a manner that evokes and engages the various dimensions of our human personhood—embodied, imaginative, intellectual, emotional, moral, spiritual, etc.—particularly in relationship with others. But our techno-economic environment generates an experience of the world that is hostile to this ideal. It operates at a pace, scale, and intensity that undermines our capacity to relate to the world with the fulness of our presence, thought, and care. If affection is kindled by time and attention, the default settings of our techno-economic order undermine our capacity to give either. We are instead encouraged to live as machines rather than creatures, optimizing for all the wrong metrics.

And these same techno-economic structures instill in us a manufactured neediness so that we might be all the more beholden to the goods and services marketed with the promise of alleviating our plight and addressing the very neediness they cultivate. Social robots, AI assistants, VR, generative AI—each of these, as they are often marketed, can be usefully analyzed from this perspective. They are the system’s answers to the problems the system created and they serve the system not the person. [...]

Considered from a slightly more cynical perspective, we can see that there is a certain unfortunate logic at work: manufactured neediness prepares the ground for new commodities. The goal is not to alleviate loneliness or isolation by fostering vernacular human relationships, which, of course, cannot be readily monetized, but to insinuate, pejoratively, that such relationships are inefficient and full of friction. As Horning noted, “Chatbots are often marketed as though other people represent the main impediment to solving loneliness, and if you remove the threat of judgment and exclusion and rejection that other people represent, then no one will ever feel lonely again.”

Read → Embracing Sub-Optimal Relationships by L.M. Sacasas


🏗️ Reconstructions

Here are three bite-sized ideas to help challenge your thinking:

[M]ost crises – such as the 2008 financial meltdown or the recent droughts in Spain – are rarely in and of themselves sufficient to induce rapid and far-reaching policy change (unlike a war). Rather, the historical evidence suggests that a crisis is most likely to create substantive change if two other factors are simultaneously present: movements and ideas.

→ Roman Krznaric in The disruption nexus

[T]he best narratives and metaphors for thinking about how life works come not from our technologies (machines, computers) but from life itself. Some biologists now argue that we should think of all living systems, from single cells upwards, not as mechanical contraptions but as cognitive agents, capable of sifting and integrating information against the backdrop of their own internal states in order to achieve some self-determined goal. [...] The ‘organic technology’ of language, where meaning arises through context and cannot be atomised into component parts, is a constantly useful analogy. Life must be its own metaphor.

And shouldn’t we have seen that all along? For what, after all, is extraordinary – and challenging to scientific description – about living matter is not its molecules but its aliveness, its agency.

→ Philip Ball in We are not machines

[M]aking art is already a fundamentally democratic process. [...] It just takes time, effort, training, dedication, a development of craft. AI advocates have tried to argue that AI helps disabled people create art—but the already plenty vibrant disabled artist community shut that down extremely quickly. No, it’s making a living practicing art that’s the tricky part, the already deeply precarious part—and it’s that part to which the AI companies are taking a battering ram. [...]

The democratization pitch is aimed not at aspiring artists, but at tech enthusiasts who may or may not feel that largely abstracted gatekeepers have been unkind to them or derided their cultural contributions, who feel satisfaction at seeing slick-looking images produced from their prompting and eagerly share and promote the results, and industries who read the ‘democratize’ lingo as code for ‘cheap’, and would like to automate the production of images, text, or video.

→ Brian Merchant in AI is not "democratizing creativity." It's doing the opposite


That's all for this week! Thanks for reading.

To help these ideas spread, please consider sharing a link to this issue or the entire newsletter with your friends or on social media.