4 Comments
Jennie Pakula Lawyer's Friend

This is very interesting for a number of reasons! It seems one more important piece of real-world context that could have been discussed is the relational aspect: as a human being, I understand more about what another person is conveying through cognitive and emotional empathy as I sit with them, watch them and converse with them. There are also the very strong relational constraints the real world puts on us - we worry about being shamed in public, offending our friends, failing to love the people we should care for.

Another thing that occurred to me is how willing we are becoming to be like the machines. When we read the news, we 'experience' what is going on far from us, where the real lived experience might be a very different thing. People get so angry about the Israel-Hamas war without any personal knowledge or experience of what is going on or the history of the place, and get so angry about Donald Trump when what he is doing has very little impact on them at all. Like the LLMs, we just process second-hand information and treat it like it's a real part of our lives when it isn't.

Ryan Young

Great point about how we 'experience' so many things at second (or third) hand but take them as real. I hadn't made that connection to AI, but it's a good one. Whenever we know something first-hand, we almost always know there are things wrong with what we read or see online or in the media. AI has those same limitations.

Simon Matthews

A very interesting analysis - thanks. I think it’s helpful to draw the distinction between having experience of the world and having information about the world: it highlights how different we humans are from AI. I know many people express concerns about ChatGPT and his friends going rogue and becoming motivated to achieve something beyond the specifically defined tasks assigned to them (e.g. by trying to enslave humans and take over the world). But I wonder if it’s our experience of (or at least our innate desire to engage in) tasting and banging things, butting heads and stubbing toes, picking things up, pulling them apart and playing with them that actually causes humans to want to do the stuff we’re afraid of AI doing (e.g. achieve control of things and people). In other words, perhaps an AI is as likely to want to achieve world domination as a library of textbooks is, because it doesn’t experience the joys and frustrations of living in the world like humans do.

Ryan Young

That's a fascinating thought. Thanks! I need to think about it more. It connects to the question of whether AI has genuine agency or not.

For info, my big worry about AI is that humans will put autonomous AI systems in places where they shouldn't be (likely doing things those systems can't actually do), and things go badly wrong that way. I'm less worried about them going rogue.
