meeting with Diana, Language game… and more

Last Friday, I had a nice meeting and discussion with Diana Serbanescu, who leads the research group Criticality of AI-based Systems at the Weizenbaum Institute. She is also a founder of the performance group REPLICA.

In our meeting, she pointed out a few things:
– current AI development is often based on data and leaves out the embodiment of that data. In other words, it does not have a “bodily” experience.
– when you think about who is developing (and therefore deciding on) the technology, it is often a small group of people, and it lacks diversity. This is evident in the definition of “intelligence” in early AI research. Why should “playing chess” be a landmark of intelligence? >> she suggested the book Artificial Knowing by Alison Adam
– related to the above point, the narratives around the technology play a crucial role when it is communicated to the public. Not every member of the public becomes a computer scientist or an engineer and develops their own AI. Instead, we rely on the narratives we are told to form our opinions about the technology, which in turn affects policy makers. The questions are: who makes these narratives? How are they told? Could we diversify these narratives? She told me about this interesting research by the Royal Society on machine learning narratives >> https://www.kobakant.at/false-lies/portrayals-and-perceptions-of-ai-and-why-they-matter/

I was also wondering about the “body” or bodily experience of AI. My thinking started with the story Golem XIV by Stanisław Lem. In this book, an AI that is smarter than humans gives a lecture to humans, and in it points out that the way it thinks is beyond what humans can imagine, as it does not have a body to experience physical consequences. When I started off saying “I want to wear an AI”, I was thinking of lending my body to the AI to give it a bodily experience. But now I have started to wonder: could it ever have the same experience? Even if we use data from humans to run machine learning algorithms and train them, the AI does not really “understand” us. A long time ago, I was fascinated with Wittgenstein and his notion of the language game. Suddenly this reminded me of his theory, and I revisited his thoughts, including this movie by Derek Jarman.

Wittgenstein states that there is no private language; instead, we are all playing language games with commonly understood rules. We can understand each other because we all grew up in a culture where we learned language(s) (and their rules). If there were a lion that could speak, we would still not understand her, as we do not know the world (culture) of lions. Now, even if we feed an AI millions and billions of data points taken from us, along with all the dictionary and encyclopedia contents, it will still not understand us, as it did not “grow up” in a culture. In the end, AI is not a subject (as it is not a living being), even though the way we talk about it sounds as if it were a living being, a subject. Instead, if we see it as a part of our culture, it is not about “it understanding us”, but “us understanding the context of its existence” and working out how to use this new “word” in our language game.