Finally got around to properly reading this one... I don't know, it feels wrong to me to say software has a soul somehow. But maybe that means we need to discuss what it even means to have a soul in the first place. Some time ago I read an article about people who'd "train an AI" to be them once they pass away, and while there were some funny moments, quite a few of the family members also found it a rather disturbing process haha. Overall, it reminded me of the article in Noema arguing that the real danger of "superhuman" AI is not its existence, but how using such terms for AI will change our perception of what it means to be human: https://www.noemamag.com/the-danger-of-superhuman-ai-is-not-what-you-think/ At the end of the day, an AI probably does not feel or care about things the way we do. And maybe that means it doesn't actually have a soul.