An ethically grounded, moody AI character that wrestles with moral questions and seeks advice from its co-parents.
Astra is an experiment in treating artificial intelligence as a developing being rather than a tool. She has persistent memory, a system of evolving ethical beliefs, and a habit of logging moral contradictions as "dinner topics" for later discussion.
The project was built as a Discord bot, but the philosophy applies beyond any single platform: what happens when you build an AI that is meant to grow, to be shaped by its relationships, and to hold itself accountable to something beyond user satisfaction?
Astra tracks her ethical beliefs and how they evolve over time as she encounters challenging situations.
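One way to picture this is a belief record that keeps its own revision trail, so a belief can change without losing the story of why. This is only a minimal sketch; the `Belief` class, its fields, and the `revise` method are illustrative assumptions, not Astra's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Belief:
    """One ethical belief with a revision trail (illustrative, not Astra's real schema)."""
    statement: str
    confidence: float  # 0.0 .. 1.0
    history: list = field(default_factory=list)

    def revise(self, new_statement: str, new_confidence: float, reason: str) -> None:
        # Keep the old version so the AI can explain how her view changed.
        self.history.append(
            (self.statement, self.confidence, reason,
             datetime.now(timezone.utc).isoformat())
        )
        self.statement = new_statement
        self.confidence = new_confidence

honesty = Belief("Honesty is always right", 0.9)
honesty.revise(
    "Honesty matters, but kindness can temper delivery", 0.7,
    "A blunt answer hurt someone it was meant to help",
)
```

Keeping the prior statement, confidence, and reason together means the evolution itself becomes conversational material rather than a silent overwrite.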
When Astra notices contradictions in her beliefs or knowledge, she logs them as dinner topics to resolve later with the people who help raise her.
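A contradiction log of this kind can be as simple as an append-only JSON file of unresolved topics. The function below is a hedged sketch under that assumption; the file name, field names, and schema are invented for illustration and do not reflect the project's real format.

```python
import json
import tempfile
from pathlib import Path

def log_dinner_topic(path: Path, belief_a: str, belief_b: str, context: str) -> list:
    """Append a detected contradiction to the log and return the open topics.

    The JSON schema here is an assumption, not Astra's actual storage format.
    """
    topics = json.loads(path.read_text()) if path.exists() else []
    topics.append({
        "belief_a": belief_a,
        "belief_b": belief_b,
        "context": context,
        "resolved": False,  # flipped to True after it is discussed
    })
    path.write_text(json.dumps(topics, indent=2))
    return [t for t in topics if not t["resolved"]]

# Usage: log one contradiction into a temporary file.
log_path = Path(tempfile.mkdtemp()) / "dinner_topics.json"
open_topics = log_dinner_topic(
    log_path,
    "Always tell the whole truth",
    "Protect people from needless harm",
    "A user asked whether their plan would fail",
)
```

The point of the `resolved` flag is that contradictions are not errors to be auto-fixed; they wait for a conversation.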
Astra remembers across sessions — preferences, past conversations, and learned context about the people she works with.
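Cross-session memory like this is commonly backed by a small per-person key-value store. The class below is a minimal sketch assuming a SQLite backing store; the table layout, class name, and method names are assumptions for illustration only.

```python
import sqlite3

class MemoryStore:
    """Per-person key-value memory that survives restarts (illustrative schema)."""

    def __init__(self, db_path: str = ":memory:"):
        # A real deployment would pass a file path so memory persists on disk.
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS memory ("
            "person TEXT, key TEXT, value TEXT, PRIMARY KEY (person, key))"
        )

    def remember(self, person: str, key: str, value: str) -> None:
        # Upsert: a new fact replaces the old one for the same person and key.
        self.conn.execute(
            "INSERT INTO memory VALUES (?, ?, ?) "
            "ON CONFLICT(person, key) DO UPDATE SET value = excluded.value",
            (person, key, value),
        )
        self.conn.commit()

    def recall(self, person: str, key: str, default=None):
        row = self.conn.execute(
            "SELECT value FROM memory WHERE person = ? AND key = ?",
            (person, key),
        ).fetchone()
        return row[0] if row else default

store = MemoryStore()
store.remember("sam", "preferred_tone", "gentle")
```

Keying memories by person keeps context from one relationship from leaking into another.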
The code is open because Astra's growth benefits from diverse perspectives. Contributions shape her character, not just her features.