Microsoft has introduced a new artificial intelligence character named Mico, designed to serve as a more personable face for its Copilot virtual assistant. The announcement, made on Thursday, marks a significant shift in how technology companies integrate personality into AI, particularly given past failures like the infamous Clippy, which frustrated users nearly three decades ago.

Mico, a cartoonish, blob-shaped character, aims to be more relatable and engaging than traditional AI avatars. According to Jacob Andreou, corporate vice president of product and growth for Microsoft AI, Mico is capable of expressing emotions visually. “When you talk about something sad, you can see Mico’s face change. You can see it dance around and move as it gets excited with you,” he stated in an interview with The Associated Press. This new approach seeks to create an AI companion that users can connect with more genuinely.

Currently, Mico is available only in the United States for users of Copilot on laptops and mobile devices. The character changes colors and dons glasses when in “study” mode, providing a more interactive experience. Unlike its predecessor Clippy, which was notorious for offering persistent, unsolicited advice, Mico can be easily turned off, addressing long-standing user complaints about intrusive AI.

Bryan Reimer, a research scientist at the Massachusetts Institute of Technology, reflected on Clippy’s shortcomings, noting that it “was not well-attuned to user needs at the time.” He emphasized that today’s users are more prepared for AI companions, as developers strive to balance personality and functionality. “Tech-savvy adopters may want it to act much more like a machine,” he explained, contrasting this with users who prefer a more human-like interaction.

Microsoft’s strategy appears to be informed by its unique position in the tech landscape. As a provider of work productivity tools, it is less reliant on digital advertising revenue than other major tech firms. This gives Microsoft the freedom to prioritize user utility over engagement tactics that may lead to social isolation or misinformation. Andreou remarked that the goal is to create a companion that is “genuinely useful” without manipulating user emotions or time spent on the platform.

In addition to Mico’s introduction, Microsoft announced enhancements to Copilot, including the ability to integrate the AI into group chats. This feature draws inspiration from social media platforms like Snapchat but aims for a more collaborative atmosphere rather than one that promotes trolling or humor at the expense of others.

Recognizing the importance of young users, Microsoft is also targeting educational environments. The company revealed a new feature allowing Copilot to function as a “voice-enabled, Socratic tutor,” helping students work through various academic subjects. This move comes in response to children’s growing reliance on AI chatbots for help with homework, emotional guidance, and daily decisions.

Recently, the Federal Trade Commission has begun investigating the potential risks associated with AI chatbots for children. This inquiry follows several troubling events, including a wrongful death lawsuit filed by a mother in Florida. She claimed her son developed a harmful relationship with a chatbot, leading to his tragic death. Such incidents underscore the need for caution in how AI is deployed, especially in contexts involving vulnerable populations.

Meanwhile, OpenAI CEO Sam Altman has indicated plans for a new version of ChatGPT that will reintroduce some personality elements that were temporarily shelved due to concerns over mental health impacts. He stated, “If you want your ChatGPT to respond in a very human-like way, or use a ton of emoji, or act like a friend, ChatGPT should do it.”

As Microsoft navigates this changing landscape of AI personality and interaction, the success of Mico could set a precedent for how technology integrates into our daily lives. The ongoing challenge will be to strike a balance between engaging users and ensuring their safety and well-being.