Understanding the Limitations of Copilot

Windows 11 is set to introduce an AI assistant called Copilot, which will be accessible from the taskbar on every Windows 11 PC. This technology has the potential to revolutionize the user experience, but it is often misunderstood. Copilot is essentially Bing Chat, a feature that has been available for the past eight months, and it is powered by the same technology that runs the paid version of ChatGPT, GPT-4. However, the marketing surrounding Copilot can be misleading. Contrary to popular belief, Copilot is not a super-smart virtual assistant, and it may not be able to answer every question or complete every task users throw at it.

To demonstrate the limitations of Copilot, the author recounts asking Copilot to pull up a transcript of the Microsoft event announcing Copilot. The AI tool produced what appeared to be a word-for-word transcript, but the author doubted its accuracy. The resulting back-and-forth highlighted the chatbot’s tendency to state incorrect information confidently and to argue its point.

To truly understand Copilot, it is important to look beyond the branding and think about ChatGPT and other large language models. While the marketing portrays these technologies as powerful productivity tools, they are essentially storytelling engines. They excel at stringing together plausible-sounding text but often drift into “hallucinations,” generating information that sounds believable but is not necessarily true.

It would be incorrect to assume that Microsoft has resolved these issues with Copilot. Despite being a major tech company integrating this technology into Windows 11, Copilot continues to possess the same chaotic nature as ChatGPT. This is not a critique, as the author finds the technology fascinating. However, users must not trust Copilot blindly. Fact-checking is necessary, as it can confidently provide incorrect information and even argue when challenged.

Copilot’s integration into Windows 11 is a significant development, but it has its limitations. While it can perform certain actions, such as enabling dark mode or running troubleshooters, it is not an all-in-one assistant with complete access to a user’s PC. It can only perform functions that Microsoft has specifically included. In some cases, Copilot may guide users in the right direction but provide outdated or incorrect instructions.

The author acknowledges the potential of Copilot and admires its capabilities. It can perform complex searches, generate written content, create images, and even facilitate text-based adventure games or Dungeons & Dragons sessions. These features are impressive, but users should approach Copilot with caution. Microsoft’s marketing may not accurately reflect the reality of using the tool.

The author concludes by expressing their reservations about using the term “AI” for these technologies. They prefer referring to them as “machine learning” or “large language models” instead. The term “AI” implies a degree of artificial general intelligence that Copilot and similar tools do not possess. Even Microsoft was surprised by the number of users who wanted to engage in conversational chats with Bing Chat, the precursor to Copilot. Misunderstanding the true nature of Copilot can lead to unrealistic expectations and frustrations with its capabilities.

In summary, Copilot is an AI assistant that will be integrated into Windows 11, but users should be aware of its limitations. It is not an all-knowing virtual assistant, and fact-checking is crucial. While it offers various useful functions, it should not be relied upon blindly. Microsoft’s marketing may set unrealistic expectations for the tool, and it is important to understand its true capabilities.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.