Microsoft’s Copilot terms call the AI ‘for entertainment purposes only’

Microsoft’s Copilot disclaimer draws attention
Microsoft is facing fresh scrutiny over the language in Copilot’s terms of use, which describe the AI assistant as “for entertainment purposes only.” The wording, which appears to have been last updated on October 24, 2025, has been circulating on social media as users point out the gap between AI marketing and the limits companies place on their own products.
The terms also warn that Copilot “can make mistakes, and it may not work as intended,” adding: “Don’t rely on Copilot for important advice. Use Copilot at your own risk.”
The disclaimer stands out because Microsoft is actively pushing Copilot to corporate customers as part of its broader AI strategy. But the language underscores a familiar tension in the AI industry: companies promote these tools as useful, capable assistants while also cautioning users not to trust them too much.
Microsoft says the language is outdated
A Microsoft spokesperson told PCMag that the company plans to revise what it called “legacy language.” The spokesperson said the wording is “no longer reflective of how Copilot is used today” and will be changed in the next update.
Microsoft is not alone in using cautionary language. Tom’s Hardware noted that OpenAI and xAI also tell users not to treat their systems as a definitive source of truth. xAI says users should not rely on its output as “the truth,” while OpenAI says its service should not be treated as “a sole source of truth or factual information.”
The debate around Copilot’s terms comes as AI companies continue trying to balance consumer-facing promises with legal and practical disclaimers about accuracy. The warning may be unusually blunt, but it reflects a broader reality across the industry: even the companies building these systems are telling users that AI output can be wrong, incomplete or unreliable when the stakes are high.