NEWS
OpenAI Bans ChatGPT From Giving Medical, Legal, and Financial Advice
OpenAI has rolled out new restrictions on how its AI tool, ChatGPT, can be used — formally banning it from giving medical, legal, or financial advice that would normally require professional licensing.
The updated Usage Policy, which took effect on October 29, explicitly prohibits the use of ChatGPT for:
• Consultations that require professional certification (such as medical, legal, or financial advice)
• Facial or personal recognition without consent
• Making critical decisions in areas like finance, housing, education, migration, or employment without human oversight
• Academic misconduct or manipulation of evaluation results
OpenAI says the new guidelines are intended to improve user safety and reduce the risk of harm caused by people relying on AI for advice outside its intended scope.
As reported by NEXTA, ChatGPT will now serve primarily as an educational and explanatory tool, not as a consultant. The chatbot can still explain concepts, outline general mechanisms, and offer context — but it will no longer give specific or actionable recommendations.
In practice, that means users won’t get medication names or dosages, legal document templates, investment tips, or buy/sell suggestions. Instead, ChatGPT will consistently refer users to licensed professionals for expert advice.
The policy change reflects OpenAI’s growing caution amid global concerns about AI liability and regulation, particularly as governments begin to impose stricter safety and accountability standards on advanced models.