I feel like any use of an LLM should come with a pretty huge disclaimer. Anything built on an LLM should carry a notice that reads like this:
Chat at your own risk: This system may generate inaccurate, incomplete, misleading, or harmful outputs. It is not a professional, it is not sentient, and it is not your friend. It cannot replicate the training, judgment, accountability, practiced expertise, or lived experience of a doctor, lawyer, engineer, therapist, financial advisor, safety professional, or other qualified human expert. Do not rely on its outputs for medical, legal, financial, engineering, safety, employment, or other consequential decisions without independent verification and review by an appropriate professional. The software is provided "as is," without warranties, and you are solely responsible for how you use it and for any consequences that result.
But if people actually read and heeded a warning like that, the companies trying to shove LLMs into everything might not make as much money. So they won't add one.