This is stupid. Fully reading and analyzing a source for accuracy and relevance can be extremely time consuming. That's why physicians have databases like UpToDate and DynaMed, which offer expert (i.e., physician and PhD) analyses and summaries of the studies in the relevant articles.
I'm a 4th year medical student and I have literally never used an LLM. If I don't know something, I look it up in a reliable resource and a huge part of my education is knowing what I need to look up. An LLM can't do that for me.
Some of my classmates used ChatGPT to summarize reading assignments, and it garbled the information so badly that they got things wrong on in-class assessments. Aside from the hallucinations and jumbled garbage output, I refuse to use AI on ethical grounds, given its environmental and societal impacts, unless there is absolutely no alternative.
As far as I'm concerned, the only role for LLMs in medicine is to function as a scribe to reduce the burden of documentation, and that's only because everything the idiot machines vomit up has to be checked before being committed to the medical record anyway. Generative AI is a scourge on society and an absolute menace in medicine.