

If some newspaper started printing LLM-generated slop news articles, would you say, “It is the responsibility of the reader to research whether anything in it is true or not”? No, it's not! A civilization is built on trust, and if you start eroding that kind of trust, people will distrust each other more and more.
As long as the newspaper clearly says it's written by an LLM, that's fine with me. I can either completely ignore it or take it with a grain of salt. Truth is built on trust, but trust should be a spectrum: you should never fully believe or fully dismiss something based on its source. There are some sources you can trust more than others, but there should always be some doubt. I have a fair amount of trust in LLMs because in my experience they are correct most of the time. I'd trust them more than something printed in Breitbart but less than something printed in the New York Times, and even with the New York Times I watch out for anything that seems off.
You, along with most of this sub, seem to have zero trust in LLMs, which is fine; believe what you want. I'm not going to argue with you on that, because I'm not going to be able to change your mind, just as you won't be able to change Trump's mind on the New York Times. I just want you to know that there are people who do trust LLMs and do think their responses are valuable and can be true.
If I want to talk to a person and ask for their input on a matter, I want their input, not them asking their relatives or friends. I think this is just normal etiquette and a standard social assumption. This was true even before LLMs were a thing.
I don't think this is universal. That may be your expectation, but assuming it's not something private or sensitive, I'd be fine with my friend asking a third party. If I texted in a group chat that I'm having car trouble and asked if anyone knows what's wrong, I would not be offended if one of my friends texted back that their uncle's a mechanic and said to try x. I would be offended if that person lied about it coming from their uncle or lied about their uncle being a mechanic, but in this case the person was very clear about the source of the information and its “credentials”. Part of the reason I may be asking someone something is that if they don't know the answer, they may know someone who does and can forward the question to them.
Nowadays you can no longer trust the goodwill of researchers; you have to get to the bottom of it yourself by looking up statistics, doing your own experiments, etc. A person is generally superior to any LLM agent, because they can do that. People in a specific field understand the underlying rules and don't just produce strings of words that they make up as they go. People can research the reputation of certain internet sites and look further and deeper.
I don't think this is true for every person. Maybe for experts, but an AI agent is probably just as good as a layman at doing online research. Yes, if you can ask an expert in the field to do the research for you, they will be better than an AI agent, but that's rarely an option. Most of the time it's going to be you by yourself, or, if you're lucky, a friend with some general knowledge of the area googling something, looking through the top 3-5 links, and using those to synthesize an answer. An AI agent can do that just as well and may have more “knowledge” of the area than the person. ChatGPT knows more about, say, the country of Bhutan than your average person. Probably not as much as a Bhutanese person, but you probably don't know a Bhutanese person and can't ask them the question. It can even research the sources themselves or use a tool that rates the trustworthiness of a source to decide which one to believe when they contradict each other.

Nah, in fact he probably went there hoping Israel would do something like this. Tucker has turned against Israel and has even been drawing connections between them and Epstein. This will only give him street cred in the rising anti-Israel nationalist right.