It might gather information from all those sources (with or without consent), but what it returns is no more credible than a story from a granny in your local market.
ONLY if you prompt it to return links, and read the information in those links yourself - only then have you read information from the source.
It has already been shown that LLMs are bad at summarising - the one thing techbros have been pushing them for. They’re bad at summarising, bad at coding, bad at maths, and fucking terrible at image making.
There’s a reason the output is called ‘Slop’. And rightfully so.
You are assuming too much.
If you think that’s wrong - you’re wrong.
So you really did read my comment explaining the shortcomings of statistical engines and still called my ‘hate’ “irrational”, huh.
You should try some of that Natural Intelligence.
I’m done feeding the sealion.