It might gather information from all those sources (with or without consent), but what it returns is no more credible than a story from a granny in your local market.
ONLY if you prompt it to return links, and then read the information in those links yourself, have you actually read information from the source.
It has already been shown that LLMs are bad at summarising - the one thing techbros have been pushing them for. They're bad at summarising, bad at coding, bad at maths and fucking terrible at image making.
There’s a reason the output is called ‘Slop’. And rightfully so.
deleted by creator
LLMs provide about as much information as a parrot repeating the words it hears most often.
It’s a terrible, terrible “source” of information that will lead to an insane amount of misinformed people.
deleted by creator
You are assuming too much.
deleted by creator
If you think that’s wrong - you’re wrong.
deleted by creator