Hacker News

Kids need to be taught how to use ChatGPT because it’s here to stay and they will use it, but it’s also important to demonstrate that ChatGPT is both useful and senile, and so you need to validate its responses with secondary sources.


I understand this sentiment, but I've already made GPT work in ways that boggle the mind, in my own time, and I'm a nobody. So this concern might be rephrased as (something like) "Kids will need to learn how to use GPT to validate secondary sources."


> ...but it’s also important to demonstrate that ChatGPT is both useful and senile, and so you need to validate its responses with secondary sources.

> So this concern might be rephrased as (something like) "Kids will need to learn how to use GPT to validate secondary sources."

That's not a rephrasing, it's an inversion. It's like saying kids will need to learn to validate medical advice from their doctor against information from mercola.com and other random alternative-health blogs.


LLMs and prompt interfaces aren't a source of truth in the first place. They could be used to compare sources, though.


> ...and so you need to validate its responses with secondary sources.

If you have to do that, what's the point?

And of course, few will actually double-check it. Easy matters more than correct.



