I'll take logical reasoning over "stories" any day.
And calling language models "parrots" is insulting. Many people worked for decades to reach this accomplishment, and here come the critics to shit all over it.
> But how do you update a culture's shared set of beliefs?
It's not the place of AI models to do activism, and it's a slippery slope leading to an AI-based inquisition. Take a look at how China uses AI to oppress its own people.
Stories are compressed representations of complex spatiotemporal patterns. We use stories to make sense of the world and to share our insights with others. And if you think about it, stories are essentially containers for if-then relationships. So they're not as far removed from logical reasoning as you might imagine.
I don't understand why you'd find the use of the term 'parrot' offensive. Language models extract linguistic patterns. GPTs generate patterns based on those which they have been trained on. That's a process that can be described as parroting. If you find it offensive because you think it implies that the researchers coming up with these models aren't worthy of credit, I think you are reading something into it that isn't there. At least not from my perspective.
When I mentioned updating cultural beliefs, I was referring to the traditional way of going about it: through cultural products. My point was that the "PC crowd" would be better off if they relied on this strategy rather than attempting to halt the development of language models. I was absolutely not suggesting that language models should be used to "train" members of society. That's a dystopian nightmare.
> We use stories to make sense of the world and to share our insights with others.
True, as long as you empathize with the story, and empathy comes from feeling united. But the PC army abandoned empathy in favor of identity; they think they're on their own, fighting a war, a zero-sum game. Non-inclusive politics asking for empathy is ridiculous, the same as demanding tolerance for intolerance.
They teach a whole ideology of guilt in order to dehumanize their opponents and cut off empathy towards them. They can't complain.