Hacker News | Peritract's comments

Not being punished for something doesn't mean it isn't a crime, and doesn't mean it isn't wrong.

Children have a more developed sense of ethics than that.


> Majority of AI text, music, images, videos and code is indistinguishable and you use it every day.

I really don't think this is true. If it were, we'd be able to point to countless examples of things assumed not to be AI that actually were, but there's a dearth of such examples.


Examples _are_ countless. Look around you: it's simply indistinguishable. Video isn't quite there yet (the biggest telltale is how short the clips are), but we are very, very close.

> Examples are countless

Then you should be able to name at least one.


It's worth emphasizing here that you haven't changed it to iambic pentameter with an easy button. Your A lines are in it, but the B lines butcher it.

> There's also the point that LLMs can give you explicit control over features like reading age, social register, metaphor frames/ themes/imagery, sentence structure, grammatical uniqueness, rhythmic variation, and other linguistic markers.

You already have this. Control over your writing is the default position.


Not stupid, but I think it's fair to say "careless about/unaware of the wider impact of their work".

What do you mean by wider impact? Model collapse would be the opposite of a wider impact: it's an immediate impact, and I'm fairly sure the people training these models have good incentives to avoid that.

E.g. by filtering data, procuring better data, or applying techniques for making do with more limited data (we used to have a lot of those, and they're still known); you can also adapt the training process to be less vulnerable to model collapse. Just because some researchers have shown this happened for the models they tested doesn't mean it has to be universal.


You could, you'd just license them at creation time for X years. It would stop large corporations hoarding everything.

A licensing agreement is not work for hire: work for hire means the person doing the hiring owns the copyright, not the person who did the work.

That's how it works now, but we're talking about changing it. That's the context of the conversation.

> You could have streamers watch unpopular (modern) movies with their audience. Or a youtuber could read a book to their viewers (listeners).

Why is this something that the government should promote?


It's just an example of culture being more varied than just Disney or Marvel.

> with Sora some of that ability democratized

No, it didn't; OpenAI retained control.

Saying Sora democratised video generation is like saying that landlords democratised home ownership.


OOP is great. "OOP is the one perfect paradigm for all coding" was the craze.

The answers to those questions have been clear for a while; it approaches concern trolling to keep on pretending to ask them in wide-eyed innocence.

Yes, revenge porn is very effective at causing harm, even though it can be generated.

No, because 'plausibly deniable' has never worked for social consequences and shame.

