Hacker News | lockhead's comments

This would certainly help with detecting legitimate bots, but as the origin you would still have the same problem as before: you still need to discern between "real" users and all the malicious traffic. The number of "good" bots is far smaller, and thanks to their good behavior and transparent data they are much easier to identify even without this kind of mechanism. So to make real use of this, users would also need to participate, and suddenly "privacy hell" would be too kind a name for it.


Totally agree; it's conceptually the same problem as robots.txt. As https://www.robotstxt.org/faq/blockjustbad.html puts it:

> But almost all bad robots ignore /robots.txt, making that pointless.
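For illustration, here is a minimal /robots.txt. The key point is that the protocol is purely advisory: a crawler only honors these rules if it chooses to, so bad bots simply ignore the file (the path below is a made-up example):

```
# Applies to every crawler that bothers to read this file
User-agent: *
# Politely ask crawlers to skip this section of the site
Disallow: /private/
```

Nothing enforces the `Disallow` line; compliance is entirely on the client side, which is exactly the weakness the FAQ describes.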


Taking this to its logical extreme, if it ended up getting used enough, then governments could be tempted to enforce its use.


Unfortunately, that doesn't sound extreme at all. Meanwhile, malicious traffic would simply carry on from day one with spoofed (or otherwise fraudulent) certs.


It's slower on the same hardware, but fine. Stay away if you need the UI, though: the Kibana fork is hellishly slow and riddled with bugs.


It’s slightly more complex than that. Both OpenSearch and Elasticsearch have workloads where they excel.

My company did a fairly comprehensive benchmark of the two products [0] if you are interested in comparing their performance.

[0] https://blog.trailofbits.com/2025/03/06/benchmarking-opensea...


Most likely passive DNS data: when you use your subdomain, you issue DNS queries for it. If the resolver you use to look up your domains shares that query data, others can pick it up.
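To make the leak concrete, here is a minimal sketch (all names and the `LoggingResolver` class are hypothetical) of how a resolver that logs queries accumulates a passive DNS dataset as a side effect of ordinary lookups:

```python
# Hypothetical sketch: a resolver that logs every query it serves.
# Anyone the resolver shares its logs with learns which names exist.
from datetime import datetime, timezone

class LoggingResolver:
    def __init__(self, zone):
        self.zone = zone      # name -> IP; stands in for real DNS resolution
        self.observed = {}    # passive DNS dataset: name -> first/last seen

    def resolve(self, name):
        now = datetime.now(timezone.utc)
        # Record the name the moment anyone asks about it.
        entry = self.observed.setdefault(name, {"first_seen": now})
        entry["last_seen"] = now
        return self.zone.get(name)

resolver = LoggingResolver({"secret.internal.example.com": "203.0.113.7"})
# A single routine lookup by the subdomain's owner...
resolver.resolve("secret.internal.example.com")
# ...and the "secret" name is now in the resolver's dataset.
print(sorted(resolver.observed))
```

The subdomain never has to be published anywhere; one query through a logging resolver is enough for it to show up in shared passive DNS feeds.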


I use myip.dk because it also shows your IPv6 address if you have one...

