Hacker News | poopicus's comments

As someone who has difficulty in detecting irony, could you explain the irony in this statement?


2024 is the current year and it's the same as the number of lines of code. I don't think describing it as ironic is correct though.


It's about as ironic as rain on your wedding day.


> It's about as ironic as rain on your wedding day.

Ah!!! It's the Alanis Morissette meaning of irony, not the dictionary one!


That certainly is gregarious!


Never got that. I was bad at literature. Thanks.


The other response is correct that this is not ironic. Roughly speaking, irony is when something happens that is the opposite of what you'd expect. A firefighter's home burning down is ironic. Sometimes irony is related to unfortunate or funny coincidences/timing, and it's easy to confuse the two. Alanis's song Ironic famously has a lot of examples of this. Rain on your wedding day--is that ironic? Maybe? You certainly hope there is no rain on your wedding day, but I don't think there's an expectation that there won't be rain. Now if your parents decided to get a divorce on your wedding day, I think that's ironic.

But the parent commenter dilutes the definition further. A project with 2024 lines of code in 2024 is just an amusing coincidence. There's no reason why you'd expect a project in 2024 to not have 2024 lines of code.


If you're wondering how he did it, Brad wrote a tool called 'gitbrute'[1] that (as the name suggests) brute-forces the prefix of the hash to whatever you like.

[1]: https://github.com/bradfitz/gitbrute


This is a fairly popular thing to do with bitcoin addresses as well; keep generating keys until you get one with a recognizable prefix in the address.


Tor hidden services also do this. (The address of a hidden service is a hash.)


Not just for vanity, either - it makes phishing a hidden service harder: if users know the .onion for Agora starts with 'agora', then a phisher has to invest weeks of compute time just to get a plausible .onion to start his phish with, rather than generate any old .onion in a millisecond.


Neat. I remember Stripe did something similar with their CTF challenge.


Yep! There were a bunch of us who wrote GPU "miners" (mostly OpenCL). I was intending to open-source mine at one point but never got around to it.

At the end of the challenge, the network was hashing ten digits (i.e. a 0000000000 prefix) in just a few seconds. Here's one of the rounds: https://github.com/pushrax/round660/commits/master


I wrote this a little over a year ago, only (for the lulz) in Node rather than Go:

https://github.com/stuartpb/lhc

Mine allows using a custom word list in the commit message for the nonce.

Judging by this commit, I'm guessing gitbrute uses minuscule variations in the commit time instead.

EDIT: yep: https://github.com/bradfitz/gitbrute/blob/master/gitbrute.go


How do you know they shipped your code?


Because it was shipped about 48 hours after I wrote it, and I see details in it that are specific to my implementation.

In fact, if you use their iOS app (especially on more memory-constrained devices like a Retina iPad) you'll notice that the app reproducibly crashes on all screens. Except one.


"reproducibly crashes on all screens. Except one"

This reminds me of the story about a large telecom company in North America and one in China. They were able to show that their code was stolen because they had the same bug.


Exactly. My product was stolen by a competitor. In court I was able to show their product had the same bug as mine. That was the start of their downfall. They went bankrupt.


You could reverse engineer their app and see if it matches your implementation. Super easy if it's an HTML app. Not too hard if it's Objective-C, since it's very verbose, even with symbols stripped.


I would argue that as a PR move, it's pretty brilliant. It's been brought up countless times since it was announced, and it's generated plenty of hype up until now. Sure, it was basically a sure bet for Buffett. But it spawned a community in itself[1] and captured the imagination of many. I'd call that a good move.

[1] http://www.takebuffettsbillion.com/


He did say he was wrong. He was simply stating he was surprised he didn't realise sooner, given how he had already identified a similar method of forgery in an earlier affair.


No, he did not. Unless you're 5 years old, "Duh, I knew that" is not the same thing as admitting fault:

"Duh, it's fake. Obviously, it's possible to set your system clock back. I was so intrigued I wasn't thinking clearly, even after I pointed this very "attack" out during the Ed Snowden GPG affair. Sorry folks, maybe if it were timestamped in the blockchain."

He played it off and said the real answer is obvious. Then he provided an excuse for not recognizing this obvious truth. Finally, to make matters super strange, he informed us (w/o evidence) that he was the one who pointed this "attack" out to us during the Snowden affair.


You're reading something very different into jnbiche's words than I am. I'm reading it more like this:

"[Ooooooh,] Duh[!], it's fake[!] [forehead smack of realization and insight]."

You seem to be reading more of a dismissive "Duh, I knew that, why are you telling me what I already know?" into this, whereas I'm reading a much more realization-stricken tone.

"Obviously, it's possible to set your system clock back."

Thinking out loud through the realization and insight as it occurred.

"I was so intrigued I wasn't thinking clearly, even after I pointed this very "attack" out during the Ed Snowden GPG affair."

Explaining the circumstances of, and implicitly acknowledging and admitting to, a mistake or fault, continuing the revelation. An introspective tone, not a defensive one, not offered up as an "excuse".


It's okay for you to be wrong about something.


It's funny, as a kid I can remember thinking: "Right, everything has to go in the main function unless absolutely necessary, it will just be easier to read that way."

Nowadays, it's the exact opposite!


As a kid I wrote a few games using QBasic. I apparently didn't have a firm grasp on control flow, because every call to a subroutine ended with a call to another subroutine; sometimes to a subroutine that was already called earlier in the stack. You win the game by beating it before crashing with a stack overflow.


Nah, you just invented continuation-passing style. You just needed tail recursion too, and you would have been good to go ;)


My code changed in a similar way; however, I recently devolved.

I've mostly worked with higher level languages, and recently delved back into C. I had forgotten what a complete pain in the ass it is to pass complex data structures around between functions.

I'm a bit ashamed to admit my functions got pretty damn beefy real quick.


Don't worry, you can do this in C# or Python as well. Just structure your small, gradually growing proof-of-concept and soon-to-be-finished-product program correctly. No classes! Or at the very least a few monolithic ones, which cannot be instantiated. Use static methods whenever possible. I have never ever ever done this.

(I am getting ready to write an essay titled "Confessions of a Shitty Programmer").


Try C++, join the dark side! In addition to chocolate chip cookies, we promise C++11 move semantics for returned objects/values. The cookies/cake might be a lie. :-)


I had to create a small ADO class for advanced C++ last year.

It was... terrifying.


&structure ... ?


> &structure

AFAIK, that's only a legal function signature in C++.

To the point of the question: when I wrote that, I was specifically thinking of 2-dimensional arrays, not structs per se.


I meant passing a pointer rather than a by-ref call (which is indeed C++ only), i.e.:

    void myfunc(thing *stuff) { return; }

    int main(void) { thing stuff; myfunc(&stuff); }


Oh, totally agreed. What I had difficulty with, specifically, was passing around a struct that contained a multidimensional array of a size determined at runtime.

I'm sure the problems I had could be solved by either a full re-design (not really an option), or accomplished by someone with more C experience.


It shouldn't matter what the struct is holding; you're passing a pointer, which is always the same size.


I don't understand the reasoning about working less time overall. I do something similar with my lunch-breaks: I work through them, and go home an hour earlier at the end of the day. It doesn't mean I've worked less time, just that I've moved the lunch hour to the end of the day. I still work 7.5 hours a day.


This is incredibly user-hostile, with lines like "you should probably practice a little before embarrassing yourself again" and "It should not take 4.18s to solve that next time."

Is that part of discouraging procrastination or what? As it stands, I couldn't show my sisters this without them getting upset with me.


Damn, I tried searching up some rap lyrics that I knew worked a couple of days ago, no dice. How long does this penalisation last?


You would have to make a huge number of assumptions to draw the conclusion that this guy is a psychopath or otherwise a no-lifer. So what if he continues working after dinner? Does he say what he works on? It could be anything, personal projects, writing..

I know that I split my post-work time into my various projects, and I wouldn't personally call it 'work'. I understand that others believe that you should take the time to watch television or play games or otherwise unwind. For me, working on my projects lets me unwind. I'm not sure what people mean by 'balance', when I consider my own projects the 'life' part of work-life.

This guy gets up early, keeps himself fit, works hard, keeps up the social side of things (sure, he might talk about work over lunch, but he doesn't say he doesn't just shoot the shit too), and then gets on with stuff he enjoys. This doesn't scream Patrick Bateman to me at all.

