Hacker News | new | past | comments | ask | show | jobs | submit | solid_fuel's comments | login

Take a piece of paper, write two numbers on it, let me know when they start to reproduce.

The math isn’t the ink on the page.

This is such a weird comment.

> Since the times GPT-2 was reimplemented inside Minecraft - its quite obvious LLMs are just math.

This was obvious since LLMs were first invented. They published papers with all the details, you don't need to see something implemented in Minecraft to realize that it's just math. You could simply read the paper or the code and know for certain. [0]

> math is the only area of human knowledge with perfect flawless reductionism, straight to the roots

Incorrect: Kurt Gödel showed with his Incompleteness Theorems in 1931 [1] that it is impossible to find a complete and consistent set of axioms for mathematics. Math is not perfectly reducible, and there is no single set of "roots" for math.
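For reference, the first incompleteness theorem is usually stated along these lines (a standard paraphrase, not a quote from Gödel's paper):

$$\text{For any consistent, recursively axiomatizable theory } T \text{ extending Peano arithmetic,}$$
$$\exists\, G_T \text{ such that } T \nvdash G_T \text{ and } T \nvdash \neg G_T.$$

In other words, any candidate set of "roots" strong enough to do arithmetic leaves true statements it cannot prove.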

> It was build [sic] that way since the beginning,

This is a serious misunderstanding of what mathematics is. Math is discovered as much as it is built. No one sat down and planned out what we understand as modern mathematics - the math we know is the result of centuries of logical reasoning and exploration, from geometric proofs to calculus to linear algebra to everything else that encompasses modern mathematics.

> And because of that flawless reductionism, complexity adds nothings to the nature of math things, this is how math working by design

This sentence means nothing, because math is not reducible in that way.

> so it can be proven there are no anything like consciousness simply because conciousness [sic] was not implented [sic] in the first place, only perfect mimicry.

Even if the previous sentence held, this does not follow: while we are conscious, the current consensus is that LLMs are not, and most AI experts who are not actively selling a product recognize that LLMs will not lead to human-equivalent general intelligence. [3]

[0] https://github.com/openai/gpt-2

[1] https://en.wikipedia.org/wiki/G%C3%B6del's_incompleteness_th...

[2] https://www.cambridge.org/core/journals/think/article/mathem...

[3] https://deepmind.google/research/publications/231971/


The math used in LLMs is perfectly reducible, and Gödel has nothing to do with it - within the commonly used axioms (which are sufficient for LLMs to exist and are outside the scope of Gödel's theorems) there are ZERO questions/uncertainties about how it works, it's just a fact :)

Go on, name the axioms required for LLMs to work. I’ll wait. Obviously you are just talking out your ass about something you don’t understand.

> I am open to any (constructive) comments/suggestions

Here's one:

I think a senior sysadmin needs to sit you down in their office and have a very serious talk with you about the responsibility that comes with writing code other people run. I am serious. We used to have these talks with everyone who got sudo access. You shouldn't be shipping code if you don't understand the trust that is required of people in your position.

This isn't just about this "feature" being active when AI features are disabled; the way you mis-implemented it has resulted in it modifying the commit message without the user even seeing it! That is malicious behavior, not an innocent little feature "to make life easier".

I've fully switched off of VS Code to Kate now, which is faster and better behaved in most cases anyway. Bye.


To be fair, looks like a PM vibe coded it and this person “just” gave it an approval with no comments after an LLM review.

To be fair, that makes it worse for MS, not better.

This should not be vibe-coded by someone who has absolutely no idea about any of these things.


Oh yeah sorry, I was being sarcastic. I think it’s hilariously bad, but I also avoid MS products like the plague as a rule.

He makes more money than you and you're responding to his AI chatbot

Seethe


There’s more to life than money, child.

Eh, the main thing you would feel with this is latency, not bandwidth. Even on a 10 Mbps LAN, you would be able to open a file pretty quick, but over the internet latency is going to be > 100 ms in almost every case. That's a lot more painful.
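A back-of-envelope sketch of why latency dominates (the file size, round-trip count, and link speeds here are hypothetical illustrations, not measurements of any real protocol):

```python
# Rough model: opening a 1 MB file over a network share.
# Assume the protocol needs ~12 request/response round trips
# (open, stat, a few sequential reads) before the file is usable.

FILE_SIZE_BITS = 8 * 1_000_000  # 1 MB expressed in bits
ROUND_TRIPS = 12

def open_time(bandwidth_bps: float, rtt_s: float) -> float:
    """Total time = raw transfer time + time spent waiting on round trips."""
    transfer = FILE_SIZE_BITS / bandwidth_bps
    waiting = ROUND_TRIPS * rtt_s
    return transfer + waiting

# Old 10 Mbps LAN with ~1 ms RTT: bandwidth-bound, still quick.
lan = open_time(10e6, 0.001)   # ~0.81 s, almost all of it transfer
# Fast 1 Gbps internet link with ~100 ms RTT: latency-bound.
wan = open_time(1e9, 0.100)    # ~1.21 s, almost all of it waiting
```

Even with 100x the bandwidth, the WAN case is slower, because every round trip eats a full RTT regardless of link speed.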

Correct. Well, almost correct. We'll see how much uptake this service gets (if any), and we can probably place it really close to the edge - for now it's on an Oregon server only.

That said, this isn't too far from mechanical HDD latencies of the /real/ SCSI drives.


I can't even view the commits because GitHub is claiming I'm over a request rate limit. This is the first time I've even opened GitHub today. Time to knock another 9 off their status page.

> The same argument applies to open source itself. Why use someone's project when you can just have the robot write your own?

This is only a valid strategy if you either

a) understand the problem domain well enough to make a judgement call on what the LLM shits out.

or b) don't care about the correctness of the project.

Obviously, many software devs feel comfortable enough with CS problems to validate the LLM solution, but a flower shop owner does NOT know enough about accounting to vibe code a bookkeeping project, so for a shop owner an open source option - with many human contributors and actual production use elsewhere - would be a much better choice.


> verifying the correctness of LLM-generated code

It's... pretty clear in the original conversation.


I find that people who write "may I ask" are often/usually bad-faith arguers under cover of being polite.

That's a good rule of thumb, it seems that way more often than not.

> The problem is you can get the LLM to iterate until it compiles and lints and even passes LLM review

Worst of both worlds with this, if you're doing it in a github workflow. You wind up effectively paying for the testing/validation layer of someone else's irresponsible LLM use.


Jesus (who is also God), died so that God (who is also Jesus), wouldn't be mad at us - who He created - for sinning (breaking the rules that God (who is also Jesus) made).

He had to do this because although He is omnipotent and omniscient, He was unable to forgive us without literally pinning our sins on His Son Jesus (who is also God) and then letting us kill him for it.

It makes perfect sense and has no contradictions when someone teaches it to you every Sunday during your childhood.


Plus however many billions to bomb Iran and another 400 million for that stupid ballroom, the mad king's spending is out of control.
