Hacker News | tl2do's comments

The word "personality" smuggles in biological assumptions. Asking "does this model have personality?" feels unproductive because the term implies something it can't be.

More useful framing: how do these subnetworks produce outputs that observers evaluate as personality-consistent? Personality isn't an internal property - it's a judgment made by people watching behavior.


I've been programming for 20 years and apparently still don't understand what my terminal is doing. Recently I asked Claude Code to generate a small shell script. It came back full of escape codes and I just stared at it like a caveman looking at a smartphone. This article finally explains what's been happening right under my nose.

I'm Japanese, and the 80286 at 10MHz was huge for Japan's PC-98 scene. The V30 handled backward compatibility while the 286 ran much faster than what we had before. This project brought back memories—the 286 was the chip of my era, and it's great to see people still exploring its capabilities decades later.

Japanese PC-98 games have an aesthetic that’s so unique. There’s this one Twitter account that posts images from visual novels from that era and they all look so pretty: https://x.com/PC98_bot Also on bluesky https://bsky.app/profile/pc98bot.gang-fight.com

Since generative AI exploded, it's all anyone talks about. But traditional ML still covers a vast space in real-world production systems. I don't need this tool right now, but glad to see work in this area.

A nice way to use traditional ML models today is to do feature extraction with an LLM and classification on top with a traditional ML model. Why? Because this way you can tune your own decision boundary and piggyback on features from a generic LLM to power the classifier.

For example, CV triage: you use an LLM with a rubric to extract features (choosing the features you are going to rely on does a lot of the work here). Then collect a few hundred examples, label them (accept/reject), and train your traditional ML model on top; it will not have the LLM's biases.

You can probably use any LLM for feature preparation, and retrain the small model in seconds as new data is added. A coding agent can write its own small-model-as-a-tool on the fly and use it in the same session.
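A minimal sketch of that pipeline. The LLM step is only stubbed out here (`llm_extract_features` is a hypothetical placeholder for a rubric prompt of your choice); assume it has already turned each CV into numeric rubric scores. The small model on top is a hand-rolled logistic regression so the example stays dependency-free, and the threshold is the tunable decision boundary mentioned above:

```python
import math

def llm_extract_features(cv_text):
    """Placeholder for the LLM step: in practice you'd prompt a model with a
    rubric ("years of experience", "relevant stack", ...) and parse numeric
    scores out of its answer. Hypothetical helper, not implemented here."""
    raise NotImplementedError("call your LLM of choice here")

# Labeled examples: rubric scores in [0, 1] plus accept(1)/reject(0) labels.
X = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3], [0.7, 0.6], [0.3, 0.2]]
y = [1, 1, 0, 0, 1, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=500):
    # Plain SGD on logistic loss; retrains in well under a second.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x, threshold=0.5):
    # The threshold is your own decision boundary, independent of the LLM.
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= threshold

w, b = train(X, y)
print(predict(w, b, [0.85, 0.9]))  # strong rubric scores -> True
print(predict(w, b, [0.1, 0.2]))   # weak rubric scores -> False
```

Adding new labeled rows to `X`/`y` and rerunning `train` is the "retrain in seconds" loop described above.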


What do you mean by "feature extraction with an LLM"? I can see this for text-based data, but would you do that on numeric data? It seems like there are better tools you could use for auto-ML in that sphere?

Unless by LLM feature extraction you mean something like "have claude code write some preprocessing pipeline"?


Isn't the whole point for it to learn what features to extract?

I agree that expanding communication with strangers is important. But starting with "Do you mind if I sit here? Or did you want to be alone with your thoughts?" and then continuing a conversation for 10+ minutes is a real struggle for me. Sometimes I even wonder—how exactly does this kind of individual conversation actually help me? Maybe this is just me.

Yeah it'll be hard. But with a lot of practice it'll get easier. I think part of the practice is recognizing "they don't want me to continue this conversation" and bailing, vs trying to force every interaction to be a deeper conversation.

I never practiced "idle conversation with a complete stranger" like that because I was lazy. But I did practice making normal, non-sexual, conversation with women on dating sites and dates so that I could go from "isolated in school, then after going online, low response rate and never more than 1 or 2 dates" to someone in a long-term relationship. And recognizing that sort of "ok there's just not any interest here, move along" signal was definitely relevant there too.

Skills take investment.

My parents didn't give me nearly as many opportunities to practice these skills as they had when they grew up, and pop culture actively encouraged me not to talk to strangers as a kid, so I had to work harder at them as an adult. But it was worth it.


Is it a matter of skill, or a matter of courage?

It's a matter of privilege. Many people don't have time to try to make new connections.

>how exactly does this kind of individual conversation actually help me?

It doesn't. It just helps the speaker.


That makes me think—why do I enjoy conversations with friends then? What's really the difference between a friend and a stranger? Friends annoy me too, maybe even more often than strangers do.

Your friends are hopefully somewhat invested in you for a non-transactional reason, and have proven to be a non-threat. There's no guarantees with a stranger.

What a bizarre perspective. Have you never gotten any personal value out of a single conversation in your entire life? Have you never made a friend? I don't understand this "all conversations are bad and useless" nonsense. What on earth do you think you're doing on social media?

I'm not saying that all conversations are bad. I'm just talking about the one-hour, one-sided conversations described in the article.

One of the basic rules of starting conversations with other people is letting the other person do most of the talking. People like talking about themselves. So the old lady in the article violated that rule. That isn't to say that just talking to people instead of actually talking with them will never work. You might be lucky and the person you talk to just happens to be very interested in what you want to tell them, but it is rather unlikely.

Once you have shown that you are interested in someone by listening to them and thereby learning about them, you might sometimes find that they might also be interested in something you can share with them. The easiest way to get someone interested in you is to first get interested in them.

It's a pretty simple principle, but since people like to talk about themselves they often do not follow it.


Is there a compile-to-Python language with built-in type safety, similar to how TypeScript transpiles to JavaScript? I'm aware of Mojo and mypyc, but those compile to native code/binaries, not Python source.

Python does not need that, as it has built-in type annotation support. An annotation can be any expression, so you can in theory express anything a custom type-only language would allow you to (although a dedicated language could be less verbose and easier to read).

However, IMHO it just works much worse than TS because:

* many libraries still lack decent annotations

* other libraries are impossible to type because of too much dynamic stuff

* Python's semantics are multiple orders of magnitude more complex than JavaScript's. Even the simplest question: is `1` allowed for a parameter typed `float`? What about numpy's float64?
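To make the `float` question concrete, a quick sketch in plain CPython. Per PEP 484's numeric tower, a type checker accepts an int where `float` is annotated; and at runtime, annotations are just stored expressions that CPython does not enforce at all:

```python
def describe(x: float) -> str:
    return f"value = {x}"

# An int where `float` is annotated: type checkers accept this (PEP 484).
print(describe(1))       # value = 1

# A plainly wrong type: a checker would flag it, but it runs fine,
# because annotations are never enforced at runtime.
print(describe("oops"))  # value = oops

# Annotations are just data attached to the function object.
print(describe.__annotations__)  # {'x': <class 'float'>, 'return': <class 'str'>}
```

This is exactly the gap that external checkers (mypy, Pyrefly, etc.) fill: the errors exist only in the linting layer, not in the interpreter.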


Thanks for helping me understand. I wasn't aware of Python's type annotation support. I did some quick research and learned that type annotations don't cause compile errors even when there are type errors. Is that why type checkers like Pyrefly exist?

Correct, currently in Python the type checking is implemented more in a linting phase than in a compiling or runtime phase. Though you can also get it from editors that do LSP, they'll show you type errors while editing the code.

Thanks linsomniac and exyi. I didn't realize Python's type hints are checked by linters, not the compiler. Learned something today.

Yes, but there are also runtime type checkers that can be used to check that input data conforms to the expected types (aka a schema but defined using python types and classes).
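A hand-rolled sketch of that runtime-checking idea, using only the stdlib (real projects typically reach for a library such as pydantic or beartype; this decorator just shows the mechanism, and `make_user` is a made-up example function):

```python
from typing import get_type_hints

def check_types(func):
    """Validate keyword arguments against the function's annotations at call time."""
    hints = get_type_hints(func)

    def wrapper(**kwargs):  # keyword-only, to keep the sketch short
        for name, value in kwargs.items():
            expected = hints.get(name)
            if expected is not None and not isinstance(value, expected):
                raise TypeError(
                    f"{name} should be {expected.__name__}, got {type(value).__name__}"
                )
        return func(**kwargs)

    return wrapper

@check_types
def make_user(name: str, age: int) -> dict:
    return {"name": name, "age": age}

print(make_user(name="Ada", age=36))      # conforms to the "schema": ok
try:
    make_user(name="Ada", age="thirty")   # rejected at runtime
except TypeError as e:
    print(e)
```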

The only language I'm aware of that's a bit like that is RPython, but it's the other way round: a restricted subset of Python designed to be compilable. If you think about it, you get more benefit from the typed language being the base one, as the compiler or JIT can make more assumptions, producing faster code. TypeScript had no alternative but to do it the other way, since it's a lot harder to get things adopted into the browser than to ship them independently.

You can compile F# to Python with Fable https://github.com/fable-compiler/Fable.Python

In Japan, Omron developed early ATMs that looked similar to American and European machines. Though those early forms have changed significantly over time, Omron remains a top maker today (their ATM division later became a joint venture with Hitachi, so the Omron name is no longer used).

Unlike IBM, Omron specializes in ATM hardware, not bank internal systems. That difference in focus could have mattered.


For a hobbyist embedded developer like me, the adoption of RISC-V in the ESP series is big news. In day-to-day development, instruction sets are often abstracted away by the compiler, but I appreciate open specifications and architectures. This makes me particularly interested in how an emulator like Emuko could facilitate evaluating code without the slow process of repeatedly burning it to ROM. I'm keen to see reports of its application in actual ESP32 development.

Or you can write code that runs directly on x86; e.g., FreeRTOS supports that without issues. For peripheral drivers you will need to burn it onto the chip regardless, because an emulator can rarely emulate peripherals in any reasonable way.

So if you correctly abstract business logic from peripheral code, you can do most of your development without ever uploading to target.
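That abstraction might look like the following, sketched in Python for brevity (on a real target you'd typically do the same thing in C with a thin HAL header and two implementations, or a struct of function pointers). Everything here is illustrative: `FakeLed` is a host-side test double, and the on-target driver it stands in for is hypothetical:

```python
class Led:
    """Interface the business logic depends on; no vendor SDK, no registers."""
    def set(self, on: bool) -> None:
        raise NotImplementedError

class FakeLed(Led):
    """Host-side test double that records every state change."""
    def __init__(self):
        self.history = []

    def set(self, on: bool) -> None:
        self.history.append(on)

def blink(led: Led, times: int) -> None:
    # Pure business logic: runs identically on the host and on the target,
    # because it only talks to the Led interface.
    for _ in range(times):
        led.set(True)
        led.set(False)

led = FakeLed()
blink(led, 2)
print(led.history)  # [True, False, True, False]
```

Only the concrete driver behind the interface ever needs to be flashed and verified on real hardware.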


That's a solid approach, and for high-level logic, it's definitely the way to go.

I find that a lot of my development time is actually spent on lower-level tasks—like writing custom string operations—since we don't have the rich standard libraries of a host environment.

This is exactly where an emulator really shines for me. It enables a "device-less" workflow where I can work through those low-level details on a sofa at a cafe without needing to bring the physical hardware along just to verify the behavior.


RISC-V is supported on QEMU. The available devices don't have a ton of peripherals compared to aarch64, but it exists. Even FreeRTOS has a QEMU virt port for RISC-V. And if you have unit tests QEMU could easily run those accurately.

I am a native Japanese speaker.

Original (kanji and hiragana): おほけなき床の錦や散り紅葉

How it sounds: Ohokenaki toko no nishiki ya chiri momiji


>> I really wish they'd cite the original Japanese.

Given the Japanese above, Google Translate can do text-to-speech [1], and Google's AI Mode [3] and Bing/Copilot [2][4] can give multiple translations with notes.

But finding that Japanese, given only TFA's description? I only saw AI Mode manage that, not vanilla search. Perhaps using the author's Japanese Wikipedia page [5], or perhaps here, or?

[1] https://translate.google.com/?sl=auto&tl=en&text=%E3%81%8A%E... [2] https://copilot.microsoft.com/shares/JcJSRgDDvT84M16x7RJDb [3] https://share.google/aimode/FNEXZGRPFPANlvNwd [4] https://copilot.microsoft.com/shares/wjaWnGHNpGs18X4M6CJV6 [5] https://ja.wikipedia.org/wiki/%E8%99%9A%E7%99%BD


In which case the "crimson carpet" appears to be a loose invention of the translator. The original just says "brocade" or, I guess, "quilt", implying some sort of silk bed cover?

Try an image search with 紅葉 落葉. The result will be the typical image a Japanese person imagines when hearing 散り紅葉. Then try the same search with "crimson carpet." From the standpoint of literary and artistic sensibility, the difference is not small.

my imagination when reading 散り紅葉 is an enka song lol

You hit the point. The imagery from enka will be nearly identical to that of the poem.

Oh ho ke na ki?

That is the modern Japanese pronunciation. In classical literature, おほ is pronounced as a prolonged "o" (an elongated /oː/ sound).

Does that apply to longer vowels with the same(ish) sound, as in 因果応報?

Yes and no. 報 is a contemporary word. It is pronounced "ho-o", not "ho", not "o-o", and not "o". Some read it as "bo-o".

As a native Japanese speaker, I'm happy to see our literature introduced to other countries. But I also feel conflicted.

The original Japanese of the first poem is:

おほけなき床の錦や散り紅葉

The translation on the site:

> I am not worthy

> of this crimson carpet:

> autumn maple leaves.

This contains the translator's interpretation, and the sound and intonation are completely lost. I admire the translator's effort, but I want visitors to understand how much this differs from the original.


This is the general problem with literature and poetry especially. They're not entirely translatable.

- Languages are part of culture and they are historically conditioned, making them necessarily bounded and finite [0]. While the essential thing signified may be the same for corresponding words in two languages (snow vs. Schnee), there is variance in semantic emphasis, connotation, and symbolic significance. In other words, the pragmatic aspect of language is highly contextual and conditioned.

- Words can be used univocally, equivocally, or analogically, and there isn't necessarily a correspondence between these constellations across any two languages. But so much of wordplay trades on such constellations.

- The syntactic and phonetic features peculiar to a language - apart from what is signified per se - are heavily exploited by poetry.

[0] This reminds me of words like the Greek λόγος (logos), which does not find a satisfactory counterpart in any language as far as I can tell. (Approximations are Tao, Ṛta, or Ma'at, for instance.) You see this difficulty in the translation of John 1 where it is usually rendered verbum or word, which have their own perfections, but fail to do justice to the richness of the original meaning of Logos in passages like John 1:1 and 1:3: "In the beginning was the Word, and the Word was with God, and the Word was God. [...] All things were made through him, and without him was not any thing made that was made." When you substitute "Word" with "Logos", you can clearly see how much more pregnant that message is, e.g., that, contrary to the pagan mythology of those John was addressing, in the beginning there was order, not chaos; that God is Reason; that everything that exists is caused by God and therefore fundamentally intelligible. (Curiously, the Latin Verbum is better than the Greek at emphasizing the procession of divine Reason as Second Person from the First Person in the Trinity.)


True, but there is no obstacle in the way of showing the source. Especially considering how concise Japanese is. Best of both worlds. Fascinating discussion in this whole thread.

By "procession", do you allude to the filioque clause? Agreed on the difficulty of translation; I follow Quine, so I think a language as a whole is the unit of meaning, as opposed to any specific granular element.

> By "procession", do you allude to the filioque clause?

The filioque is about the procession of the Holy Spirit from the Father and the Son, not the Son from the Father.


I feel like trying to replicate the meter in English is a silly constraint

I would prefer to know how each line would be best interpreted if it weren't a haiku


I am not a literature lover, but I found a modern-language interpretation of the poem. Many interpretations are possible, but I feel this one is relevant. I translated it into English.

============

おほけなき床の錦や散り紅葉 "Ohokenaki toko no nishiki ya chiri momiji" is interpreted as a haiku-like expression of introspection and refined aesthetic sensibility — one in which the speaker, surrounded by undeserved honor (ohokenaki) and luxurious living (toko no nishiki = sumptuous furnishings), gazes upon the fleeting falling autumn leaves and reflects on their own vanity and attachment to life.

Key points of interpretation:

おほけなき (ohokenaki, 身の程知らず/畏れ多い): refers to a luxurious situation or standing that exceeds one's true worth or station, something almost presumptuous to possess.

床の錦 (toko no nishiki): literally, a beautifully brocaded floor covering; a symbol of opulence. By extension, it evokes the sight of vivid autumn leaves carpeting the ground, the splendor of autumn (nishiki-aki) likened to a gorgeous spread of fabric.

散り紅葉 (chiri momiji): falling, scattering autumn leaves, a classic symbol of impermanence and the Buddhist sense of transience (mujo).

Overall picture: the speaker finds themselves in lavish surroundings that feel undeserved (ohokenaki), while the scattering leaves adorn that world with a beauty that is at once gorgeous and hollow, a quiet contrast between humility and the ephemeral.

Even amid a life of splendor, the sight of leaves falling reveals a universal truth — that all things must eventually end. The poem captures a mood that is gently melancholic yet elevated: savoring that beauty from a place of quiet, dignified acceptance.


Thank you for this. I'm just realizing that the pronoun "I" has no place in most haiku.

Sound and intonation are never going to translate between Japanese and English. It's not even on the table.

Such things can't even necessarily translate well between two languages as similar as French and English. Japanese and English is completely hopeless.

It's true in the other direction too, though this being an English site it might be more easily neglected. I've seen some English songs translated into Japanese, keeping the same syllable count scheme. The Japanese is radically simplified compared to the English, with entire adverbs, adjectives, even clauses removed. And that's even before we ask whether Japanese necessarily has the correct words to translate some of the richer English concepts with their own centuries of history and connotation behind them that these songs contained.

It is what it is. There isn't much that can be done about it. Even if someone made an exhaustive translation of something, it could never be repacked into something that matches the original concise packing.


As a native speaker, would there be any way you could translate this one poem back into Japanese? I am curious what the original would be, and whether the translation was truly accurate. It was my favorite one from the article:

RAIZAN (来山) Died on the 3rd day of the 10th month, 1716 at the age of 63

Farewell, sire—

like snow, from water come

to water gone.


