Hacker News

This doesn't seem like the worst idea, but from a hardware perspective I'm not sure how an "AI PC" differs from machines with nice GPUs.


An AI PC will have an AI App Store, where Intel can reap something like a 30% tax on all developer profits.

Of course the hardware is the same, except for some features that allow Intel to lock it down hard.


GPUs are increasingly optimised for machine learning workloads, but they aren't really designed for it on an architectural level and still leave a lot of potential performance and efficiency on the table. Current ML implementations are still quite kludgy and path-dependent, with lots of technical debt. There is huge potential for optimisation throughout the stack.
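One concrete example of the stack-level optimisation being left on the table is kernel fusion: computing each layer step as a separate pass materialises full intermediate tensors in memory, while a fused kernel keeps the matmul result in registers through the bias-add and activation. NumPy can't actually fuse anything, so this is only a sketch of the idea with illustrative names; both functions compute the same result, and the difference in a real stack would be memory traffic, not output.

```python
import numpy as np

def unfused(x, w, b):
    # Each step reads and writes a full intermediate array:
    t = x @ w                  # pass 1: materialise matmul result
    t = t + b                  # pass 2: re-read, add bias, re-write
    return np.maximum(t, 0.0)  # pass 3: re-read, apply ReLU

def fused(x, w, b):
    # A fused kernel would do bias-add and ReLU while the matmul
    # tile is still on-chip; here we just emulate the single
    # expression for comparison.
    return np.maximum(x @ w + b, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w = rng.standard_normal((8, 3))
b = rng.standard_normal(3)
assert np.allclose(unfused(x, w, b), fused(x, w, b))
```

Frameworks like XLA and TorchInductor do this kind of fusion automatically, but how much of it survives down to the hardware depends heavily on the rest of the stack.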


I’d welcome correction, but my understanding is that NPUs are “native” neural weight processors that trade the flexibility of general graphics work for the efficiency of not having a software layer like CUDA between the neural net and the hardware.


They are different enough that you could certainly extract additional performance if you knew you were running LLaMa versus Fortnite.
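A hedged sketch of what knowing the workload buys you: LLM inference is dominated by matmuls over fixed, pre-quantized weights, so hardware can trade float graphics paths for cheap integer multiply-accumulate units, which a game engine couldn't use. The example below shows weight-only, per-tensor symmetric int8 quantization; the function names and tolerances are illustrative, not any particular vendor's scheme.

```python
import numpy as np

def quantize_int8(w):
    # Per-tensor symmetric quantization: scale so the largest
    # magnitude maps to 127, then round to int8.
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(1)
w = rng.standard_normal((8, 4)).astype(np.float32)  # fixed LLM-style weights
x = rng.standard_normal((2, 8)).astype(np.float32)  # activations

q, scale = quantize_int8(w)
# Multiply against the int8 weights (widened to avoid overflow),
# then dequantize once at the end with a single scale.
approx = (x @ q.astype(np.int32)) * scale
exact = x @ w
# Quantization error stays small relative to the exact result.
assert np.allclose(approx, exact, atol=0.2)
```

Nothing about Fortnite's shader workload has this shape, which is the sense in which an NPU can specialise where a GPU must stay general.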


You could say the same about gaming PCs (ok, and LEDs).

And, less obviously, about almost anything. So much is just marketing, including product design as a form of it.



