
Anyone try 'becoming one with the computer' when programming?

It sometimes seems to compress and clarify the work of modelling and managing complexity.



Can you elaborate on what you mean by 'becoming one with the computer'? My understanding is that you mean we should put some assumptions aside and try to look at things from the CPU's point of view, which can be helpful for debugging, testing, design, etc. Am I right? edit: spelling mistake



