Realistically, most people barely get to multiple GPUs, let alone multiple machines. You're more likely to do hyperparameter tuning across machines before you do distributed training.
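To illustrate the contrast: hyperparameter search parallelizes trivially because each trial is an independent single-machine run, with no gradient synchronization across machines. The sketch below is only illustrative and assumes hypothetical names (`train_and_eval`, `MACHINE_RANK`, `NUM_MACHINES`) not mentioned in the comment.

```python
import itertools
import os

def train_and_eval(lr: float, batch_size: int) -> float:
    # Stand-in for a full single-machine training run; returns a dummy metric.
    return 1.0 / (lr * batch_size)

# Full grid of candidate hyperparameters to try.
grid = list(itertools.product([1e-4, 3e-4, 1e-3], [32, 64, 128]))

# Each machine takes its own strided slice of the grid by rank; results can
# later be collected (e.g. written to shared storage) and compared offline.
rank = int(os.environ.get("MACHINE_RANK", "0"))
world = int(os.environ.get("NUM_MACHINES", "1"))

for lr, bs in grid[rank::world]:
    score = train_and_eval(lr, bs)
    print(f"machine {rank}: lr={lr} bs={bs} -> {score:.4f}")
```

Because the trials never communicate, this kind of sweep needs none of the interconnect or framework machinery that true distributed (data- or model-parallel) training requires.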

