
Overtraining is bad by definition, like overcooking. But "don't overtrain" is about as useful a maxim as "don't overcook".

Naive Bayes has higher bias than logistic regression. Is that a good or a bad thing? Depends.

(I don't actually see any analogy with overtraining.)
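To make that "depends" concrete, here is a minimal NumPy sketch (all function names and the synthetic data are hypothetical, not from the comment): Gaussian naive Bayes assumes features are independent, so on correlated features it converges to a biased decision boundary, while logistic regression can learn the better boundary given enough data. Whether the extra bias hurts depends on how much data you have.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, rho=0.8):
    # Two Gaussian classes with correlated features (hypothetical setup).
    # The correlation violates naive Bayes's independence assumption.
    cov = np.array([[1.0, rho], [rho, 1.0]])
    X0 = rng.multivariate_normal([0.0, 0.0], cov, n // 2)
    X1 = rng.multivariate_normal([2.0, 0.0], cov, n // 2)
    X = np.vstack([X0, X1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    return X, y

def fit_gnb(X, y):
    # Gaussian naive Bayes: per-class means and *per-feature* variances,
    # i.e. a diagonal covariance -- this is where the bias comes from.
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        params[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(X))
    return params

def predict_gnb(params, X):
    scores = []
    for c in (0, 1):
        mu, var, prior = params[c]
        ll = -0.5 * (((X - mu) ** 2) / var + np.log(2 * np.pi * var)).sum(1)
        scores.append(ll + np.log(prior))
    return (scores[1] > scores[0]).astype(int)

def fit_logreg(X, y, lr=0.1, steps=3000):
    # Plain gradient descent on the logistic loss, no regularization.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_logreg(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

X_test, y_test = make_data(4000)
for n_train in (20, 2000):
    Xtr, ytr = make_data(n_train)
    acc_nb = (predict_gnb(fit_gnb(Xtr, ytr), X_test) == y_test).mean()
    acc_lr = (predict_logreg(fit_logreg(Xtr, ytr), X_test) == y_test).mean()
    print(f"n_train={n_train}: naive Bayes={acc_nb:.3f}  logistic={acc_lr:.3f}")
```

With a tiny training set the low-variance NB model can be competitive despite its bias; with plenty of data, logistic regression pulls ahead because NB's biased boundary never improves. Which model is "better" depends on where you sit on that curve.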


