
I think it can be argued that the set of weights of a neural network is a program, but not “the preferred form of the work for making modifications to it”.

That would mean the training set and training algorithm (and random number generator seed) would have to be distributed under the GPL, too.
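To make that concrete, here is a minimal sketch (with a toy random walk standing in for SGD; the function names are hypothetical) of why the seed is part of the "source": given the same training procedure and the same seed, you reproduce the same weights, and with a different seed you get a different program.

```python
import random

def train(seed, steps=1000):
    """Toy 'training': a seeded random walk standing in for SGD updates.
    In a real framework you would have to seed every RNG in play
    (Python, NumPy, the framework, CUDA) and use deterministic kernels."""
    rng = random.Random(seed)
    weight = 0.0
    for _ in range(steps):
        weight += rng.gauss(0, 0.01)  # stand-in for one gradient step
    return weight

# Same seed -> bit-identical "weights"; different seed -> a different program.
assert train(42) == train(42)
assert train(42) != train(43)
```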



I don't think it is always feasible to make the learning process entirely deterministic, especially for distributed reinforcement learning.
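One concrete source of that nondeterminism, as a sketch: floating-point addition is not associative, so when distributed workers' gradients are reduced in whatever order they happen to arrive, the summed result can differ from run to run even with identical seeds.

```python
# Floating-point addition is not associative, so the order in which
# workers' contributions are reduced changes the result.
vals = [1e16, 1.0, -1e16, 1.0]

# One worker arrival order: ((1e16 + 1.0) + -1e16) + 1.0
left_to_right = ((vals[0] + vals[1]) + vals[2]) + vals[3]

# A different arrival order: (1e16 + -1e16) + (1.0 + 1.0)
reordered = (vals[0] + vals[2]) + (vals[1] + vals[3])

print(left_to_right, reordered)  # 1.0 vs 2.0 -- same inputs, different sums
```

The 1.0 is lost in the first ordering because it is smaller than the rounding unit of 1e16; tiny discrepancies like this compound over millions of gradient steps.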


Binaries weren't historically deterministic either (reproducible builds are a relatively recent effort). This isn't a problem.


Those nondeterministically built binaries still behaved the same, though. Two differently trained neural networks will have different outcomes; in this case, they'd pick different chess moves in some edge cases.


I’m fairly sure there were (and maybe still are) linkers that, given duplicate symbol names, would happily grab one of them without checking whether the definitions had the same code.

Add in parallel linking, and program behavior can change between builds.



