In our "regular" programming scene the programmer knows all combinations of input/output and the program is "deterministic". If the programmer forgets about a combination, it leads to a bug. :)
If the input/output dataset is known exhaustively and is free of noise, then this learner overfits, memorizing every instance. This matches the "deterministic" use case I mentioned above. In other words, if you think like a tester and can enumerate every input/output instance, you never need to write the code explicitly: memorizing the examples *is* the program.
Of course, I am not talking about the efficiency of such an outcome here. ;)
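To make the idea concrete, here is a minimal sketch in Python of such a "memorizing learner". The class name and structure are my own illustration, not anything from a real library: "training" just stores every (input, output) pair, and "prediction" is a table lookup. On an exhaustive, noise-free dataset like XOR, it reproduces the target program exactly without a single line of explicit logic.

```python
class MemorizingLearner:
    """A learner that overfits by design: it memorizes all instances."""

    def __init__(self):
        self.table = {}

    def fit(self, examples):
        # "Training" is just storing every (input, output) pair.
        for x, y in examples:
            self.table[x] = y

    def predict(self, x):
        # Perfect recall on seen inputs; any input the "tester"
        # forgot to enumerate raises a KeyError -- the bug. :)
        return self.table[x]


# The exhaustive, noise-free dataset for XOR: all four combinations.
xor_examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

learner = MemorizingLearner()
learner.fit(xor_examples)

assert learner.predict((1, 0)) == 1  # behaves exactly like `a ^ b`
```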