I enjoyed the exercise, thanks!
You’re welcome, and thank you for playing.
(I wrote a custom loss function for the NN)
I’m curious how you defined that. (i.e., was it “gradient = x for rows where predicted > actual, gradient = −8x for rows where actual > predicted”, or something finickier?)
I had to use the Keras backend’s switch function for the automatic differentiation to work, but basically yes.
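A minimal sketch of what such a loss could look like, assuming the asymmetry described above (penalizing underprediction 8× more than overprediction); the function name and details are illustrative, not the commenter’s actual code:

```python
from tensorflow.keras import backend as K

def asymmetric_loss(y_true, y_pred):
    # Piecewise-linear loss: overprediction costs 1x the error,
    # underprediction costs 8x. K.switch keeps both branches in the
    # graph so autodiff picks the right gradient element-wise.
    residual = y_pred - y_true
    return K.mean(
        K.switch(
            K.greater_equal(residual, 0.0),  # predicted > actual
            residual,                         # gradient w.r.t. y_pred: +1
            -8.0 * residual,                  # actual > predicted: -8
        ),
        axis=-1,
    )
```

Passed to model.compile(loss=asymmetric_loss), this works like any built-in loss. The backend switch is needed because a plain Python if can’t branch on a symbolic tensor.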