Common Activation Functions

easy · Machine Learning

Implement the building-block activation and loss functions used across deep learning: sigmoid, softmax (numerically stable), ReLU, and binary cross-entropy. These appear in nearly every from-scratch ML interview question.
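A minimal NumPy sketch of the four functions, assuming array inputs; the function names and signatures below are illustrative choices, not part of the problem statement. The sigmoid is computed via `exp(-|x|)` so the exponent is never positive, softmax subtracts the row max before exponentiating, and binary cross-entropy clips predictions to avoid `log(0)`:

```python
import numpy as np

def sigmoid(x):
    # Stable form: exponent is always <= 0, so exp never overflows.
    z = np.exp(-np.abs(x))
    return np.where(x >= 0, 1.0 / (1.0 + z), z / (1.0 + z))

def softmax(x, axis=-1):
    # Subtracting the max leaves the result unchanged but keeps exp bounded.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)

def relu(x):
    # Elementwise max(0, x).
    return np.maximum(0, x)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 so log() stays finite.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
```

The stability tricks are the part interviewers usually probe: a naive `1 / (1 + exp(-x))` overflows for large negative `x`, and a naive softmax overflows for large logits even though the final ratio is well-defined.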
