Common Activation Functions
easy · Machine Learning

Implement the building-block activation and loss functions used across deep learning: sigmoid, softmax (numerically stable), ReLU, and binary cross-entropy. These appear in nearly every from-scratch ML interview question.
import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real input into (0, 1).
    return 1 / (1 + np.exp(-z))

def softmax(z):
    # Subtract the row-wise max before exponentiating so large logits
    # cannot overflow; the shift cancels out and the result is identical.
    z = z - np.max(z, axis=-1, keepdims=True)
    exp = np.exp(z)
    return exp / exp.sum(axis=-1, keepdims=True)

def relu(z):
    # Elementwise max(0, z).
    return np.maximum(0, z)

def binary_cross_entropy(y_true, y_pred, eps: float = 1e-15):
    # Clip predictions away from exact 0 and 1 so np.log never returns -inf.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(
        y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)
    )