Logistic Regression
medium · Machine Learning

Implement logistic regression for binary classification, trained with batch gradient descent on the binary cross-entropy loss. The model fits weights and a bias such that sigmoid(X @ w + b) approximates the probability of the positive class. Provide fit, predict_proba, and predict.
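The update rule used in training can be derived from the binary cross-entropy loss. Writing ŷ = σ(Xw + b), the gradients reduce to a simple residual form:

```latex
\mathcal{L}(w, b) = -\frac{1}{n}\sum_{i=1}^{n}\left[y_i \log \hat{y}_i + (1 - y_i)\log(1 - \hat{y}_i)\right]

\frac{\partial \mathcal{L}}{\partial w} = \frac{1}{n} X^{\top}(\hat{y} - y), \qquad
\frac{\partial \mathcal{L}}{\partial b} = \frac{1}{n}\sum_{i=1}^{n}(\hat{y}_i - y_i)
```

This is why the code below computes error = preds - y and never differentiates the sigmoid explicitly: the derivative of σ cancels against the log terms of the loss.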
import numpy as np


class LogisticRegression:
    def __init__(self, lr: float = 0.01, epochs: int = 1000):
        self.lr = lr
        self.epochs = epochs
        self.weights = None
        self.bias = 0.0

    def _sigmoid(self, z):
        # Clip z so np.exp cannot overflow for very large |z|.
        return 1 / (1 + np.exp(-np.clip(z, -500, 500)))

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.weights = np.zeros(n_features)
        for _ in range(self.epochs):
            z = X @ self.weights + self.bias
            preds = self._sigmoid(z)
            # Residual (preds - y) is the gradient of BCE w.r.t. z.
            error = preds - y
            self.weights -= self.lr * (X.T @ error) / n_samples
            self.bias -= self.lr * error.mean()

    def predict_proba(self, X):
        return self._sigmoid(X @ self.weights + self.bias)

    def predict(self, X):
        return (self.predict_proba(X) >= 0.5).astype(int)
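As a quick sanity check, the same batch update can be run standalone on a tiny separable dataset (the helper function and toy data here are illustrative, not part of the exercise):

```python
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=1000):
    # Batch gradient descent on binary cross-entropy,
    # using the same residual-based update as the class above.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-(X @ w + b)))
        error = preds - y
        w -= lr * (X.T @ error) / len(y)
        b -= lr * error.mean()
    return w, b

# Toy separable data, centered so the boundary sits at x = 0.
X = np.array([[-1.5], [-0.5], [0.5], [1.5]])
y = np.array([0, 0, 1, 1])
w, b = train_logistic(X, y)
probs = 1 / (1 + np.exp(-(X @ w + b)))
labels = (probs >= 0.5).astype(int)
print(labels)  # expected: [0 0 1 1]
```

On separable data the learned weight keeps growing with more epochs (the loss has no finite minimizer), but the predicted labels stabilize almost immediately.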