Naive Bayes
medium · Machine Learning

Implement a Gaussian Naive Bayes classifier from scratch. For each class, fit the mean and variance of every feature. Classify by computing the log posterior using the Gaussian likelihood and the class prior, then return the class with the highest posterior.
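Under the naive (conditional independence) assumption, the log posterior for class c decomposes into the log prior plus a sum of per-feature Gaussian log densities; the classifier returns the argmax over classes:

```latex
\log p(c \mid x) \;\propto\; \log p(c)
  + \sum_{j=1}^{d} \left[
      -\tfrac{1}{2}\log\!\left(2\pi\sigma_{c,j}^{2}\right)
      - \frac{(x_j - \mu_{c,j})^{2}}{2\sigma_{c,j}^{2}}
    \right]
```

Here \mu_{c,j} and \sigma_{c,j}^2 are the per-class, per-feature mean and variance fitted from the training data, matching `self.means[c]` and `self.vars[c]` in the solution below.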
import numpy as np

class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.means = {}
        self.vars = {}
        self.priors = {}
        for c in self.classes:
            Xc = X[y == c]
            self.means[c] = Xc.mean(axis=0)
            # Small epsilon guards against zero variance for constant features
            self.vars[c] = Xc.var(axis=0) + 1e-9
            self.priors[c] = len(Xc) / len(X)

    def _log_likelihood(self, x, c):
        mean = self.means[c]
        var = self.vars[c]
        # Gaussian log density, summed over features (naive independence assumption)
        log_prob = -0.5 * np.sum(np.log(2 * np.pi * var))
        log_prob -= 0.5 * np.sum((x - mean) ** 2 / var)
        return log_prob

    def predict(self, X):
        predictions = []
        for x in X:
            posteriors = []
            for c in self.classes:
                prior = np.log(self.priors[c])
                likelihood = self._log_likelihood(x, c)
                posteriors.append(prior + likelihood)
            # Pick the class with the highest log posterior
            predictions.append(self.classes[np.argmax(posteriors)])
        return np.array(predictions)
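A minimal sketch of how the classifier might be exercised on toy data — the class definition is repeated (in condensed form) so the snippet runs standalone, and the two-cluster dataset is an illustrative assumption, not part of the problem:

```python
import numpy as np

# Condensed copy of the GaussianNB above so this snippet is self-contained.
class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.means, self.vars, self.priors = {}, {}, {}
        for c in self.classes:
            Xc = X[y == c]
            self.means[c] = Xc.mean(axis=0)
            self.vars[c] = Xc.var(axis=0) + 1e-9  # avoid division by zero
            self.priors[c] = len(Xc) / len(X)

    def _log_likelihood(self, x, c):
        mean, var = self.means[c], self.vars[c]
        return (-0.5 * np.sum(np.log(2 * np.pi * var))
                - 0.5 * np.sum((x - mean) ** 2 / var))

    def predict(self, X):
        preds = []
        for x in X:
            posteriors = [np.log(self.priors[c]) + self._log_likelihood(x, c)
                          for c in self.classes]
            preds.append(self.classes[np.argmax(posteriors)])
        return np.array(preds)

# Two well-separated 2-D clusters: class 0 near (0, 0), class 1 near (5, 5).
rng = np.random.default_rng(0)
X0 = rng.normal(loc=0.0, scale=1.0, size=(50, 2))
X1 = rng.normal(loc=5.0, scale=1.0, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

clf = GaussianNB()
clf.fit(X, y)
print(clf.predict(np.array([[0.0, 0.0], [5.0, 5.0]])))  # expect class 0 then class 1
```

Because the clusters are far apart relative to their unit variance, the fitted model should recover the labels of points near either cluster center.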