Machine Learning Practice
For a given class variable \(y\) and dependent feature vector \(x_1\) through \(x_m\),
the naive conditional independence assumption is given by:

\[
P(x_i \mid y, x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_m) = P(x_i \mid y)
\]
Naive Bayes learners and classifiers can be extremely fast compared to more sophisticated methods.
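A standard consequence of this assumption, sketched here with Bayes' theorem in the notation above, is the classification rule every variant below shares:

\[
\hat{y} = \arg\max_{y} P(y) \prod_{i=1}^{m} P(x_i \mid y)
\]

The variants differ only in how they model the per-feature likelihoods \(P(x_i \mid y)\).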
ComplementNB
GaussianNB
BernoulliNB
CategoricalNB
MultinomialNB
GaussianNB
implements the Gaussian Naive Bayes algorithm for classification
from sklearn.naive_bayes import GaussianNB
gnb = GaussianNB()
gnb.fit(X_train, y_train)
Instantiate a GaussianNB estimator, then call its fit method with X_train and y_train.
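As a self-contained sketch of the same fit call (the iris dataset and split parameters are illustrative assumptions, not part of the original):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Illustrative dataset: iris has continuous features, which suits the
# Gaussian likelihood assumption of GaussianNB.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

gnb = GaussianNB()
gnb.fit(X_train, y_train)
print("test accuracy:", gnb.score(X_test, y_test))
```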
MultinomialNB
implements the naive Bayes algorithm for multinomially distributed data, and is one of the classic naive Bayes variants used in text classification
from sklearn.naive_bayes import MultinomialNB
mnb = MultinomialNB()
mnb.fit(X_train, y_train)
Instantiate a MultinomialNB estimator, then call its fit method with X_train and y_train.
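A minimal runnable sketch on text data (the toy corpus and labels are made up for illustration), using CountVectorizer to produce the word-count features MultinomialNB expects:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical toy corpus; real use would involve a much larger dataset.
docs = ["free prize money now", "meeting schedule attached",
        "win money prize", "project meeting notes"]
labels = ["spam", "ham", "spam", "ham"]

vec = CountVectorizer()
X_counts = vec.fit_transform(docs)  # sparse matrix of word counts

mnb = MultinomialNB()
mnb.fit(X_counts, labels)
print(mnb.predict(vec.transform(["free money prize"])))
```

Note that new documents must be transformed with the same fitted vectorizer so the count columns line up.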
ComplementNB
implements the complement naive Bayes (CNB) algorithm.
from sklearn.naive_bayes import ComplementNB
cnb = ComplementNB()
cnb.fit(X_train, y_train)
Instantiate a ComplementNB estimator, then call its fit method with X_train and y_train.
CNB regularly outperforms MNB (often by a considerable margin) on text classification tasks.
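Since CNB shares the same estimator API, it is essentially a drop-in replacement for MultinomialNB; a sketch on an imbalanced toy corpus (data made up for illustration, the setting where CNB is designed to help):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import ComplementNB

# Hypothetical, deliberately imbalanced corpus (3 spam vs. 1 ham);
# CNB estimates each class's weights from the complement of that class.
docs = ["free prize money now", "win money prize", "free win now",
        "meeting schedule attached"]
labels = ["spam", "spam", "spam", "ham"]

vec = CountVectorizer()
X_counts = vec.fit_transform(docs)

cnb = ComplementNB()
cnb.fit(X_counts, labels)
print("train accuracy:", cnb.score(X_counts, labels))
```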
BernoulliNB
implements the naive Bayes algorithm for data distributed according to multivariate Bernoulli distributions; each feature is assumed to be a binary-valued (Boolean) variable
from sklearn.naive_bayes import BernoulliNB
bnb = BernoulliNB()
bnb.fit(X_train, y_train)
Instantiate a BernoulliNB estimator, then call its fit method with X_train and y_train.
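A minimal sketch on synthetic binary features (the data-generating rule is an assumption for illustration), showing the 0/1 inputs BernoulliNB models:

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Synthetic binary features; the label is a simple OR of the first two
# columns, so those columns carry the signal and the rest are noise.
rng = np.random.RandomState(0)
X_bin = rng.randint(2, size=(100, 10))
y_bin = X_bin[:, 0] | X_bin[:, 1]

bnb = BernoulliNB()
bnb.fit(X_bin, y_bin)
print("train accuracy:", bnb.score(X_bin, y_bin))
```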
CategoricalNB
implements the categorical naive Bayes algorithm suitable for classification with discrete features that are categorically distributed
from sklearn.naive_bayes import CategoricalNB
canb = CategoricalNB()
canb.fit(X_train, y_train)
Instantiate a CategoricalNB estimator, then call its fit method with X_train and y_train.
CategoricalNB assumes that each feature, described by its index \(i\), has its own categorical distribution.
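A minimal sketch with made-up data, showing the integer-encoded categories (each column encoded as 0..n_categories-1) that CategoricalNB expects:

```python
import numpy as np
from sklearn.naive_bayes import CategoricalNB

# Hypothetical data: two categorical features, each integer-encoded.
# The pattern [0, 1] always co-occurs with class 0 in this toy set.
X_cat = np.array([[0, 1], [1, 2], [0, 1], [2, 0], [1, 2], [0, 1]])
y_cat = np.array([0, 1, 0, 1, 1, 0])

canb = CategoricalNB()
canb.fit(X_cat, y_cat)
print(canb.predict([[0, 1]]))
```

In practice, string-valued categories would first be mapped to such integer codes (e.g. with OrdinalEncoder).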