Likelihoods in both linear regression model and naive Bayesian model

The model in linear regression is $y = \omega^T x + e$, where $x$, $y$, $e$ represent the feature, the target and the noise, respectively.

$p(y \mid x, \omega)$ is often called the likelihood of the linear regression model.
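To make this concrete, here is a minimal sketch (not from the original post) that evaluates $p(y \mid x, \omega)$ under the common assumption that the noise $e$ is Gaussian with variance $\sigma^2$, so $p(y \mid x, \omega) = \mathcal{N}(y;\, \omega^T x,\, \sigma^2)$. The function name and toy data are illustrative:

```python
import numpy as np

def gaussian_log_likelihood(y, X, omega, sigma=1.0):
    """Log of p(y | X, omega) assuming y = X @ omega + Gaussian noise."""
    mu = X @ omega                      # predicted mean omega^T x per row
    resid = y - mu
    n = len(y)
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(resid**2) / sigma**2)

X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])  # toy features
omega = np.array([0.5, 1.0])                         # "true" weights
y = X @ omega                                        # noiseless targets

# The log-likelihood is maximized at the true omega, since residuals vanish.
print(gaussian_log_likelihood(y, X, omega))
print(gaussian_log_likelihood(y, X, omega + 0.5))    # strictly lower
```

Maximizing this likelihood over $\omega$ is exactly what least-squares fitting does, which is why the Gaussian noise assumption and the squared-error loss go together.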

In the naive Bayesian algorithm, p(x|y) is called the likelihood function.

These two likelihood functions are confusing, because the roles of x and y are reversed between them. How should these two likelihoods be interpreted? Thanks.

Answer

In the logistic regression model, p(y|x, omega) is termed the conditional likelihood function, in which omega is the parameter.

In the naive Bayesian classifier, p(x|y) is the likelihood function of the features, because here the label y is the quantity we vary. In naive Bayes, we focus on how the features are distributed under each class.
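The point above can be sketched in code (an illustrative example, not from the original answer): in a Gaussian naive Bayes model, p(x|y=c) is evaluated per class, with each feature modeled independently. The helper names and toy data are assumptions for the sketch:

```python
import numpy as np

def fit_class_stats(X, y):
    """Per-class mean and variance of each feature (naive independence)."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9)  # avoid /0
    return stats

def log_likelihood(x, mean, var):
    """log p(x | y=c) = sum over features j of log N(x_j; mean_j, var_j)."""
    return np.sum(-0.5 * np.log(2 * np.pi * var)
                  - 0.5 * (x - mean)**2 / var)

X = np.array([[1.0, 2.0], [1.1, 1.9], [5.0, 6.0], [5.2, 5.8]])
y = np.array([0, 0, 1, 1])
stats = fit_class_stats(X, y)

x_new = np.array([1.05, 2.0])       # near class 0's feature distribution
for c, (m, v) in stats.items():
    print(c, log_likelihood(x_new, m, v))
```

Note the reversal the question asks about: here we hold the observed features x fixed and compare p(x|y=c) across candidate labels c, whereas in regression we hold x fixed and ask how probable the observed target y is under the parameters.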




