2.4. Bayesian Pattern Recognition

The simplest state space is defined by the 2-state case: "The received email text is either spam or ham". In this minimal example, the previously mentioned evidence is a representation of the text in a machine-processable form. Such a representation of the evidence is commonly called a Feature Vector and is denoted by $x$. For the decision-making process, which will be called the Decision Rule, it is beneficial to have a measure of the quality of the different possible decisions, i.e. of the different possible states under the evidence represented by the feature vector. In Bayesian inference this measure is given by the joint probability of the state $s$ and the feature vector $x$:

\[ p(x, s) \]

which, by the definitions of probability theory, evaluates to a scalar value in the range $[0, 1]$. Furthermore, by the chain rule of probability, the joint probability can be factored into the two conditional forms that are essential for Bayes' theorem:

\[ p(x, s) = p(x \mid s)\, p(s) = p(s \mid x)\, p(x) \]

Under Bayes' theorem, the interpretation of the probabilities $p(x \mid s)$, $p(s)$, and $p(s \mid x)$ is indicated by their commonly used names. $p(s \mid x)$ is called the posterior probability, as it represents the belief in the state after the evidence $x$ has been observed. $p(s)$ is the prior probability and represents evidence-independent knowledge about how often the state $s$ will be observed. Finally, $p(x \mid s)$ is the state-conditional probability, i.e. the probability of observing the evidence $x$ under the assumption that the environment is in state $s$. $p(x)$, the probability of observing a specific piece of evidence, has not been given a dedicated name here, as it will turn out to be irrelevant for the sought Decision Rule.

The Decision Rule should obviously lead to a high-quality decision. This should therefore be the decision with the maximum joint probability.
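The relationship between the joint, prior, state-conditional, and posterior probabilities can be made concrete with a small numeric sketch of the spam/ham example. All probability values below are made-up illustration figures, not measured data; the feature vector $x$ is assumed to be a single fixed observation (e.g. "the text contains the word 'offer'"):

```python
# Prior p(s): evidence-independent knowledge of how often each state occurs.
# (Illustration values only.)
prior = {"spam": 0.4, "ham": 0.6}

# State-conditional probability p(x|s): probability of observing this
# particular feature vector x, assuming the environment is in state s.
likelihood = {"spam": 0.7, "ham": 0.1}

# Joint probability p(x, s) = p(x|s) * p(s)
joint = {s: likelihood[s] * prior[s] for s in prior}

# Evidence p(x): marginalize the joint probability over all states.
evidence = sum(joint.values())

# Posterior p(s|x) = p(x|s) * p(s) / p(x)  (Bayes' theorem)
posterior = {s: joint[s] / evidence for s in prior}

print(joint)      # {'spam': 0.28, 'ham': 0.06}
print(posterior)  # spam ≈ 0.8235, ham ≈ 0.1765
```

Note that the posterior values sum to one, while the joint values sum to $p(x)$; dividing by the evidence only rescales the numbers and does not change which state is the most probable.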
By this reasoning, the Bayes Decision Rule $r_{\text{bayes}}: x \mapsto s$ is defined as:

\[ r_{\text{bayes}}(x) = \operatorname*{argmax}_{s}\, p(x, s) \]

which factors according to the laws of conditional probability into:

\[ r_{\text{bayes}}(x) = \operatorname*{argmax}_{s} \left[\, p(s \mid x)\, p(x) \,\right] = \operatorname*{argmax}_{s}\, p(s \mid x), \]

since the $\operatorname{argmax}$ over $s$ is independent of $p(x)$. This is an intuitive result: a decision based on the posterior probability leads to the same result as one based on the joint probability. However, it is still unknown how to obtain the posterior $p(s \mid x)$. Therefore, Bayes' theorem is used again:

\[ p(s \mid x) = \frac{p(x \mid s)\, p(s)}{p(x)} \]

Inserting the factored posterior into the Decision Rule yields:

\[ r_{\text{bayes}}(x) = \operatorname*{argmax}_{s} \frac{p(x \mid s)\, p(s)}{p(x)} = \operatorname*{argmax}_{s} \left[\, p(x \mid s)\, p(s) \,\right], \]

again because the $\operatorname{argmax}$ over $s$ is independent of $p(x)$.
Diplomarbeit
Indoor Localization of Mobile Devices Based on Wi-Fi Signals Using Raytracing Supported Algorithms