Bayesian Inference

Source: https://hub.wsu.edu/fernandovillanea/bayesian-inference/ (published 2018-10-30)

Bayesian inference is a statistical method based on Bayes' theorem, in which the probability of a hypothesis is updated based on prior evidence and a model created to explain the data (Konigsberg and Frankenberg 2013). In Bayesian inference, probability is treated as "conditional probability": the probability of an outcome given another outcome (Casella 2008; Puga et al. 2015b). At the core of Bayesian inference is Bayes' theorem (Puga et al. 2015a), in which the probability of a model M given the data D, written P(M|D), is calculated as follows:

    P(M|D) = P(D|M) × P(M) / P(D)        (Equation 1)

Here, P(D|M) is referred to as the likelihood; it describes the compatibility of the data with a model (specifically, it is the probability of the model M producing the data D). P(M) is the probability of the model M before the data D are observed, also known as the prior probability or simply a prior. A prior represents our degree of belief in the values that a parameter can take, and it modifies the likelihood to produce the probability of a model given the data, P(M|D), referred to as the posterior probability. Finally, P(D) represents the probability of the data.
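The update in Equation 1 can be sketched numerically. The following is a minimal illustration with made-up values (the coin-bias models, the data, and the priors are hypothetical, not from the text): two candidate models are scored by their binomial likelihoods, and Bayes' theorem converts equal prior beliefs into posterior probabilities.

```python
from math import comb

# Hypothetical example (not from the text): two models of a coin's bias.
# M1: fair coin (p = 0.5); M2: biased coin (p = 0.8).
# Data D: 7 heads observed in 10 flips.
heads, flips = 7, 10

def likelihood(p):
    # P(D|M): binomial probability of the observed number of heads
    return comb(flips, heads) * p**heads * (1 - p)**(flips - heads)

priors = {"M1": 0.5, "M2": 0.5}                       # P(M): equal prior belief
likes = {"M1": likelihood(0.5), "M2": likelihood(0.8)}  # P(D|M) for each model

# P(D): total probability of the data, summed over both models
p_data = sum(priors[m] * likes[m] for m in priors)

# Bayes' theorem (Equation 1): P(M|D) = P(D|M) * P(M) / P(D)
posterior = {m: likes[m] * priors[m] / p_data for m in priors}
print(posterior)
```

Note that P(D) here is just the prior-weighted sum of the likelihoods, which is what makes the posterior probabilities sum to one.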
Critically, when posterior probabilities are calculated using the same data, P(D) takes the same value in all of the calculations (the empirical data are the same for each); it is therefore a fixed scaling constant of P(M|D) and is often ignored:

    P(M|D) ∝ P(D|M) × P(M)        (Equation 2)

Model testing in Bayesian frameworks is relatively straightforward and is usually performed using Bayes factors. A Bayes factor is the ratio of the posterior odds of two hypotheses (i.e., the odds of model M1 over model M2 after the data are observed) to their prior odds (Kass and Raftery 1995).
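This odds-ratio definition can be sketched with hypothetical numbers (the likelihood and prior values below are invented for illustration). Because P(D) appears in both posteriors, it cancels in the odds ratio, so the unnormalized products of Equation 2 are all that is needed:

```python
# Hypothetical likelihoods and priors for two models (illustrative values only).
lik_m1, lik_m2 = 0.12, 0.20      # P(D|M1), P(D|M2)
prior_m1, prior_m2 = 0.5, 0.5    # P(M1), P(M2)

# Unnormalized posteriors (Equation 2): P(M|D) is proportional to P(D|M) * P(M).
post_m1 = lik_m1 * prior_m1
post_m2 = lik_m2 * prior_m2

# Bayes factor: posterior odds divided by prior odds.
# P(D) would scale both posteriors identically, so it cancels here
# and no normalization is required.
K = (post_m1 / post_m2) / (prior_m1 / prior_m2)
print(K)   # equals the likelihood ratio lik_m1 / lik_m2
```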
The Bayes factor K is thus the ratio of the marginal likelihoods of the two models:

    K = P(D|M1) / P(D|M2)        (Equation 3)

Conveniently, Kass and Raftery (1995) provide a scale for discriminating between models based on the value of the ratio K. Notably, though this scale has assumed some authority in the field of Bayesian inference, it is only a suggested guideline for interpreting the importance of Bayes factor values (as are the conventional cutoffs for the significance of p-values). It is also important to remember when calculating Bayes factors that most coalescent or phylogenetic software packages report probabilities and likelihoods in natural-log (log_e) units (this is done because the likelihoods of phylogenies and genealogies can be exceedingly small).
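Working in log_e units, a Bayes factor is recovered by subtracting the two log marginal likelihoods and exponentiating the difference; exponentiating each value separately would underflow to zero. A brief sketch (the marginal-likelihood values below are invented for illustration, not taken from any analysis):

```python
import math

# Hypothetical log_e marginal likelihoods, of the magnitude typically
# reported by coalescent or phylogenetic software (illustrative values only).
log_ml_m1 = -10250.3   # log_e P(D|M1)
log_ml_m2 = -10253.7   # log_e P(D|M2)

# The difference of the logs is the log of the ratio (Equation 3),
# so the Bayes factor can be formed without ever exponentiating
# the tiny individual likelihoods.
log_K = log_ml_m1 - log_ml_m2   # log_e of the Bayes factor
K = math.exp(log_K)             # Bayes factor itself

# Kass and Raftery (1995) tabulate their scale in 2 * log_e(K) units.
two_log_K = 2 * log_K
print(log_K, K, two_log_K)
```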