bayesian intelligence


Bayesian modelling is an extremely powerful and sometimes overlooked mathematical concept that serves as a fundamental tool for understanding the nature of reality. It operates on the core principle that existing knowledge is uncertain, and it presents a probabilistic view of modelling reality together with a mechanism for continuously updating probabilities when encountering new information. The mathematical roots of Bayesian modelling were formulated by the mathematician Thomas Bayes in the 18th century, through a simple formula that updates a prior probability, based on the likelihood of new evidence, to yield a posterior probability:


P(H|E) = P(H) x [P(E|H)/P(E)]

P(H) - probability of a hypothesis (H) being true (prior probability)

P(H|E) - probability of a hypothesis (H) being true given new evidence (E) (posterior probability)

P(E|H) - probability of finding evidence (E) if hypothesis (H) is true (likelihood)

P(E) - probability of finding evidence (E)
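To make the formula concrete, here is a small sketch of a single Bayesian update, using hypothetical numbers for a diagnostic-test scenario (the disease prevalence, sensitivity, and false-positive rate below are illustrative assumptions, not figures from this article):

```python
# Bayes' rule: P(H|E) = P(H) * P(E|H) / P(E)
# Hypothetical scenario: a disease with 1% prevalence (prior),
# a test with 90% sensitivity and a 5% false-positive rate.

prior = 0.01           # P(H): prior probability of having the disease
likelihood = 0.90      # P(E|H): probability of a positive test if diseased
false_positive = 0.05  # P(E|not H): probability of a positive test if healthy

# P(E) via the law of total probability over H and not-H
evidence = likelihood * prior + false_positive * (1 - prior)

posterior = prior * likelihood / evidence  # P(H|E)
print(round(posterior, 3))  # prints 0.154
```

Note how even a positive result from a fairly accurate test yields only about a 15% posterior, because the prior (1% prevalence) pulls the estimate down; this is the interplay of prior and likelihood that the formula captures.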


The above formula is simple but extremely powerful, and it can be applied recursively: the posterior from one update serves as the prior for the next, allowing continuous updating of the posterior probability when encountering further new evidence (E'). The updated posterior can be written as P(H|E,E') = P(H|E) x [P(E'|H,E)/P(E'|E)]. This recursive property underlies more sophisticated probabilistic models known as Bayesian networks, which are extremely useful tools for complex statistical modelling. Bayesian modelling is the mathematical skeleton of artificial and biological intelligence, which are essentially very complex probabilistic prediction models.

Consider artificial intelligence models, for example large language models: the model output is a mathematical probability vector over the next likely token or word, represented as a list of probabilities for every token in the vocabulary of the language, computed from the input fed into the model by a user. Tokens unlikely to be the output receive probabilities near zero, while the most likely next token receives a probability near 1, the largest element in the probability vector.

During the training process that produces an inference-ready model, the model, which is essentially a mathematical probability generator, continuously updates its underlying parameters or weights based on new evidence: the error between the probability predicted by the model under training and the actual probability of the correct outcome. By the end of training, the updated weights generate posterior probabilities closest to the actual values. Biological intelligence is theorized to operate in a similar way.
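The recursive updating described above can be sketched as a loop in which each posterior becomes the next prior. The likelihood and evidence values below are made-up numbers purely for illustration:

```python
def bayes_update(prior, likelihood, evidence):
    """One Bayesian update: returns P(H|E) = P(H) * P(E|H) / P(E)."""
    return prior * likelihood / evidence

# Each tuple is a hypothetical new piece of evidence: (P(E|H), P(E))
observations = [(0.8, 0.5), (0.7, 0.4), (0.9, 0.6)]

p = 0.2  # initial prior P(H)
for likelihood, evidence in observations:
    # The posterior from this step becomes the prior for the next step
    p = bayes_update(p, likelihood, evidence)

print(round(p, 3))  # prints 0.84
```

Each piece of supporting evidence raises the probability of the hypothesis; contradicting evidence (a likelihood smaller than the evidence term) would lower it. This posterior-becomes-prior loop is the "continuous feedback with reality" that the article describes.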
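The next-token probability vector mentioned above is typically produced by applying a softmax function to the model's raw scores (logits). A minimal sketch with a made-up four-token vocabulary and made-up scores:

```python
import math

# Hypothetical raw scores (logits) for a tiny 4-token vocabulary
vocab = ["cat", "dog", "the", "ran"]
logits = [2.0, 0.5, 5.0, -1.0]

# Softmax turns logits into a probability vector that sums to 1
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The most likely next token is the one with the largest probability
best = vocab[probs.index(max(probs))]
print(best)  # prints "the"
```

In a real language model the vocabulary contains tens of thousands of tokens and the logits come from the network's final layer, but the shape of the output, one probability per token, summing to 1, is the same.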
A key discovery about the working of the human mind is that reality is dynamically constructed within the brain in real time, by processing prior information in the form of memories and new information in the form of disparate real-time sensory signals. To conserve energy, the brain likely constructs an initial model of physical reality, before exposure to incoming sensory signals, from all the prior memories it has stored: what can be viewed as a prior probabilistic view of reality. This prior view then gets updated against incoming sensory information to yield the actual posterior probabilistic view of reality, which can be updated further as new sensory information arrives.

Bayesian modelling at its heart presents a different philosophy for viewing and understanding our complex reality: thinking of possibilities in terms of mutable probabilities instead of rigid certainties. It thereby provides a framework resistant to uncertainty, one that accounts for all possibilities in terms of their ascertained probabilities and allows for continual error-correction of those probabilities through a continuous feedback loop with reality.

 
 
