Evolutionary Algorithms – A Brief Overview

Last year, as part of the assessment for a machine learning course, we were asked to write a tutorial paper on a topic of our choice. I chose to write about Evolutionary Algorithms. I’m happy with how it turned out, and the lecturer chose it as an exemplar for future students. For posterity, I’m going to share it here. Unfortunately, the original .tex file is lost to the sands of time (more specifically, it was saved on a thumb drive that was the only casualty in a backpack-dropping accident while getting off the bus). Thus, the only copy I have access to is the PDF file that I submitted online. You can find that here:

SCIE3250 Project – Speed and Complexity of Bayesian Network Selection Using Continuous Evolutionary Techniques

Abstract

Choosing a Bayesian network that best explains the dependencies in a dataset is an important problem in scientific areas including medicine, epidemiology, genetic analysis and domino effect analysis. However, inference using Bayesian belief networks is an NP-hard problem, and is considered computationally demanding even for problems of feasible size. Under these conditions, the usual hope is to develop a small army of algorithms that each find near-optimal solutions; these are all applied to a given problem and the best result is taken. We therefore aim to provide an alternative family of methods using continuous evolutionary global optimisation techniques, and to quantify their effectiveness compared with the already available hill-climbing method. Various implementations are considered, and it is found that combining available continuous global optimisation packages with the pre-existing score function is computationally infeasible for even small networks. Potential reasons are discussed, and improvements are proposed using custom-built methods, based on the effectiveness of the existing hill-climbing algorithm.
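Since the abstract is fairly dense, here is a rough, self-contained sketch of what a "continuous evolutionary global optimisation technique" looks like in practice. This is not code from the project: it is a generic differential-evolution loop in Python, and the objective being minimised is a toy sphere function standing in for a network score.

```python
import numpy as np

def differential_evolution(score, bounds, pop_size=30, generations=200,
                           F=0.8, CR=0.9, rng=None):
    """Minimise `score` over a continuous box using a basic DE/rand/1/bin scheme."""
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]

    # Initialise a random population inside the bounds and score each individual.
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([score(x) for x in pop])

    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals (other than i) to build the mutant vector.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)

            # Binomial crossover: each coordinate comes from the mutant with probability CR.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # guarantee at least one mutant coordinate
            trial = np.where(cross, mutant, pop[i])

            # Greedy selection: keep the trial vector only if it improves the score.
            f_trial = score(trial)
            if f_trial < fitness[i]:
                pop[i], fitness[i] = trial, f_trial

    best = np.argmin(fitness)
    return pop[best], fitness[best]

if __name__ == "__main__":
    # Toy objective standing in for a network score: a shifted sphere function.
    target = np.array([1.0, -2.0, 0.5])
    sphere = lambda x: float(np.sum((x - target) ** 2))

    best_x, best_f = differential_evolution(sphere, bounds=[(-5, 5)] * 3, rng=0)
    print(best_x, best_f)
```

In the project itself, the objective would be a Bayesian network score rather than this toy function, and the continuous variables would have to encode a network structure; the sketch is only meant to show the mechanics of the evolutionary loop.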

The full PDF can be downloaded here, and the presentation PDF can be downloaded here.