
The Butterfly Effect

for the idea in all of us

Category: Genetic Algorithms

Coming Soon: Synthetic Neurointerfaces

I’m getting ready to release my work on persisting synthetic neurointerfaces in unbounded spatial networks. I truly believe that computational tools such as this one can be used to study the structure of intelligent computation in high-dimensional neural systems. What I tried to emulate in this project was a neuron-by-neuron representation of some basic cognitive functions, by persisting a memory field in which self-organizing neocortical hypercolumns could be functionally represented. The project was inspired by biological neural dynamical systems and foundationally rooted in some of the brilliant work Google’s DeepMind has been doing. Before I publish any results, I would like to give special thanks to my mentor and long-time friend, Dr. Celia Rhodes Davis. I would also especially like to thank the Stanford Department of Computational Neuroscience (Center for Brain, Mind & Computation) for serving as an advisory board throughout my independent research and as a sounding board for general guidance.

Below are a problem definition, goals, and a small sneak peek at the immediate potential and execution of my project:

Introduction

The interface between the neuroanatomical activation of neocortical hypercolumns and their expressive function remains largely unobserved, owing to the inability to efficiently and ethically study causal relationships between phenomena that have previously only been observed in isolation. General neuroscience explores the anatomical significance of cortical regions of the brain, extending anatomy as a means to explain the persistence of various nervous and physically expressive systems. Psychological approaches focus purely on expressive behaviors as a means to establish, with greater fidelity, the existence and constancy of the brain-mind interface. The interface between the anatomical realms of the mind and their expressive behaviors is widely unexplored, with surgeries such as the lobotomy and other controversial, experimental, and life-threatening procedures at the forefront of such study. However, an understanding of these neurological interfaces has the potential to function as a window into the neural circuitry of mental illnesses, opening the door to cures and an ultimately more complete understanding of our brain.

Goals

We propose a method to simulate unbounded memory fields upon which recall functions can be parameterized. This model will be able to simulate cortical functions of the amygdala in its reaction to various unfiltered stimuli. An observer network will be created in parallel to analyze geometric anomalies in the neuroanatomical interface during memory recall, and to draw equivalences between recall function parameters and memory recall gradients. This enables it to extend hypotheses to neuroanatomical functions.


GA Utilizing Efficient Operators in TSP

From the data collected in the two pages above, it can reasonably be concluded that center inverse mutation, in unison with inversely linear roulette wheel selection and a random crossover point, yields the best result at higher numbers of generations. We decided to test a combination of all of these genetic operators and see the weight of the lowest path it yielded. The same input graph used for the other tests was used in this case, with 6000 chromosomes in the initial population, 5000 generations, and a cutoff percentage of 30%.

The top path after 5000 generations is as follows:
Weight = 238
Path: {A, X, C, P, S, G, E, U, Q, Y, B, V, N, T, W, I, F, H, Z, O, D, R, M, L, K, J, A}

[Figure: the input graph, with all edges and weights present]

Comparison of Genetic Operators for Solving the Traveling Salesman Problem: Selection

In comparing selection methods, it was in our best interest to leave as little as possible to randomness outside of the selection method itself. The mutation method was center inverse mutation throughout all trials, and a center mutation point was chosen every time. The cutoff percentage was the same (30%) for each trial, and the number of generations was fixed at 5000.

The numbers displayed below are the average of 10 trials conducted with the same input graph but a different initial population for each trial.

[Figure: selection method comparison]

Comparison of Genetic Operators for Solving the Traveling Salesman Problem: Mutation

In an attempt to statistically compare the operators, the input graph was kept the same for each trial. The numbers displayed below are the average of 10 trials conducted with the same input graph but a different initial population for each trial. The algorithm was run with an input graph consisting of 26 static nodes, giving approximately 4.03E26 (that is, 26!) possible orderings. Each trial ran 5000 generations with an input population of 5000 chromosomes. The cutoff percentage was 30% throughout every trial.
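That combination count is simply the number of ways to order the 26 nodes; a quick check (our own illustration, not part of the original post) confirms the figure:

import math

# 26 nodes can be ordered in 26! ways, roughly 4.03e26 candidate tours.
print(f"{math.factorial(26):.2e}")  # -> 4.03e+26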

Mutation Operators and Crossover Point

In these trials the selection method was kept standard, using the percentage cutoff method, to avoid any influence from the selection operator.

                            Random Crossover Point   Center Crossover Point
Reverse Sequence Mutation            336                      414
Center Inverse Mutation              253                      310

The performance of each mutation operator across generations was also tested with a constant center crossover point.

[Figure: mutation operator comparison]

Genetic Algorithm: Selection

In every generation, a selection agent comes into play, sifting the fit chromosomes from the unfit ones. The selection agent “kills off” a user-specified percentage of organisms in the population. However, it is at the discretion of the selection agent to determine which chromosomes to kill. As mentioned earlier, fitness is defined as having the lowest weight under the circumstances put forth by the TSP. However, selection need not be based solely on that. This can be seen when comparing the two most prevalent types of selection operators used here: the percentage cutoff and the inversely linear roulette wheel.
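As a minimal sketch of those two schemes, assume a tour is a list of nodes closed by a repeated start node and that a lower total weight means a fitter chromosome; the function and parameter names below are illustrative, not the blog’s actual code:

import random

def tour_weight(tour, dist):
    """Total weight of a closed tour, where dist[a][b] is the edge weight."""
    return sum(dist[a][b] for a, b in zip(tour, tour[1:]))

def cutoff_selection(population, dist, cutoff=0.30):
    """Percentage cutoff: deterministically kill off the heaviest fraction."""
    ranked = sorted(population, key=lambda t: tour_weight(t, dist))
    return ranked[: int(len(ranked) * (1 - cutoff))]

def roulette_selection(population, dist, k):
    """Inversely linear roulette wheel: lighter tours get larger wheel slices."""
    w = [tour_weight(t, dist) for t in population]
    slices = [max(w) - wi + 1 for wi in w]  # invert so low weight -> big slice
    return random.choices(population, weights=slices, k=k)

The cutoff variant is fully deterministic given the ranking, while the roulette wheel leaves even heavy tours a nonzero chance of survival, which helps preserve genetic diversity.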

Genetic Algorithm: Mutation

During the progression of a genetic algorithm, the population can hit a local optimum (an extremum). Nature copes with local optima by adding random genetic diversity to the population “every so often” with the help of mutation. Our genetic algorithm accomplishes this via the mutation operator. Although there is a plethora of mutation types, our GA focused on a select two (both are sketched in code after the examples below):

1. Reverse Sequence Mutation – In the reverse sequence mutation operator, we take a random point in the sequence of the organism. We split the path (P1) at the selected point. The second half of the split path (P1H2) is then inverted and appended to the end of the first half (P1H1), with the necessary corrections made to ensure the last node is the same as the start node, yielding the final mutated path (M1).

P1: {A, C, J | D, G, H, E, B, F, I, A}  ⇒ M1: {A, C, J, I, F, B, E, H, G, D, A}

2. Center Inverse Mutation – The chromosome (path, or organism) is divided into two sections at the middle of the chromosome. Each of these sections is then inverted and added to a new path. The order of the halves remains constant, meaning the first inverted half remains the first half of the mutated path. The necessary corrections are made to amend the mutated path into a viable path to solve the TSP.

P1: {A, C, J, D, G, H, E | B, F, I, A}  ⇒ M1: {A, E, H, G, D, J, C, I, F, B, A}
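Below is a minimal sketch of both operators, again assuming tours are lists closed by a repeated start node. The posts leave the exact “corrections” unspecified, so the repair step here (pulling the start node back out to the ends) is our own choice; it happens to reproduce the worked examples above.

import random

def reverse_sequence_mutation(path, point=None):
    """Split at a point, invert the tail, and re-close the tour."""
    body = path[:-1]                        # drop the duplicated closing node
    if point is None:
        point = random.randrange(1, len(body))
    mutated = body[:point] + body[point:][::-1]
    return mutated + [mutated[0]]           # re-append the start node

def center_inverse_mutation(path, split=None):
    """Invert each half of the tour while keeping the halves in order."""
    start = path[0]
    if split is None:
        split = len(path) // 2
    halves = path[:split][::-1] + path[split:][::-1]
    interior = [n for n in halves if n != start]  # correction step
    return [start] + interior + [start]

# Reproducing the worked examples (split points chosen to match the text):
p1 = list("ACJDGHEBFIA")
print(reverse_sequence_mutation(p1, point=3))  # A,C,J,I,F,B,E,H,G,D,A
print(center_inverse_mutation(p1, split=7))    # A,E,H,G,D,J,C,I,F,B,A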

Genetic Algorithms: Crossover

The method of crossover remains fairly constant regardless of the problem and scope. Crossover is achieved by first selecting a crossover point within a pair of defined and unique organisms, P1 and P2 (the equivalent of parents for the crossed-over children). The chromosomes are then split at the selected crossover point. The second half of P2 (P2H2) is then appended to the first half of P1 (P1H1) to make one child chromosome (C1). The second child (C2) is made by appending the second half of P1 (P1H2) to the first half of P2 (P2H1).
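A minimal sketch of this one-point crossover (the names are illustrative, not the blog’s actual code):

import random

def one_point_crossover(p1, p2, point=None):
    """Split both parents at one point and swap tails to form two children."""
    if point is None:
        point = random.randrange(1, len(p1) - 1)
    c1 = p1[:point] + p2[point:]  # C1 = P1H1 + P2H2
    c2 = p2[:point] + p1[point:]  # C2 = P2H1 + P1H2
    return c1, c2

Note that for the TSP, swapping tails between two distinct tours generally duplicates some nodes and omits others, so in practice a repair pass (or an order-preserving variant such as ordered crossover) is applied to the raw children before they rejoin the population.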

Genetic Algorithms: Intro

In this exercise, we attempt to utilize genetic algorithms to find an optimal, but not perfect, solution to the traveling salesman problem. A genetic algorithm emulates nature in its optimization process. Nature uses several mechanisms that have led to the emergence of new species ever better adapted to their environments; the laws governing species evolution were established by the research of Charles Darwin in the nineteenth century. Genetic algorithms are powerful optimization methods that apply these evolutionary rules to find a pseudo-optimal answer. Modeled on the evolution of species, the genetic algorithm utilizes the mechanisms of genetics: selection, crossover, and mutation.
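Putting those mechanisms together, a minimal sketch of the overall loop might look like the following. It reuses tour_weight from the selection sketch above, and the select, crossover, and mutate arguments stand for any of the operators discussed in this series; the crossover passed in must return valid tours (i.e., include a repair step):

import random

def evolve(population, dist, select, crossover, mutate, generations=5000):
    """Generic GA loop: select survivors, then refill via crossover + mutation."""
    target = len(population)
    for _ in range(generations):
        survivors = select(population, dist)      # e.g. percentage cutoff
        children = []
        while len(survivors) + len(children) < target:
            p1, p2 = random.sample(survivors, 2)
            child, _ = crossover(p1, p2)          # must yield a valid tour
            children.append(mutate(child))
        population = survivors + children
    # Return the lightest (fittest) tour from the final generation.
    return min(population, key=lambda t: tour_weight(t, dist))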

Introduction to the Traveling Salesman Problem

The Problem
The traveling salesman problem (TSP) is a typical example of a very hard combinatorial optimization problem. The problem is to find the shortest tour that passes through each vertex in a given graph exactly once. The decision version of the TSP is classified as NP-complete (the optimization version is NP-hard). There are intuitive methods for finding exact solutions, but all of them have exponential complexity: they take too much computing time or require too much memory. Mathematically, the TSP can be expressed as:
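One standard statement of the objective, where $d(c_i, c_j)$ denotes the weight of the edge between vertices $c_i$ and $c_j$ and $\pi$ ranges over the permutations of the $n$ vertices (the notation here is our choice):

\[
\min_{\pi} \left[ \sum_{i=1}^{n-1} d\big(c_{\pi(i)}, c_{\pi(i+1)}\big) + d\big(c_{\pi(n)}, c_{\pi(1)}\big) \right]
\]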

