ML RUSA Module 7 Genetic Algorithm

Evolutionary Learning

Module 7
What is a Genetic Algorithm?
• GA is a heuristic search algorithm inspired by Darwin’s natural
evolution theory.
• The algorithm reflects the process of Natural Selection in which
the fittest individuals are selected for reproduction in order to
produce offspring of the next generation.
• Rather than search from general-to-specific hypotheses, or from
simple-to-complex, GAs generate successor hypotheses by
repeatedly mutating and recombining parts of the best currently
known hypotheses.
• A collection of hypotheses, called the current population, is maintained at each step.
• GA performs a generate-and-test beam search: the present population is
updated by replacing some fraction of it with offspring of the most fit
current hypotheses to form the next generation.
• Widely used in many real-world applications such as
image processing, game programming, and artificial
creativity.
THE GENETIC ALGORITHM (GA)
• The Genetic Algorithm is a computational approximation to how
evolution performs search, which is by producing modifications of
the parent genomes in their offspring and thus producing new
individuals with different fitness.

• To model simple genetics inside a computer and solve problems
with it, we need:
• a method for representing problems as chromosomes
• a way to calculate the fitness of a solution
• a selection method to choose parents
• a way to generate offspring by breeding the parents
String Representation
• GAs use a string, with each element of the string (equivalent to the
gene) being chosen from some alphabet.
• The different values in the alphabet are the possible gene values; the alphabet is often just binary (0 and 1).
• We create a set of random strings to be our initial population.
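As a minimal sketch of this step (illustrative only; the population size and string length below are arbitrary choices, not values from the slides):

```python
import random

def initial_population(pop_size, string_length):
    """Create pop_size random binary strings (chromosomes) of the given length."""
    return [[random.randint(0, 1) for _ in range(string_length)]
            for _ in range(pop_size)]

# Example: 4 chromosomes of 6 genes each, like A1..A4 in the later slides
population = initial_population(pop_size=4, string_length=6)
```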
Facts about GA
• Evolution is known to be a successful, robust method for
adaptation within biological systems.
• GAs can search spaces of hypotheses containing complex
interacting parts, where the impact of each part on overall
hypothesis fitness may be difficult to model.
• Genetic algorithms are easily parallelized and can take
advantage of the decreasing costs of powerful computer
hardware.
Phases of Genetic Algorithm
• Initial Population
• Fitness Assignment
• Selection
• Crossover
• Mutation
• Termination
[Flowchart] Initialize population → Fitness computation → Parent selection → Genetic operations (Crossover, Mutation) → Compute fitness → If criteria met? → End (otherwise repeat from fitness computation).
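A skeleton of the loop in this flowchart might look as follows. It is an illustrative outline only: the fitness, select, crossover, mutate, and initial_population routines are placeholders passed in by the caller, and their concrete forms are sketched under the later slides.

```python
def genetic_algorithm(fitness, select, crossover, mutate,
                      initial_population, n_generations):
    """Outline of the GA phases: initialise, evaluate, select, breed, repeat."""
    population = initial_population()                     # initialize population
    for _ in range(n_generations):                        # stop criterion: fixed budget
        scores = [fitness(c) for c in population]         # fitness computation
        parents = select(population, scores)              # parent selection
        offspring = []
        for p1, p2 in zip(parents[0::2], parents[1::2]):  # breed pairs of parents
            c1, c2 = crossover(p1, p2)                    # genetic operation: crossover
            offspring += [mutate(c1), mutate(c2)]         # genetic operation: mutation
        population = offspring                            # replace the old generation
    return max(population, key=fitness)                   # fittest individual found
```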
Initial Population
• To begin the GA process, we want a set of individual chromosomes.
• Each individual, called a chromosome, is a solution to the problem.
• Each individual is characterized by a set of parameters called "genes".
• Genes are combined in the form of a string to form a chromosome.

Example population of four chromosomes (each position is a gene with value 0 or 1):
A1: 0 0 0 0 0 0
A2: 1 1 1 1 1 1
A3: 1 0 1 0 1 1
A4: 1 1 0 1 1 0
Fitness Function
• Determines how fit an individual is, i.e. the ability of an individual to compete with other individuals.
• The fitness function gives a fitness score to each individual.
• The probability that an individual will be selected for the next generation is based on its fitness score.
(The same example population A1–A4 applies: each chromosome is assigned a fitness score.)
Population
• After the initial population is chosen randomly, the algorithm evolves
to produce each successive generation, with the hope being that
there will be progressively fitter individuals in the populations as the
number of generations increases.
• The first generation is usually created randomly.
• The fitness of each string is then evaluated, and that first generation is
bred together to make a second generation, which is then used to
generate a third, and so on.
Generating Offspring: Parent Selection
For the current generation we need to select those strings that will be used to generate
new offspring.
Tournament Selection
Repeatedly pick four strings from the population, with replacement and put the fittest
two of them into the mating pool.
Truncation Selection
Pick some fraction f of the best strings and ignore the rest. For example, f = 0.5 is often
used, so the best 50% of the strings are put into the mating pool, each twice so that the
pool is the right size. The pool is randomly shuffled to make the pairs.
Fitness Proportional Selection
The better option is to select strings probabilistically, with the probability of a string being
selected proportional to its fitness. The function that is generally used is (for string α):
p(α) = F(α) / Σα' F(α')
where F(α) is the fitness of string α and the sum runs over all strings in the current population.
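The three selection schemes might be sketched as follows; the pool sizes, helper names, and the use of Python's random.choices are illustrative choices rather than a prescribed implementation.

```python
import random

def tournament_selection(population, fitness, pool_size):
    """Repeatedly pick four strings (with replacement); put the fittest two in the pool."""
    pool = []
    while len(pool) < pool_size:
        candidates = random.choices(population, k=4)      # sample with replacement
        candidates.sort(key=fitness, reverse=True)
        pool.extend(candidates[:2])
    return pool[:pool_size]

def truncation_selection(population, fitness, f=0.5):
    """Keep the best fraction f, duplicated so the pool matches the population size."""
    ranked = sorted(population, key=fitness, reverse=True)
    best = ranked[:int(len(population) * f)]
    pool = best * int(round(1 / f))    # e.g. f = 0.5 -> each survivor appears twice
    random.shuffle(pool)               # shuffle before forming pairs
    return pool

def fitness_proportional_selection(population, fitness, pool_size):
    """Pick strings with probability proportional to fitness: p(α) = F(α) / Σ F(α')."""
    scores = [fitness(c) for c in population]
    return random.choices(population, weights=scores, k=pool_size)
```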
GENERATING OFFSPRING: GENETIC OPERATORS
There are two genetic operators that are generally used:
1. Crossover
2. Mutation
Crossover:
Crossover is the operator that performs global exploration.
We generate the new string partly from the first parent and partly from the second.
The crossover operator produces two new offspring from two parent strings, by copying
selected bits from each parent. The bit at position i in each offspring is copied from the bit
at position i in one of the two parents.
The choice of which parent contributes the bit for position i is determined by an additional
string called the crossover mask.
1. Single-point crossover
2. Multi-point crossover
3. Uniform crossover
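A sketch of mask-based crossover as described above, with helper functions that generate single-point and uniform masks; the random cut point and the mask encoding (1 = take the bit from parent 1) are illustrative choices.

```python
import random

def crossover_with_mask(parent1, parent2, mask):
    """Bit i of each offspring is copied from the parent the mask selects at position i."""
    child1 = [a if m else b for a, b, m in zip(parent1, parent2, mask)]
    child2 = [b if m else a for a, b, m in zip(parent1, parent2, mask)]
    return child1, child2

def single_point_mask(length):
    """Ones up to a random cut point, zeros after it: single-point crossover."""
    point = random.randint(1, length - 1)
    return [1] * point + [0] * (length - point)

def uniform_mask(length):
    """Each position picks its source parent independently: uniform crossover."""
    return [random.randint(0, 1) for _ in range(length)]

# Example: single-point crossover of two 6-bit parents
c1, c2 = crossover_with_mask([0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 1],
                             single_point_mask(6))
```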
2. Mutation
• The other genetic operator is mutation,
which effectively performs local random
search.
• The value of any element of the string can
be changed, governed by some (usually
low) probability p.
• For our binary alphabet in the knapsack
problem, mutation causes a bit-flip.
• For chromosomes with real values, some
random number is generally added to or
subtracted from the current value.
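A minimal sketch of bit-flip mutation for a binary chromosome; the default probability is an illustrative value.

```python
import random

def mutate(chromosome, p=0.01):
    """Flip each bit independently with a (usually low) probability p."""
    return [1 - gene if random.random() < p else gene for gene in chromosome]

# Example: occasional bit-flips in a 6-bit chromosome
mutated = mutate([1, 0, 1, 0, 1, 1], p=0.1)
```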
Evaluating Fitness
• The fitness function forms the problem-specific part of the GA.
• The fitness function defines the criterion for ranking potential
hypotheses and for probabilistically selecting them for inclusion in the
next generation population. If the task is to learn classification rules, then
the fitness function typically has a component that scores the
classification accuracy of the rule over a set of provided training
examples.
• the probability that a hypothesis will be selected is given by the ratio of
its fitness to the fitness of other members of the current population.
• This method is sometimes called fitness proportionate selection, or
roulette wheel selection.
Since the maximum fitness in each generation can decrease, at least temporarily, and
since in the end we are only interested in the 'best' solution,
we could potentially lose a really good string found early in the search
and never see it again.
Ways to avoid this:
1. Elitism - which takes some number of the fittest strings from one
generation and puts them directly into the next population, replacing
strings that are already there either at random, or by choosing the least
fit to replace.
2. Tournament - where the two parents and their two offspring compete,
with the two fittest of the four being put into the new population.
Elitism and tournament can both lead to premature convergence.
Alternative ways to solve the problem of premature convergence are:
• Niching (also known as using island populations)
• Fitness sharing
3. Niching - where the population is separated into several
subpopulations, which all evolve independently for some
period of time, so that they are likely to have converged to
different local maxima, and a few members of one
subpopulation are occasionally injected as ‘immigrants’ into
another subpopulation.
4. Fitness sharing - where the fitness of a particular string is
averaged across the number of times that that string
appears in the population.
Termination

• The algorithm terminates when the population has converged, i.e. when the new
offspring are not significantly different from the previous generation.
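One simple test consistent with this criterion is to stop when the best fitness has barely changed over a recent window of generations; the window size and tolerance below are assumptions for illustration, not values from the slides.

```python
def has_converged(best_fitness_history, window=10, tolerance=1e-6):
    """True if the best fitness changed by less than `tolerance`
    over the last `window` generations."""
    if len(best_fitness_history) < window:
        return False
    recent = best_fitness_history[-window:]
    return max(recent) - min(recent) < tolerance
```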
Genetic Algorithm Solved Example
• Consider the problem of maximizing the function:
f(x) = x²
• where x is permitted to vary between 0 and 31

Encoding:
• To represent values up to 31, use a 5-bit encoding: 0 (00000) to 31 (11111).
Select Initial Population
• Choose any four chromosomes from the population at random.
Fitness-proportional calculation for one chromosome with f(x) = 144 (i.e. x = 12), given a total fitness of Σ f(x) = 1155 over the four chromosomes (average fitness 1155/4 = 288.75):
f(x) / Σ f(x) = 144 / 1155 = 0.1247
f(x) / Avg f(x) = 144 / 288.75 ≈ 0.4987
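Putting the pieces together, a minimal end-to-end sketch of this worked example: maximise f(x) = x² for x in 0..31 with 5-bit chromosomes, fitness-proportional (roulette wheel) selection, single-point crossover, and bit-flip mutation. The population size, mutation rate, and number of generations are illustrative choices.

```python
import random

def decode(bits):
    """Interpret a 5-bit chromosome as an integer x in 0..31."""
    return int("".join(map(str, bits)), 2)

def fitness(bits):
    return decode(bits) ** 2                         # f(x) = x^2

def run_ga(pop_size=4, length=5, generations=20, p_mut=0.05):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(c) for c in population]
        weights = scores if any(scores) else None    # guard against an all-zero population
        # fitness-proportional (roulette wheel) parent selection
        parents = random.choices(population, weights=weights, k=pop_size)
        offspring = []
        for p1, p2 in zip(parents[0::2], parents[1::2]):
            point = random.randint(1, length - 1)    # single-point crossover
            offspring += [p1[:point] + p2[point:], p2[:point] + p1[point:]]
        # bit-flip mutation with probability p_mut per gene
        population = [[1 - b if random.random() < p_mut else b for b in c]
                      for c in offspring]
    best = max(population, key=fitness)
    return decode(best), fitness(best)

print(run_ga())   # stochastic, but typically moves toward x = 31, f(x) = 961
```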
