Copyright of this book is held by John Wiley & Sons Inc.
Metaheuristics for
Machine Learning
This book series aims to provide comprehensive handbooks and reference books for the
benefit of scientists, research scholars, students, and industry professionals working
toward the next generation of industrial transformation.
Publishers at Scrivener
Martin Scrivener ([email protected])
Phillip Carmical ([email protected])
Edited by
Kanak Kalita
Vel Tech University, Avadi, India
Narayanan Ganesh
Vellore Institute of Technology, Chennai, India
and
S. Balamurugan
Intelligent Research Consultancy Services, Coimbatore, Tamilnadu, India
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or
transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or other-
wise, except as permitted by law. Advice on how to obtain permission to reuse material from this title
is available at http://www.wiley.com/go/permissions.
For details of our global editorial offices, customer services, and more information about Wiley prod-
ucts visit us at www.wiley.com.
ISBN 978-1-394-23392-2
Set in 11pt Minion Pro by Manila Typesetting Company, Makati, Philippines
10 9 8 7 6 5 4 3 2 1
Foreword xv
Preface xvii
1 Metaheuristic Algorithms and Their Applications in Different
Fields: A Comprehensive Review 1
Abrar Yaqoob, Navneet Kumar Verma and Rabia Musheer Aziz
1.1 Introduction 2
1.2 Types of Metaheuristic Algorithms 3
1.2.1 Genetic Algorithms 3
1.2.2 Simulated Annealing 5
1.2.3 Particle Swarm Optimization 7
1.2.4 Ant Colony Optimization 8
1.2.5 Tabu Search 10
1.2.6 Differential Evolution 11
1.2.7 Harmony Search 12
1.2.8 Artificial Bee Colony 13
1.2.9 Firefly Algorithm 14
1.2.10 Gray Wolf Optimizer 14
1.2.11 Imperialist Competitive Algorithm 14
1.2.12 Bat Algorithm 15
1.2.13 Cuckoo Search 15
1.2.14 Flower Pollination Algorithm 16
1.2.15 Krill Herd Algorithm 17
1.2.16 Whale Optimization Algorithm 17
1.2.17 Glowworm Swarm Optimization 18
1.2.18 Cat Swarm Optimization 18
1.2.19 Grasshopper Optimization Algorithm 19
1.2.20 Moth–Flame Optimization 19
1.3 Application of Metaheuristic Algorithms 20
1.4 Future Direction 25
1.5 Conclusion 26
References 26
2 A Comprehensive Review of Metaheuristics
for Hyperparameter Optimization in Machine Learning 37
Ramachandran Narayanan and Narayanan Ganesh
2.1 Introduction 38
2.1.1 Background and Motivation 38
2.1.2 Scope of the Review 38
2.1.3 Organization of the Paper 39
2.2 Fundamentals of Hyperparameter Optimization 39
2.2.1 Introduction to Hyperparameters 40
2.2.2 Importance of Hyperparameter Optimization 40
2.2.3 Performance Metrics for Hyperparameter
Optimization 41
2.2.4 Challenges in Hyperparameter Optimization 41
2.3 Overview of Metaheuristic Optimization Techniques 42
2.3.1 Definition and Characteristics of Metaheuristics 42
2.3.2 Classification of Metaheuristic Techniques 42
2.4 Population-Based Metaheuristic Techniques 43
2.4.1 Genetic Algorithms 44
2.4.2 Particle Swarm Optimization 44
2.4.3 Differential Evolution 44
2.4.4 Ant Colony Optimization 45
2.4.5 Biogeography-Based Optimization 45
2.4.6 Cuckoo Search 45
2.4.7 Gray Wolf Optimizer 45
2.4.8 Whale Optimization Algorithm 46
2.4.9 Recent Developments in Population-Based
Metaheuristics 46
2.5 Single Solution-Based Metaheuristic Techniques 47
2.5.1 Simulated Annealing 47
2.5.2 Tabu Search 47
2.5.3 Harmony Search 48
2.5.4 Bat Algorithm 48
2.5.5 Recent Developments in Single Solution-Based
Metaheuristics 48
2.6 Hybrid Metaheuristic Techniques 49
2.6.1 Genetic Algorithm and Particle Swarm
Optimization Hybrid 49
2.6.2 Genetic Algorithm and Simulated Annealing Hybrid 49
3.3.3 Ultrasound 79
3.3.4 Magnetic Resonance Imaging 80
3.3.5 Infrared Breast Thermal Images 80
3.4 Research Survey 83
3.4.1 Histopathological WSI 83
3.4.1.1 Machine Learning-Based Histopathological
WSI 83
3.4.1.2 Deep Learning-Based Histopathological WSI 83
3.4.2 Digital Mammogram 84
3.4.2.1 Machine Learning-Based Digital
Mammogram 84
3.4.2.2 Deep Learning-Based Digital Mammogram 85
3.4.3 Ultrasound 86
3.4.3.1 Machine Learning-Based Ultrasound 86
3.4.3.2 Deep Learning-Based Ultrasound 86
3.4.4 MRI-Based 87
3.4.4.1 Machine Learning-Based MRI Analysis 87
3.4.4.2 Deep Learning-Based MRI Analysis 88
3.4.5 Thermography-Based 89
3.4.5.1 Machine Learning-Based Thermography
Analysis 89
3.4.5.2 Deep Learning-Based Thermography Analysis 90
3.5 Conclusion 90
3.6 Acknowledgment 91
References 92
4 Enhancing Feature Selection Through Metaheuristic
Hybrid Cuckoo Search and Harris Hawks Optimization
for Cancer Classification 95
Abrar Yaqoob, Navneet Kumar Verma, Rabia Musheer Aziz
and Akash Saxena
4.1 Introduction 96
4.2 Related Work 99
4.3 Proposed Methodology 104
4.3.1 Cuckoo Search Algorithm 104
4.3.2 Harris Hawks Algorithm 106
4.3.3 The Proposed Hybrid Algorithm 110
4.3.4 Classifiers Used 113
4.3.4.1 KNN Classifier 113
4.3.4.2 SVM Classifier 113
4.3.4.3 NB Classifier 113
4.3.4.4 mRMR 114
In closing, we offer our sincere thanks to the Scrivener and Wiley pub-
lishing teams for their help with this book. We entreat you to immerse
your intellect and curiosity in the mesmerizing world of metaheuristics
and their applications. Here’s to an enlightening reading journey ahead!
Kanak Kalita
Narayanan Ganesh
S. Balamurugan
Abstract
Metaheuristic algorithms are heuristic optimization approaches that provide a potent
method for resolving challenging optimization problems. Iterative in nature and often
inspired by natural or social processes, they offer an effective way to explore huge
solution spaces and identify optimal or near-optimal solutions. This study provides
comprehensive information on metaheuristic algorithms and the many areas in which they
are used. Twenty well-known metaheuristic algorithms, including tabu search, particle
swarm optimization, ant colony optimization, genetic algorithms, simulated annealing,
and harmony search, are covered. The article extensively explores the applications of these
algorithms in diverse domains such as engineering, finance, logistics, and com-
puter science. It underscores particular instances where metaheuristic algorithms
have found utility, such as optimizing structural design, controlling dynamic sys-
tems, enhancing manufacturing processes, managing supply chains, and address-
ing problems in artificial intelligence, data mining, and software engineering. The
paper provides a thorough insight into the versatile deployment of metaheuristic
algorithms across different sectors, highlighting their capacity to tackle complex
optimization problems across a wide range of real-world scenarios.
Kanak Kalita, Narayanan Ganesh and S. Balamurugan (eds.) Metaheuristics for Machine Learning:
Algorithms and Applications, (1–36) © 2024 Scrivener Publishing LLC
1.1 Introduction
Metaheuristics represent a category of optimization methods widely
employed to tackle intricate challenges in diverse domains such as engi-
neering, economics, computer science, and operations research. These
adaptable techniques are designed to locate favorable solutions by explor-
ing an extensive array of possibilities and avoiding stagnation in subopti-
mal outcomes [1]. The roots and advancement of metaheuristics can be
traced back to the late 1940s, when George Dantzig introduced the simplex
method for linear programming [2]. This innovative technique marked
a pivotal point in optimization and paved the way for the emergence of
subsequent optimization algorithms. Nonetheless, the simplex method’s
applicability is confined to linear programming issues and does not extend
to nonlinear problems. In the 1960s and early 1970s, John Holland devised
the genetic algorithm, drawing inspiration from concepts of natural selection
and evolution [3]. The genetic algorithm assembles a set of potential
solutions and iteratively enhances this set through genetic operations like
mutation, crossover, and selection [4]. The genetic algorithm was a major
milestone in the development of metaheuristics and opened up new pos-
sibilities for resolving difficult optimization issues. During the 1980s and
1990s, the field of metaheuristics experienced significant expansion and
the emergence of numerous novel algorithms. These techniques, which
include simulated annealing (SA), tabu search (TS), ant colony optimiza-
tion (ACO), particle swarm optimization (PSO), and differential evolution
(DE), were created expressly to deal with a variety of optimization issues.
They drew inspiration from concepts like simulated annealing, tabu search,
swarm intelligence, and evolutionary algorithms [5].
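To make the genetic operations of mutation, crossover, and selection concrete, the following minimal Python sketch implements a real-valued genetic algorithm with tournament selection, arithmetic crossover, and Gaussian mutation. The quadratic objective, population size, and mutation rate are illustrative choices, not values prescribed in this chapter.

```python
import random

def genetic_algorithm(fitness, bounds, pop_size=30, generations=100,
                      mutation_rate=0.1):
    """Minimal real-valued GA: selection, crossover, mutation (minimization)."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the better of two random parents
            a, b = random.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            # Arithmetic crossover: random convex combination of the parents
            w = random.random()
            child = w * p1 + (1 - w) * p2
            # Mutation: occasional Gaussian perturbation
            if random.random() < mutation_rate:
                child += random.gauss(0, 0.1 * (hi - lo))
            children.append(min(max(child, lo), hi))  # clamp to bounds
        pop = children
    return min(pop, key=fitness)

# Minimize (x - 3)^2 over [-10, 10]; the optimum is x = 3
best = genetic_algorithm(lambda x: (x - 3) ** 2, bounds=(-10, 10))
```

Because the algorithm is stochastic, repeated runs return slightly different values near the optimum.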
The term “meta-” in metaheuristic algorithms indicates a higher level
of operation beyond simple heuristics, leading to enhanced performance.
These algorithms balance local search and global exploration by using randomness
to generate a range of solutions. Although metaheuristics are frequently employed,
there is no single definition of heuristics and metaheuristics in the academic
literature, and some academics even use the terms synonymously. However, it is
currently common to classify as
metaheuristics all algorithms of a stochastic nature that utilize randomness
and comprehensive exploration across the entire system. Metaheuristic
algorithms are ideally suited for global optimization and nonlinear mod-
eling because randomization is a useful method for switching from local
to global search. As a result, almost all metaheuristic algorithms can be
used to solve issues involving nonlinear optimization at the global level [6].
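As an illustration of how randomization mediates the switch from local to global search, consider a minimal simulated annealing sketch: worse moves are accepted with a Boltzmann probability that shrinks as the temperature is lowered. The objective, starting point, and cooling schedule below are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, temp=1.0, cooling=0.95, steps=500):
    """Minimal SA: accept worse moves with probability exp(-delta/T),
    slowly lowering the temperature T (minimization)."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(steps):
        candidate = x + random.gauss(0, 1.0)   # random neighbour
        delta = f(candidate) - fx
        # Always accept improvements; accept worse moves probabilistically
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            x, fx = candidate, f(candidate)
            if fx < fbest:
                best, fbest = x, fx
        temp *= cooling                        # decrease temperature slightly
    return best

best = simulated_annealing(lambda x: (x - 3) ** 2, x0=0.0)
```

Early on, the high temperature lets the search jump between basins (global exploration); as it cools, the search becomes nearly greedy (local refinement).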
In recent years, the study of metaheuristics has continued to develop, and
new algorithms are being devised that combine concepts and
techniques from various fields such as machine learning, deep learning,
and data science. The development and evolution of metaheuristics have
made significant contributions to solving complex optimization problems
and have led to the development of powerful tools for decision-making in
various domains [7]. In order to find solutions in a huge search area, meta-
heuristic algorithms are founded on the idea of mimicking the behaviors of
natural or artificial systems. These algorithms are particularly valuable for
tackling problems that are challenging or impossible to solve using tradi-
tional optimization methods. Typically, metaheuristic algorithms involve
iterations and a series of steps that modify a potential solution until an
acceptable one is discovered. Unlike other optimization techniques that
may become stuck in local optimal solutions, metaheuristic algorithms are
designed to explore the entire search space. They also exhibit resilience
to noise or uncertainty in the optimization problem. The adaptability and
plasticity of metaheuristic algorithms are two of their main features. They
can be modified to take into account certain limitations or goals of the
current task and are applicable to a wide variety of optimization situations.
However, for complex problems with extensive search spaces, these algo-
rithms may converge slowly toward an optimal solution, and there is no
guarantee that they will find the global optimum. Metaheuristic algorithms
find extensive application in various fields including engineering, finance,
logistics, and computer science. They have been successfully employed in
solving diverse problems such as optimizing design, control, and manufac-
turing processes, portfolio selection, and risk management strategies [8].
[Figure: Flowchart of the genetic algorithm — crossover/mutation is repeated until the termination criteria are reached, then END.]
[Figure: Flowchart of simulated annealing — from START, a move is accepted if Δf ≤ 0 (or probabilistically otherwise); the temperature is decreased slightly and iteration continues until the stopping criteria are met, then END.]
pheromone trails left behind by other ants. The strength of the phero-
mone trail represents the quality of the solution that passed through that
edge. As more ants traverse the same edge, the pheromone trail becomes
stronger. This is similar to how ants communicate with each other in real
life by leaving pheromone trails to signal the location of food sources [39,
40]. The ACO algorithm has several key parameters, such as the amount
of pheromone each ant leaves, the rate at which pheromones evaporate,
and the balance between exploiting the best solution and exploring new
solutions. The optimal values of the parameters in the algorithm are deter-
mined through a process of experimentation and refinement to obtain the
best possible results for a specific problem [41].
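The mechanics just described — path choice biased by pheromone strength and edge length, pheromone evaporation, and quality-proportional deposit — can be sketched for a small symmetric traveling salesman instance. The parameter values and the unit-square test instance below are illustrative assumptions, not the tuned settings discussed in the text.

```python
import random

def aco_tsp(dist, n_ants=10, iters=50, evaporation=0.5, alpha=1.0, beta=2.0):
    """Minimal ant colony optimization for a symmetric TSP.
    dist is an n x n distance matrix."""
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]        # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # Choice probability ∝ pheromone^alpha * (1/distance)^beta
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in choices]
                tour.append(random.choices(choices, weights)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporation, then deposit proportional to tour quality
        tau = [[t * (1 - evaporation) for t in row] for row in tau]
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += 1.0 / length
                tau[j][i] += 1.0 / length
    return best_tour, best_len

# Four cities at the corners of a unit square; the optimal tour length is 4.0
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[((a - c) ** 2 + (b - d) ** 2) ** 0.5 for (c, d) in pts]
        for (a, b) in pts]
best_tour, best_len = aco_tsp(dist)
```

Raising `beta` strengthens the greedy distance heuristic, while `evaporation` controls how quickly old (possibly poor) trails are forgotten — the exploit/explore balance mentioned above.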
The ACO has showcased impressive achievements in resolving diverse
optimization challenges, including but not limited to the traveling sales-
man problem, vehicle routing, and job scheduling. One notable advantage
of the algorithm is its ability to swiftly discover favorable solutions, even
[Figure: Flowchart of ant colony optimization — an initial population of ants is randomly located; each ant randomly chooses a path; pheromone trails are updated and undergo evaporation during the search process; evaluation of the stop condition yields the results.]
find the optimal solution. The algorithm adheres to the fundamental steps
of mutation, crossover, and selection, which are key elements commonly
shared among numerous evolutionary algorithms [48].
In differential evolution, a population of potential solutions evolves
iteratively through the following sequential steps:
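The canonical DE/rand/1/bin iteration — mutation from three distinct random vectors, binomial crossover, and greedy selection — can be sketched in Python as follows; the sphere objective, bounds, and the control parameters F and CR are illustrative assumptions rather than the chapter's exact listing.

```python
import random

def differential_evolution(f, bounds, pop_size=20, dim=2, F=0.8, CR=0.9,
                           iters=100):
    """Sketch of DE/rand/1/bin: mutation, crossover, selection (minimization)."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(iters):
        for i in range(pop_size):
            # Mutation: combine three distinct vectors other than the target
            a, b, c = random.sample(
                [p for k, p in enumerate(pop) if k != i], 3)
            mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
            # Binomial crossover; j_rand guarantees at least one mutant gene
            j_rand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == j_rand)
                     else pop[i][d] for d in range(dim)]
            trial = [min(max(t, lo), hi) for t in trial]  # clamp to bounds
            # Greedy selection: keep the better of trial and target
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

sphere = lambda v: sum(x * x for x in v)
best = differential_evolution(sphere, bounds=(-5, 5))
```

The greedy one-to-one selection is what distinguishes DE from the generational replacement used in the genetic algorithm above.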
The control variables, such as the harmony memory size, pitch adjustment
rate, and number of iterations, affect how well the HS performs. The
method follows the core phases of harmony memory consideration, pitch
adjustment, and random selection, and the approach has been utilized to
tackle diverse optimization challenges, such as managing water resources,
designing structures, and operating power systems [52, 53].
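A minimal Python sketch of harmony search follows, showing how the harmony memory size, memory-consideration rate (HMCR), and pitch adjustment rate (PAR) interact; the quadratic objective and all parameter values are illustrative assumptions.

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=500):
    """Sketch of harmony search: memory consideration, pitch adjustment,
    and random selection (minimization)."""
    lo, hi = bounds
    memory = [random.uniform(lo, hi) for _ in range(hms)]  # harmony memory
    for _ in range(iters):
        if random.random() < hmcr:
            new = random.choice(memory)            # memory consideration
            if random.random() < par:              # pitch adjustment
                new += random.uniform(-1, 1) * 0.05 * (hi - lo)
        else:
            new = random.uniform(lo, hi)           # random selection
        new = min(max(new, lo), hi)
        # Replace the worst harmony if the new one improves on it
        worst = max(range(hms), key=lambda k: f(memory[k]))
        if f(new) < f(memory[worst]):
            memory[worst] = new
    return min(memory, key=f)

best = harmony_search(lambda x: (x - 3) ** 2, bounds=(-10, 10))
```

A larger harmony memory or a higher PAR slows convergence but improves exploration, which is why these control variables must be tuned per problem, as the text notes.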