Details

Machine Learning Algorithms and Applications




1st edition

by: Mettu Srinivas, G. Sucharitha, Anjanna Matta

164,99 €

Publisher: Wiley
Format: PDF
Published: 16.08.2021
ISBN/EAN: 9781119769255
Language: English
Number of pages: 368

DRM-protected eBook; you will need a reader such as Adobe Digital Editions and an Adobe ID to read it.

Description

<p><b>Machine Learning Algorithms and Applications</b> is for current and aspiring machine learning specialists looking to implement solutions to real-world machine learning problems. It covers a broad range of applications of machine and deep learning techniques; each chapter presents a novel machine learning architecture for a specific application and compares its results with previous algorithms.</p> <p>To present a unified treatment of machine learning problems and solutions, the book discusses methods from several fields, including statistics, pattern recognition, neural networks, artificial intelligence, sentiment analysis, control, and data mining. All learning algorithms are explained so that the reader can easily move from the equations in the book to a computer program.</p>
<p>Acknowledgments xv</p> <p>Preface xvii</p> <p><b>Part 1: Machine Learning for Industrial Applications 1</b></p> <p><b>1 A Learning-Based Visualization Application for Air Quality Evaluation During COVID-19 Pandemic in Open Data Centric Services 3<br /></b><i>Priyank Jain and Gagandeep Kaur</i></p> <p>1.1 Introduction 4</p> <p>1.1.1 Open Government Data Initiative 4</p> <p>1.1.2 Air Quality 4</p> <p>1.1.3 Impact of Lockdown on Air Quality 5</p> <p>1.2 Literature Survey 5</p> <p>1.3 Implementation Details 6</p> <p>1.3.1 Proposed Methodology 7</p> <p>1.3.2 System Specifications 8</p> <p>1.3.3 Algorithms 8</p> <p>1.3.4 Control Flow 10</p> <p>1.4 Results and Discussions 11</p> <p>1.5 Conclusion 21</p> <p>References 21</p> <p><b>2 Automatic Counting and Classification of Silkworm Eggs Using Deep Learning 23<br /></b><i>Shreedhar Rangappa, Ajay A. and G. S. Rajanna</i></p> <p>2.1 Introduction 23</p> <p>2.2 Conventional Silkworm Egg Detection Approaches 24</p> <p>2.3 Proposed Method 25</p> <p>2.3.1 Model Architecture 26</p> <p>2.3.2 Foreground-Background Segmentation 28</p> <p>2.3.3 Egg Location Predictor 30</p> <p>2.3.4 Predicting Egg Class 31</p> <p>2.4 Dataset Generation 35</p> <p>2.5 Results 35</p> <p>2.6 Conclusion 37</p> <p>Acknowledgment 38</p> <p>References 38</p> <p><b>3 A Wind Speed Prediction System Using Deep Neural Networks 41<br /></b><i>Jaseena K. U. and Binsu C. 
Kovoor</i></p> <p>3.1 Introduction 42</p> <p>3.2 Methodology 45</p> <p>3.2.1 Deep Neural Networks 45</p> <p>3.2.2 The Proposed Method 47</p> <p>3.2.2.1 Data Acquisition 47</p> <p>3.2.2.2 Data Pre-Processing 48</p> <p>3.2.2.3 Model Selection and Training 50</p> <p>3.2.2.4 Performance Evaluation 51</p> <p>3.2.2.5 Visualization 51</p> <p>3.3 Results and Discussions 52</p> <p>3.3.1 Selection of Parameters 52</p> <p>3.3.2 Comparison of Models 53</p> <p>3.4 Conclusion 57</p> <p>References 57</p> <p><b>4 Res-SE-Net: Boosting Performance of ResNets by Enhancing Bridge Connections 61<br /></b><i>Varshaneya V., S. Balasubramanian and Darshan Gera</i></p> <p>4.1 Introduction 61</p> <p>4.2 Related Work 62</p> <p>4.3 Preliminaries 63</p> <p>4.3.1 ResNet 63</p> <p>4.3.2 Squeeze-and-Excitation Block 64</p> <p>4.4 Proposed Model 66</p> <p>4.4.1 Effect of Bridge Connections in ResNet 66</p> <p>4.4.2 Res-SE-Net: Proposed Architecture 67</p> <p>4.5 Experiments 68</p> <p>4.5.1 Datasets 68</p> <p>4.5.2 Experimental Setup 68</p> <p>4.6 Results 69</p> <p>4.7 Conclusion 73</p> <p>References 74</p> <p><b>5 Hitting the Success Notes of Deep Learning 77<br /></b><i>Sakshi Aggarwal, Navjot Singh and K.K. Mishra</i></p> <p>5.1 Genesis 78</p> <p>5.2 The Big Picture: Artificial Neural Network 79</p> <p>5.3 Delineating the Cornerstones 80</p> <p>5.3.1 Artificial Neural Network vs. Machine Learning 80</p> <p>5.3.2 Machine Learning vs. Deep Learning 81</p> <p>5.3.3 Artificial Neural Network vs. Deep Learning 81</p> <p>5.4 Deep Learning Architectures 82</p> <p>5.4.1 Unsupervised Pre-Trained Networks 82</p> <p>5.4.2 Convolutional Neural Networks 83</p> <p>5.4.3 Recurrent Neural Networks 84</p> <p>5.4.4 Recursive Neural Network 85</p> <p>5.5 Why is CNN Preferred for Computer Vision Applications? 
85</p> <p>5.5.1 Convolutional Layer 86</p> <p>5.5.2 Nonlinear Layer 86</p> <p>5.5.3 Pooling Layer 87</p> <p>5.5.4 Fully Connected Layer 87</p> <p>5.6 Unravel Deep Learning in Medical Diagnostic Systems 89</p> <p>5.7 Challenges and Future Expectations 94</p> <p>5.8 Conclusion 94</p> <p>References 95</p> <p><b>6 Two-Stage Credit Scoring Model Based on Evolutionary Feature Selection and Ensemble Neural Networks 99<br /></b><i>Diwakar Tripathi, Damodar Reddy Edla, Annushree Bablani and Venkatanareshbabu Kuppili</i></p> <p>6.1 Introduction 100</p> <p>6.1.1 Motivation 100</p> <p>6.2 Literature Survey 101</p> <p>6.3 Proposed Model for Credit Scoring 103</p> <p>6.3.1 Stage-1: Feature Selection 104</p> <p>6.3.2 Proposed Criteria Function 105</p> <p>6.3.3 Stage-2: Ensemble Classifier 106</p> <p>6.4 Results and Discussion 107</p> <p>6.4.1 Experimental Datasets and Performance Measures 107</p> <p>6.4.2 Classification Results With Feature Selection 108</p> <p>6.5 Conclusion 112</p> <p>References 113</p> <p><b>7 Enhanced Block-Based Feature Agglomeration Clustering for Video Summarization 117<br /></b><i>Sreeja M. U. and Binsu C. 
Kovoor</i></p> <p>7.1 Introduction 118</p> <p>7.2 Related Works 119</p> <p>7.3 Feature Agglomeration Clustering 122</p> <p>7.4 Proposed Methodology 122</p> <p>7.4.1 Pre-Processing 123</p> <p>7.4.2 Modified Block Clustering Using Feature Agglomeration Technique 125</p> <p>7.4.3 Post-Processing and Summary Generation 127</p> <p>7.5 Results and Analysis 129</p> <p>7.5.1 Experimental Setup and Data Sets Used 129</p> <p>7.5.2 Evaluation Metrics 130</p> <p>7.5.3 Evaluation 131</p> <p>7.6 Conclusion 138</p> <p>References 138</p> <p><b>Part 2: Machine Learning for Healthcare Systems 141</b></p> <p><b>8 Cardiac Arrhythmia Detection and Classification From ECG Signals Using XGBoost Classifier 143<br /></b><i>Saroj Kumar Pandeyz, Rekh Ram Janghel and Vaibhav Gupta</i></p> <p>8.1 Introduction 143</p> <p>8.2 Materials and Methods 145</p> <p>8.2.1 MIT-BIH Arrhythmia Database 146</p> <p>8.2.2 Signal Pre-Processing 147</p> <p>8.2.3 Feature Extraction 147</p> <p>8.2.4 Classification 148</p> <p>8.2.4.1 XGBoost Classifier 148</p> <p>8.2.4.2 AdaBoost Classifier 149</p> <p>8.3 Results and Discussion 149</p> <p>8.4 Conclusion 155</p> <p>References 156</p> <p><b>9 GSA-Based Approach for Gene Selection from Microarray Gene Expression Data 159<br /></b><i>Pintu Kumar Ram and Pratyay Kuila</i></p> <p>9.1 Introduction 159</p> <p>9.2 Related Works 161</p> <p>9.3 An Overview of Gravitational Search Algorithm 162</p> <p>9.4 Proposed Model 163</p> <p>9.4.1 Pre-Processing 163</p> <p>9.4.2 Proposed GSA-Based Feature Selection 164</p> <p>9.5 Simulation Results 166</p> <p>9.5.1 Biological Analysis 168</p> <p>9.6 Conclusion 172</p> <p>References 172</p> <p><b>Part 3: Machine Learning for Security Systems 175</b></p> <p><b>10 On Fusion of NIR and VW Information for Cross-Spectral Iris Matching 177<br /></b><i>Ritesh Vyas, Tirupathiraju Kanumuri, Gyanendra Sheoran and Pawan Dubey</i></p> <p>10.1 Introduction 177</p> <p>10.1.1 Related Works 178</p> <p>10.2 Preliminary Details 179</p> <p>10.2.1 Fusion 
181</p> <p>10.3 Experiments and Results 182</p> <p>10.3.1 Databases 182</p> <p>10.3.2 Experimental Results 182</p> <p>10.3.2.1 Same Spectral Matchings 183</p> <p>10.3.2.2 Cross Spectral Matchings 184</p> <p>10.3.3 Feature-Level Fusion 186</p> <p>10.3.4 Score-Level Fusion 189</p> <p>10.4 Conclusions 190</p> <p>References 190</p> <p><b>11 Fake Social Media Profile Detection 193<br /></b><i>Umita Deepak Joshi, Vanshika, Ajay Pratap Singh, Tushar Rajesh Pahuja, Smita Naval and Gaurav Singal</i></p> <p>11.1 Introduction 194</p> <p>11.2 Related Work 195</p> <p>11.3 Methodology 197</p> <p>11.3.1 Dataset 197</p> <p>11.3.2 Pre-Processing 198</p> <p>11.3.3 Artificial Neural Network 199</p> <p>11.3.4 Random Forest 202</p> <p>11.3.5 Extreme Gradient Boost 202</p> <p>11.3.6 Long Short-Term Memory 204</p> <p>11.4 Experimental Results 204</p> <p>11.5 Conclusion and Future Work 207</p> <p>Acknowledgment 207</p> <p>References 207</p> <p><b>12 Extraction of the Features of Fingerprints Using Conventional Methods and Convolutional Neural Networks 211<br /></b><i>E. M. V. Naga Karthik and Madan Gopal</i></p> <p>12.1 Introduction 212</p> <p>12.2 Related Work 213</p> <p>12.3 Methods and Materials 215</p> <p>12.3.1 Feature Extraction Using SURF 215</p> <p>12.3.2 Feature Extraction Using Conventional Methods 216</p> <p>12.3.2.1 Local Orientation Estimation 216</p> <p>12.3.2.2 Singular Region Detection 218</p> <p>12.3.3 Proposed CNN Architecture 219</p> <p>12.3.4 Dataset 221</p> <p>12.3.5 Computational Environment 221</p> <p>12.4 Results 222</p> <p>12.4.1 Feature Extraction and Visualization 223</p> <p>12.5 Conclusion 226</p> <p>Acknowledgements 226</p> <p>References 226</p> <p><b>13 Facial Expression Recognition Using Fusion of Deep Learning and Multiple Features 229<br /></b><i>M. Srinivas, Sanjeev Saurav, Akshay Nayak and Murukessan A. 
P.</i></p> <p>13.1 Introduction 230</p> <p>13.2 Related Work 232</p> <p>13.3 Proposed Method 235</p> <p>13.3.1 Convolutional Neural Network 236</p> <p>13.3.1.1 Convolution Layer 236</p> <p>13.3.1.2 Pooling Layer 237</p> <p>13.3.1.3 ReLU Layer 238</p> <p>13.3.1.4 Fully Connected Layer 238</p> <p>13.3.2 Histogram of Gradient 239</p> <p>13.3.3 Facial Landmark Detection 240</p> <p>13.3.4 Support Vector Machine 241</p> <p>13.3.5 Model Merging and Learning 242</p> <p>13.4 Experimental Results 242</p> <p>13.4.1 Datasets 242</p> <p>13.5 Conclusion 245</p> <p>Acknowledgement 245</p> <p>References 245</p> <p><b>Part 4: Machine Learning for Classification and Information Retrieval Systems 247</b></p> <p><b>14 AnimNet: An Animal Classification Network using Deep Learning 249<br /></b><i>Kanak Manjari, Kriti Singhal, Madhushi Verma and Gaurav Singal</i></p> <p>14.1 Introduction 249</p> <p>14.1.1 Feature Extraction 250</p> <p>14.1.2 Artificial Neural Network 250</p> <p>14.1.3 Transfer Learning 251</p> <p>14.2 Related Work 252</p> <p>14.3 Proposed Methodology 254</p> <p>14.3.1 Dataset Preparation 254</p> <p>14.3.2 Training the Model 254</p> <p>14.4 Results 258</p> <p>14.4.1 Using Pre-Trained Networks 259</p> <p>14.4.2 Using AnimNet 259</p> <p>14.4.3 Test Analysis 260</p> <p>14.5 Conclusion 263</p> <p>References 264</p> <p><b>15 A Hybrid Approach for Feature Extraction From Reviews to Perform Sentiment Analysis 267<br /></b><i>Alok Kumar and Renu Jain</i></p> <p>15.1 Introduction 268</p> <p>15.2 Related Work 269</p> <p>15.3 The Proposed System 271</p> <p>15.3.1 Feedback Collector 272</p> <p>15.3.2 Feedback Pre-Processor 272</p> <p>15.3.3 Feature Selector 272</p> <p>15.3.4 Feature Validator 274</p> <p>15.3.4.1 Removal of Terms From Tentative List of Features on the Basis of Syntactic Knowledge 274</p> <p>15.3.4.2 Removal of Least Significant Terms on the Basis of Contextual Knowledge 276</p> <p>15.3.4.3 Removal of Less Significant Terms on the Basis of Association With Sentiment 
Words 277</p> <p>15.3.4.4 Removal of Terms Having Similar Sense 278</p> <p>15.3.4.5 Removal of Terms Having Same Root 279</p> <p>15.3.4.6 Identification of Multi-Term Features 279</p> <p>15.3.4.7 Identification of Less Frequent Feature 279</p> <p>15.3.5 Feature Concluder 281</p> <p>15.4 Result Analysis 282</p> <p>15.5 Conclusion 286</p> <p>References 286</p> <p><b>16 Spark-Enhanced Deep Neural Network Framework for Medical Phrase Embedding 289<br /></b><i>Amol P. Bhopale and Ashish Tiwari</i></p> <p>16.1 Introduction 290</p> <p>16.2 Related Work 291</p> <p>16.3 Proposed Approach 292</p> <p>16.3.1 Phrase Extraction 292</p> <p>16.3.2 Corpus Annotation 294</p> <p>16.3.3 Phrase Embedding 294</p> <p>16.4 Experimental Setup 297</p> <p>16.4.1 Dataset Preparation 297</p> <p>16.4.2 Parameter Setting 297</p> <p>16.5 Results 298</p> <p>16.5.1 Phrase Extraction 298</p> <p>16.5.2 Phrase Embedding 298</p> <p>16.6 Conclusion 303</p> <p>References 303</p> <p><b>17 Image Anonymization Using Deep Convolutional Generative Adversarial Network 305<br /></b><i>Ashish Undirwade and Sujit Das</i></p> <p>17.1 Introduction 306</p> <p>17.2 Background Information 310</p> <p>17.2.1 Black Box and White Box Attacks 310</p> <p>17.2.2 Model Inversion Attack 311</p> <p>17.2.3 Differential Privacy 312</p> <p>17.2.3.1 Definition 312</p> <p>17.2.4 Generative Adversarial Network 313</p> <p>17.2.5 Earth-Mover (EM) Distance/Wasserstein Metric 316</p> <p>17.2.6 Wasserstein GAN 317</p> <p>17.2.7 Improved Wasserstein GAN (WGAN-GP) 317</p> <p>17.2.8 KL Divergence and JS Divergence 318</p> <p>17.2.9 DCGAN 319</p> <p>17.3 Image Anonymization to Prevent Model Inversion Attack 319</p> <p>17.3.1 Algorithm 321</p> <p>17.3.2 Training 322</p> <p>17.3.3 Noise Amplifier 323</p> <p>17.3.4 Dataset 324</p> <p>17.3.5 Model Architecture 324</p> <p>17.3.6 Working 325</p> <p>17.3.7 Privacy Gain 325</p> <p>17.4 Results and Analysis 326</p> <p>17.5 Conclusion 328</p> <p>References 329</p> <p>Index 331</p>
<p><b>Mettu Srinivas</b> holds a PhD from the Indian Institute of Technology Hyderabad and is currently an assistant professor in the Department of Computer Science and Engineering, NIT Warangal, India.</p> <p><b>G. Sucharitha</b> holds a PhD from KL University, Vijayawada, and is currently an assistant professor in the Department of Electronics and Communication Engineering at ICFAI Foundation for Higher Education, Hyderabad.</p> <p><b>Anjanna Matta</b> holds a PhD from the Indian Institute of Technology Hyderabad and is currently an assistant professor in the Department of Mathematics at ICFAI Foundation for Higher Education, Hyderabad.</p> <p><b>Prasenjit Chatterjee</b>, PhD, is an associate professor in the Mechanical Engineering Department at MCKV Institute of Engineering, India.</p>
<p><b>The book is written for experienced and aspiring machine learning specialists looking to implement solutions to real-world machine learning problems.</b></p> <p><i>Machine Learning Algorithms and Applications</i> shows how one can easily adopt machine learning to build solutions for small applications. It clearly explains the various applications of machine and deep learning in the medical field, animal classification, gene selection from microarray gene expression data, sentiment analysis, manufacturing, fake profile detection in social media, the farming sector, and more.</p> <p>For veteran and new machine learning specialists alike, this book thoroughly discusses the various applications of machine and deep learning techniques. Each chapter deals with a novel machine learning architecture for a specific application and compares its results with previous algorithms. To present a unified treatment of machine learning problems and solutions, many methods from different fields are discussed, including statistics, pattern recognition, neural networks, artificial intelligence, sentiment analysis, control, and data mining. Furthermore, all learning algorithms are explained in a way that makes it easy for students to move from the equations in the book to a computer program.</p> <p><b>Audience</b> The book is primarily intended for researchers, students, and professionals in computer science, information technology, cybernetics, system sciences, engineering, statistics, and social sciences, and as a reference for software professionals and practitioners in biomedical fields, manufacturing, supply chain and logistics, agriculture, and Industry 4.0.</p>

You may also be interested in these products:

Visualize This
by: Nathan Yau
EPUB ebook
28,99 €
AI for Humanity
by: Andeed Ma, James Ong, Siok Siok Tan
EPUB ebook
26,99 €
AI for Humanity
by: Andeed Ma, James Ong, Siok Siok Tan
PDF ebook
26,99 €