SwePub
Search the SwePub database


Results list for search "WFRF:(Altarabichi Mohammed Ghaith 1981 )"

  • Results 1-9 of 9
1.
  • Altarabichi, Mohammed Ghaith, 1981- (author)
  • Evolving intelligence : Overcoming challenges for Evolutionary Deep Learning
  • 2024
  • Doctoral thesis (other academic/artistic), abstract:
    • Deep Learning (DL) has achieved remarkable results in both academic and industrial fields over the last few years. However, DL models are often hard to design and require proper selection of features and tuning of hyper-parameters to achieve high performance. These selections are tedious for human experts and require substantial time and resources, a difficulty that has encouraged a growing number of researchers to use Evolutionary Computation (EC) algorithms to optimize Deep Neural Networks (DNN); a research branch called Evolutionary Deep Learning (EDL). This thesis is a two-fold exploration within the domains of EDL and, more broadly, Evolutionary Machine Learning (EML). The first goal is to make EDL/EML algorithms more practical by reducing the high computational cost associated with EC methods. In particular, we have proposed methods to alleviate the computational burden using approximate models. We show that surrogate models can speed up EC methods by three times without compromising the quality of the final solutions. Our surrogate-assisted approach allows EC methods to scale better for both expensive learning algorithms and large datasets with over 100K instances. Our second objective is to leverage EC methods to advance our understanding of Deep Neural Network (DNN) design. We identify a knowledge gap in DL algorithms and introduce an EC algorithm precisely designed to optimize this uncharted aspect of DL design. Our analytical focus revolves around revealing avant-garde concepts and acquiring novel insights. In our study of randomness techniques in DNNs, we offer insights into the design and training of more robust and generalizable neural networks. In another study, we propose a novel survival regression loss function discovered through evolutionary search.
2.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Extracting Invariant Features for Predicting State of Health of Batteries in Hybrid Energy Buses
  • 2021
  • In: 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA), Porto, Portugal, 6-9 Oct. 2021. IEEE, pp. 1-6
  • Conference paper (peer-reviewed), abstract:
    • Batteries are safety-critical and the most expensive component of electric vehicles (EVs). To ensure the reliability of EVs in operation, it is crucial to monitor the state of health of those batteries. Monitoring their deterioration is also relevant to the sustainability of transport solutions, by enabling an efficient strategy for utilizing the remaining capacity of the battery and its second life. Electric buses, like other EVs, come in many different variants, including different configurations and operating conditions. Developing new degradation models for each existing combination of settings can become challenging from different perspectives, such as the unavailability of failure data for novel settings, heterogeneity in data, the low amount of data available for less popular configurations, and the lack of sufficient engineering knowledge. Therefore, being able to automatically transfer a machine learning model to new settings is crucial. More concretely, the aim of this work is to extract features that are invariant across different settings. In this study, we propose an evolutionary method, called genetic algorithm for domain invariant features (GADIF), that selects a set of features to be used for training machine learning models in such a way as to maximize the invariance across different settings. A Genetic Algorithm, with each chromosome being a binary vector signaling the selection of features, is equipped with a specific fitness function encompassing both the task performance and the domain shift. We contrast the performance, in migrating to unseen domains, of our method against a number of classical feature selection methods without any transfer learning mechanism. Moreover, in the experimental results section, we analyze how different features are selected under different settings. The results show that using invariant features leads to better generalization of the machine learning models to an unseen domain.
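The GADIF setup described in the abstract above (a binary chromosome selecting features, scored by a fitness that rewards task performance and penalizes domain shift) can be sketched roughly as follows. The nearest-centroid accuracy proxy, the mean-spread shift measure, and the alpha weighting are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def gadif_fitness(mask, domains, alpha=0.5):
    """Score a binary feature mask (illustrative GADIF-style sketch).

    domains: list of (X, y) pairs, one per operating setting.
    Combines task performance (a nearest-centroid accuracy proxy) with an
    invariance term penalizing domain shift of the selected features.
    alpha is an assumed trade-off weight, not taken from the paper.
    """
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return 0.0
    # Task performance: mean nearest-centroid accuracy per domain.
    accs = []
    for X, y in domains:
        Xs = X[:, idx]
        centroids = {c: Xs[y == c].mean(axis=0) for c in np.unique(y)}
        classes = np.array(sorted(centroids))
        dists = np.stack([np.linalg.norm(Xs - centroids[c], axis=1)
                          for c in classes])
        pred = classes[dists.argmin(axis=0)]
        accs.append((pred == y).mean())
    performance = float(np.mean(accs))
    # Domain shift: spread of per-domain feature means (smaller = more invariant).
    means = np.stack([X[:, idx].mean(axis=0) for X, _ in domains])
    shift = float(means.std(axis=0).mean())
    return alpha * performance - (1 - alpha) * shift
```

A GA would then evolve the mask to maximize this score, so features that are both predictive and stable across settings are favored.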
3.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Fast Genetic Algorithm for feature selection — A qualitative approximation approach
  • 2023
  • In: Expert Systems with Applications. Oxford: Elsevier. ISSN 0957-4174, e-ISSN 1873-6793. Vol. 211
  • Journal article (peer-reviewed), abstract:
    • Evolutionary Algorithms (EAs) are often challenging to apply in real-world settings since evolutionary computations involve a large number of evaluations of a typically expensive fitness function. For example, an evaluation could involve training a new machine learning model. An approximation (also known as meta-model or a surrogate) of the true function can be used in such applications to alleviate the computation cost. In this paper, we propose a two-stage surrogate-assisted evolutionary approach to address the computational issues arising from using Genetic Algorithm (GA) for feature selection in a wrapper setting for large datasets. We define “Approximation Usefulness” to capture the necessary conditions to ensure correctness of the EA computations when an approximation is used. Based on this definition, we propose a procedure to construct a lightweight qualitative meta-model by the active selection of data instances. We then use a meta-model to carry out the feature selection task. We apply this procedure to the GA-based algorithm CHC (Cross generational elitist selection, Heterogeneous recombination and Cataclysmic mutation) to create a Qualitative approXimations variant, CHCQX. We show that CHCQX converges faster to feature subset solutions of significantly higher accuracy (as compared to CHC), particularly for large datasets with over 100K instances. We also demonstrate the applicability of the thinking behind our approach more broadly to Swarm Intelligence (SI), another branch of the Evolutionary Computation (EC) paradigm with results of PSOQX, a qualitative approximation adaptation of the Particle Swarm Optimization (PSO) method. A GitHub repository with the complete implementation is available. © 2022 The Author(s)
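"Approximation Usefulness", as defined above, captures when a cheap meta-model can safely stand in for the true fitness. A minimal sketch of one way to test this, assuming Spearman rank correlation over a sample of candidate solutions as the usefulness measure (an illustrative choice, not necessarily the paper's exact criterion):

```python
import numpy as np

def rank_correlation(true_scores, surrogate_scores):
    """Spearman rank correlation between true and surrogate fitness values
    (ties not handled; sufficient for a sketch)."""
    def ranks(v):
        order = np.argsort(v)
        r = np.empty(len(v))
        r[order] = np.arange(len(v))
        return r
    rt = ranks(np.asarray(true_scores, dtype=float))
    rs = ranks(np.asarray(surrogate_scores, dtype=float))
    rt -= rt.mean()
    rs -= rs.mean()
    return float((rt * rs).sum() / np.sqrt((rt**2).sum() * (rs**2).sum()))

def is_useful(true_scores, surrogate_scores, threshold=0.9):
    """Call a surrogate 'useful' if it preserves the ranking of candidates
    above a chosen threshold (the threshold is an assumption for illustration)."""
    return rank_correlation(true_scores, surrogate_scores) >= threshold
```

The point of a qualitative meta-model is exactly this: it only needs to order candidate feature subsets correctly, not reproduce their exact fitness values.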
4.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Fast Genetic Algorithm For Feature Selection — A Qualitative Approximation Approach
  • 2023
  • In: Evolutionary Computation Conference Companion (GECCO ’23 Companion), July 15–19, 2023, Lisbon, Portugal. New York, NY: Association for Computing Machinery (ACM). ISBN 9798400701207, pp. 11-12
  • Conference paper (peer-reviewed), abstract:
    • We propose a two-stage surrogate-assisted evolutionary approach to address the computational issues arising from using Genetic Algorithm (GA) for feature selection in a wrapper setting for large datasets. The proposed approach involves constructing a lightweight qualitative meta-model by sub-sampling data instances and then using this meta-model to carry out the feature selection task. We define "Approximation Usefulness" to capture the necessary conditions that allow the meta-model to lead the evolutionary computations to the correct maximum of the fitness function. Based on our procedure we create CHCQX, a Qualitative approXimations variant of the GA-based algorithm CHC (Cross generational elitist selection, Heterogeneous recombination and Cataclysmic mutation). We show that CHCQX converges faster to feature subset solutions of significantly higher accuracy, particularly for large datasets with over 100K instances. We also demonstrate the applicability of our approach to Swarm Intelligence (SI), with results of PSOQX, a qualitative approximation adaptation of the Particle Swarm Optimization (PSO) method. A GitHub repository with the complete implementation is available. This paper for the Hot-off-the-Press track at GECCO 2023 summarizes the original work published at [3].
      References
      [1] Mohammed Ghaith Altarabichi, Yuantao Fan, Sepideh Pashami, Peyman Sheikholharam Mashhadi, and Sławomir Nowaczyk. 2021. Extracting invariant features for predicting state of health of batteries in hybrid energy buses. In 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA). IEEE, 1-6.
      [2] Mohammed Ghaith Altarabichi, Sławomir Nowaczyk, Sepideh Pashami, and Peyman Sheikholharam Mashhadi. 2021. Surrogate-assisted genetic algorithm for wrapper feature selection. In 2021 IEEE Congress on Evolutionary Computation (CEC). IEEE, 776-785.
      [3] Mohammed Ghaith Altarabichi, Sławomir Nowaczyk, Sepideh Pashami, and Peyman Sheikholharam Mashhadi. 2023. Fast Genetic Algorithm for feature selection—A qualitative approximation approach. Expert Systems with Applications 211 (2023), 118528.
      © 2023 Copyright held by the owner/author(s).
5.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Improving Concordance Index in Regression-based Survival Analysis : Discovery of Loss Function for Neural Networks
  • 2024
  • Other publication (other academic/artistic), abstract:
    • In this work, we use an Evolutionary Algorithm (EA) to discover a novel Neural Network (NN) regression-based survival loss function with the aim of improving C-index performance. Our contribution is threefold. First, we propose an evolutionary meta-learning algorithm, SAGA$_{loss}$, for optimizing a neural-network regression-based loss function that maximizes the C-index; our algorithm consistently discovers specialized loss functions that outperform MSCE. Second, based on our analysis of the evolutionary search results, we highlight a non-intuitive insight that signifies the importance of a non-zero gradient for the censored-cases part of the loss function, a property that is shown to be useful in improving concordance. Finally, based on this insight, we propose MSCE$_{Sp}$, a novel survival regression loss function that can be used off-the-shelf and generally performs better than the Mean Squared Error for censored cases. We performed extensive experiments on 19 benchmark datasets to validate our findings.
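The non-zero-gradient insight above can be illustrated with a toy loss. A standard censored MSE contributes nothing once the prediction exceeds a censored observation's time, so those samples stop driving learning; a small always-on pull keeps their gradient alive. The exact MSCE$_{Sp}$ formulation is not reproduced here; the epsilon term below is a hypothetical stand-in.

```python
import numpy as np

def censored_mse(pred, time, event):
    """Standard MSE for censored survival regression: censored samples
    (event == 0) incur loss only when the prediction falls below the
    observed censoring time."""
    err = pred - time
    uncensored = event * err**2
    censored = (1 - event) * np.minimum(err, 0.0)**2
    return float(np.mean(uncensored + censored))

def censored_mse_nonzero(pred, time, event, eps=0.05):
    """Toy variant with a small always-on penalty for censored cases, so
    their gradient never vanishes (eps is an illustrative assumption,
    not the MSCE_Sp formulation from the paper)."""
    err = pred - time
    uncensored = event * err**2
    censored = (1 - event) * (np.minimum(err, 0.0)**2 + eps * err**2)
    return float(np.mean(uncensored + censored))
```

For a censored sample predicted past its censoring time, the first loss is flat (zero gradient) while the second still pulls the prediction, which is the property the abstract links to better concordance.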
6.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Predicting state of health and end of life for batteries in hybrid energy buses
  • 2020
  • In: Proceedings of the 30th European Safety and Reliability Conference and the 15th Probabilistic Safety Assessment and Management Conference. Singapore: Research Publishing Services. ISBN 9789811485930, p. 1231
  • Conference paper (peer-reviewed), abstract:
    • There is a major ongoing transition from fossil fuel to electricity in buses, enabling a more sustainable, environmentally friendly, and connected transportation ecosystem. Batteries are expensive, up to 30% of the total cost of the vehicle (A. Fotouhi 2016), and are considered safety-critical components of electric vehicles (EVs). As they deteriorate over time, monitoring their health status and performing maintenance proactively is crucial to achieving not only a safe and sustainable transportation system but also cost-effective operation, and thus greater market satisfaction. As a widely used indicator, the State of Health (SOH) is a measurement that reflects the current capability of the battery in comparison to an ideal condition. Accurate estimation of SOH is important to evaluate the validity of the batteries for the intended application and can be utilized as a proxy to estimate the remaining useful life (RUL) and predict the end-of-life (EOL) of batteries for maintenance planning. The SOH is computed by an on-board computing device, i.e. the battery management unit (BMU), which is commonly developed based on controlled experiments; many such methods are physical-model based approaches that depend only on the internal parameters of the battery (B. Pattipati 2008; M. H. Lipu 2018). However, the deterioration processes of batteries in hybrid and full-electric buses depend not only on the design parameters but also on the operating environment and usage patterns of the vehicle. Therefore, utilizing multiple data sources to estimate the health status and EOL of the batteries is of potential interest. In this study, a data-driven prognostic method is developed to estimate SOH and predict EOL for batteries in heterogeneous fleets of hybrid buses, using various types of data sources, e.g. physical configuration of the vehicle, deployment information, on-board sensor readings, and diagnostic fault codes.
A set of new features was generated from the existing sensor readings by inducing artificial resets on each battery replacement. A neural network-based regression model achieved accurate estimates of battery SOH status. Another network was used to indicate the EOL of batteries and the result was evaluated using battery replacement based on the current maintenance strategy. © ESREL2020-PSAM15 Organizers. Published by Research Publishing, Singapore.
7.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Rolling The Dice For Better Deep Learning Performance : A Study Of Randomness Techniques In Deep Neural Networks
  • 2024
  • In: Information Sciences. Philadelphia, PA: Elsevier. ISSN 0020-0255, e-ISSN 1872-6291. Vol. 667, pp. 1-17
  • Journal article (peer-reviewed), abstract:
    • This paper presents a comprehensive empirical investigation into the interactions between various randomness techniques in Deep Neural Networks (DNNs) and how they contribute to network performance. It is well-established that injecting randomness into the training process of DNNs, through various approaches at different stages, is often beneficial for reducing overfitting and improving generalization. However, the interactions between randomness techniques such as weight noise, dropout, and many others remain poorly understood. Consequently, it is challenging to determine which methods can be effectively combined to optimize DNN performance. To address this issue, we categorize the existing randomness techniques into four key types: data, model, optimization, and learning. We use this classification to identify gaps in the current coverage of potential mechanisms for the introduction of noise, leading us to propose two new techniques: adding noise to the loss function and random masking of the gradient updates. In our empirical study, we employ a Particle Swarm Optimizer (PSO) to explore the space of possible configurations and answer where and how much randomness should be injected to maximize DNN performance. We assess the impact of various types and levels of randomness for DNN architectures applied to standard computer vision benchmarks: MNIST, FASHION-MNIST, CIFAR10, and CIFAR100. Across more than 30,000 evaluated configurations, we perform a detailed examination of the interactions between randomness techniques and their combined impact on DNN performance. Our findings reveal that randomness in data augmentation and in weight initialization are the main contributors to performance improvement. Additionally, correlation analysis demonstrates that different optimizers, such as Adam and Gradient Descent with Momentum, prefer distinct types of randomization during the training process.
A GitHub repository with the complete implementation and generated dataset is available. © 2024 The Author(s)
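Of the two techniques proposed in the abstract above, random masking of the gradient updates is the simplest to sketch: at each step, every gradient entry is dropped with some probability before the update is applied. The masking rate and learning rate below are arbitrary illustrative values.

```python
import numpy as np

def masked_update(params, grads, lr=0.1, mask_rate=0.2, rng=None):
    """One SGD step with random masking of gradient updates: each entry
    of the gradient is zeroed with probability mask_rate before being
    applied (lr and mask_rate are illustrative, not tuned settings)."""
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(grads.shape) >= mask_rate  # True = keep this entry
    return params - lr * grads * mask
```

Because a fresh mask is drawn every step, all parameters are still updated in expectation, just with injected noise in which coordinates move at each step.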
8.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Stacking Ensembles of Heterogenous Classifiers for Fault Detection in Evolving Environments
  • 2020
  • In: Proceedings of the 30th European Safety and Reliability Conference and the 15th Probabilistic Safety Assessment and Management Conference. Singapore: Research Publishing Services. ISBN 9789811485930, p. 1068
  • Conference paper (peer-reviewed), abstract:
    • Monitoring the condition, detecting faults, and modeling the degradation of industrial equipment are important challenges in the Prognostics and Health Management (PHM) field. Our solution to the challenge demonstrated a multi-stage approach for detecting faults in a group of identical industrial equipment, composed of four identical interconnected components, deployed in an evolving environment with changes in operational and environmental conditions. In the first stage, a stacked ensemble of heterogeneous classifiers was applied to predict the state of each component of the equipment individually. In the second stage, a low-pass filter was applied to smooth the predictions cast by the stacked ensembles, utilizing temporal information of the prediction sequence. © ESREL2020-PSAM15 Organizers. Published by Research Publishing, Singapore.
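The second stage of the approach above, smoothing the per-component prediction sequence with a low-pass filter, can be sketched as follows; the moving-average kernel, window size, and 0.5 threshold stand in for whichever filter and decision rule the authors actually used.

```python
import numpy as np

def smooth_predictions(probs, window=5):
    """Stage two sketch: low-pass filter (here a simple moving average,
    an illustrative choice) over the sequence of fault probabilities cast
    by the stage-one stacked ensemble, then thresholded to fault flags."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(probs, kernel, mode="same")
    return (smoothed > 0.5).astype(int)
```

The temporal filter suppresses isolated misclassifications while keeping sustained fault indications, which is the point of exploiting the prediction sequence.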
9.
  • Altarabichi, Mohammed Ghaith, 1981-, et al. (author)
  • Surrogate-Assisted Genetic Algorithm for Wrapper Feature Selection
  • 2021
  • In: 2021 IEEE Congress on Evolutionary Computation (CEC). IEEE. ISBN 9781728183930, pp. 776-785
  • Conference paper (peer-reviewed), abstract:
    • Feature selection is an intractable problem; therefore, practical algorithms often trade off solution accuracy against computation time. In this paper, we propose a novel multi-stage feature selection framework utilizing multiple levels of approximations, or surrogates. Such a framework allows wrapper approaches to be used in a much more computationally efficient way, significantly increasing the quality of feature selection solutions achievable, especially on large datasets. We design and evaluate a Surrogate-Assisted Genetic Algorithm (SAGA) which utilizes this concept to guide the evolutionary search during the early phase of exploration. SAGA only switches to evaluating the original function at the final exploitation phase. We prove that the run-time upper bound of the SAGA surrogate-assisted stage is at worst equal to that of the wrapper GA, and that it scales better for induction algorithms of a high order of complexity in the number of instances. We demonstrate, using 14 datasets from the UCI ML repository, that in practice SAGA significantly reduces the computation time compared to a baseline wrapper Genetic Algorithm (GA), while converging to solutions of significantly higher accuracy. Our experiments show that SAGA can arrive at near-optimal solutions three times faster than a wrapper GA, on average. We also showcase the importance of the evolution control approach designed to prevent surrogates from misleading the evolutionary search towards false optima.
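At its core, the SAGA control scheme described above reduces to an evaluation schedule: score candidates with the cheap surrogate during exploration, then switch to the expensive original fitness for final exploitation. The loop below is a minimal sketch of that schedule; the (1+lambda)-style search, the switch point, and the toy fitness/surrogate pair are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def saga_style_search(true_fitness, surrogate, n_bits=16, pop=20,
                      generations=30, switch_at=20, rng=None):
    """Minimal (1+lambda)-style evolutionary loop over bit strings: the
    cheap surrogate guides the search for the first `switch_at`
    generations, after which candidates are scored with the expensive
    true fitness only (the two-phase schedule sketched in the abstract)."""
    rng = np.random.default_rng() if rng is None else rng
    best = rng.integers(0, 2, n_bits)
    for gen in range(generations):
        score = surrogate if gen < switch_at else true_fitness
        # Bit-flip mutation with per-bit probability 1/n_bits.
        children = [np.where(rng.random(n_bits) < 1.0 / n_bits,
                             1 - best, best) for _ in range(pop)]
        best = max([best] + children, key=lambda c: score(c))
    return best
```

With a surrogate that ranks candidates roughly like the true function, almost all expensive evaluations are avoided, which is where the reported speed-up comes from.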