SwePub
Search the SwePub database


Result list for search "WFRF:(Palazoglu Ahmet)"


  • Results 1-8 of 8
1.
  • Capaci, Francesca, et al. (author)
  • On Monitoring Industrial Processes under Feedback Control
  • 2020
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 36:8, pp. 2720-2737
  • Journal article (peer-reviewed). Abstract:
    • The concurrent use of statistical process control and engineering process control involves monitoring manipulated and controlled variables. One multivariate control chart may handle the statistical monitoring of all variables, but observing the manipulated and controlled variables in separate control charts may improve understanding of how disturbances and the controller performance affect the process. In this article, we illustrate how step and ramp disturbances manifest themselves in a single-input–single-output system by studying their resulting signatures in the controlled and manipulated variables. The system is controlled by variations of the widely used proportional-integral-derivative (PID) control scheme. Implications for applying control charts for these scenarios are discussed.
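The disturbance signatures described in entry 1 are straightforward to reproduce in simulation. Below is a minimal sketch, not the authors' code: a first-order process under discrete PI control (process gain, time constant, and PI tuning are all illustrative assumptions), hit by a step disturbance halfway through the run. With integral action the step is rejected in the controlled variable y but leaves a sustained level shift in the manipulated variable u, which is why charting the two variables separately can be informative.

```python
import numpy as np

np.random.seed(0)
n, dt = 500, 1.0
a = np.exp(-dt / 10.0)        # first-order process, time constant 10 (assumed)
Kp_proc = 1.0                 # process gain (assumed)
Kc, Ti = 0.5, 5.0             # PI tuning (assumed)
setpoint = 0.0

y = np.zeros(n); u = np.zeros(n); integral = 0.0
for k in range(1, n):
    d = 1.0 if k >= 250 else 0.0                 # step disturbance at k = 250
    y[k] = (a * y[k-1] + (1 - a) * Kp_proc * u[k-1] + (1 - a) * d
            + 0.05 * np.random.randn())
    e = setpoint - y[k]                          # control error
    integral += e * dt
    u[k] = Kc * (e + integral / Ti)              # PI control law

# Integral action rejects the step in y (a transient spike) but leaves a
# sustained level shift in u -- the signature studied in the paper.
print("mean y before/after:", y[:250].mean().round(3), y[300:].mean().round(3))
print("mean u before/after:", u[:250].mean().round(3), u[300:].mean().round(3))
```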
2.
  • Gajjar, Shriram, et al. (author)
  • Least Squares Sparse Principal Component Analysis and Parallel Coordinates for Real-Time Process Monitoring
  • 2020
  • In: Industrial & Engineering Chemistry Research. - : American Chemical Society (ACS). - 0888-5885 .- 1520-5045. ; 59:35, pp. 15656-15670
  • Journal article (peer-reviewed). Abstract:
    • The unprecedented growth of machine-readable data throughout modern industrial systems has major repercussions for process monitoring activities. In contrast to model-based process monitoring that requires the physical and mathematical knowledge of the system in advance, the data-driven schemes provide an efficient alternative to extract and analyze process information directly from recorded process data. This paper introduces the least squares sparse principal component analysis to obtain readily interpretable sparse principal components. This is done in the context of parallel coordinates, which facilitate the visualization of high dimensional data. The key contribution is the establishment of control limits on independent sparse principal component and residual spaces to facilitate fault detection, complemented by the use of the Random Forests algorithm to carry out the fault diagnosis step. The proposed method is applied to the Tennessee Eastman process to highlight its merits.
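A hedged sketch of the detect-then-diagnose pipeline that entry 2 describes, with scikit-learn's SparsePCA standing in for the paper's least squares sparse PCA, synthetic data in place of the Tennessee Eastman benchmark, and a simple empirical percentile in place of the control limits derived in the paper:

```python
import numpy as np
from sklearn.decomposition import SparsePCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X_normal = rng.standard_normal((500, 10))
X_normal[:, :4] += rng.standard_normal((500, 1))   # correlated variable block

# Sparse components and a T^2-like statistic on the sparse-score space;
# the paper also places limits on the residual space.
spca = SparsePCA(n_components=3, random_state=1).fit(X_normal)
T = spca.transform(X_normal)
sd = T.std(axis=0)
limit = np.percentile(np.sum((T / sd) ** 2, axis=1), 99)   # empirical 99% limit

# Toy fault: a bias on variable 0. A Random Forest handles the diagnosis step.
X_fault = X_normal + np.r_[3.0, np.zeros(9)]
X_all = np.vstack([X_normal, X_fault])
y_all = np.r_[np.zeros(500), np.ones(500)]
clf = RandomForestClassifier(n_estimators=100, random_state=1)
clf.fit(spca.transform(X_all), y_all)

t2_fault = np.sum((spca.transform(X_fault) / sd) ** 2, axis=1)
print("detection rate:", (t2_fault > limit).mean())
print("diagnosis accuracy:", clf.score(spca.transform(X_all), y_all))
```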
3.
  • Gajjar, Shriram, et al. (author)
  • Real-time fault detection and diagnosis using sparse principal component analysis
  • 2018
  • In: Journal of Process Control. - : Elsevier. - 0959-1524 .- 1873-2771. ; 67, pp. 112-128
  • Journal article (peer-reviewed). Abstract:
    • With the emergence of smart factories, large volumes of process data are collected and stored at high sampling rates for improved energy efficiency, process monitoring and sustainability. The data collected in the course of enterprise-wide operations consists of information from broadly deployed sensors and other control equipment. Interpreting such large volumes of data with limited workforce is becoming an increasingly common challenge. Principal component analysis (PCA) is a widely accepted procedure for summarizing data while minimizing information loss. It does so by finding new variables, the principal components (PCs) that are linear combinations of the original variables in the dataset. However, interpreting PCs obtained from many variables from a large dataset is often challenging, especially in the context of fault detection and diagnosis studies. Sparse principal component analysis (SPCA) is a relatively recent technique proposed for producing PCs with sparse loadings via variance-sparsity trade-off. Using SPCA, some of the loadings on PCs can be restricted to zero. In this paper, we introduce a method to select the number of non-zero loadings in each PC while using SPCA. The proposed approach considerably improves the interpretability of PCs while minimizing the loss of total variance explained. Furthermore, we compare the performance of PCA- and SPCA-based techniques for fault detection and fault diagnosis. The key features of the methodology are assessed through a synthetic example and a comparative study of the benchmark Tennessee Eastman process.
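The interpretability-versus-variance trade-off that entry 3 addresses can be seen by sweeping the sparsity penalty and counting the surviving loadings. The sketch below uses scikit-learn's SparsePCA on synthetic data; it illustrates the trade-off, not the paper's selection method:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 8))
X[:, :3] += rng.standard_normal((300, 1))   # three correlated variables

for alpha in (0.1, 0.5, 1.0, 2.0):
    spca = SparsePCA(n_components=2, alpha=alpha, random_state=0).fit(X)
    nnzl = np.count_nonzero(spca.components_, axis=1)   # non-zero loadings per PC
    T = spca.transform(X)
    # Variance of the (non-orthogonal) sparse scores as a rough proxy for
    # the total variance the components retain.
    var = T.var(axis=0).sum() / X.var(axis=0).sum()
    print(f"alpha={alpha}: NNZL per PC = {nnzl}, variance proxy = {var:.2f}")
```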
4.
  • Gajjar, Shriram, et al. (author)
  • Selection of Non-zero Loadings in Sparse Principal Component Analysis
  • 2017
  • In: Chemometrics and Intelligent Laboratory Systems. - : Elsevier. - 0169-7439 .- 1873-3239. ; 162, pp. 160-171
  • Journal article (peer-reviewed). Abstract:
    • Principal component analysis (PCA) is a widely accepted procedure for summarizing data through dimensional reduction. In PCA, the selection of the appropriate number of components and the interpretation of those components have been the key challenging features. Sparse principal component analysis (SPCA) is a relatively recent technique proposed for producing principal components with sparse loadings via the variance-sparsity trade-off. Although several techniques for deriving sparse loadings have been offered, no detailed guidelines for choosing the penalty parameters to obtain a desired level of sparsity are provided. In this paper, we propose the use of a genetic algorithm (GA) to select the number of non-zero loadings (NNZL) in each principal component while using SPCA. The proposed approach considerably improves the interpretability of principal components and addresses the difficulty in the selection of NNZL in SPCA. Furthermore, we compare the performance of PCA and SPCA in uncovering the underlying latent structure of the data. The key features of the methodology are assessed through a synthetic example, pitprops data and a comparative study of the benchmark Tennessee Eastman process.
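A toy genetic algorithm in the spirit of entry 4, not the authors' implementation: each chromosome holds the number of non-zero loadings (NNZL) for two components, fitness trades explained variance against a sparsity cost, and simple truncation of ordinary PCA loadings stands in for SPCA. All tuning constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 8))
X[:, :3] += rng.standard_normal((300, 1))          # correlated block
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # ordinary PCA loadings

def fitness(nnzl):
    """Explained variance of truncated loadings minus a sparsity cost."""
    total = 0.0
    for k, v in zip(nnzl, Vt[:2]):
        w = v.copy()
        w[np.argsort(np.abs(w))[:-k]] = 0.0        # keep the k largest loadings
        w /= np.linalg.norm(w)
        total += (Xc @ w).var()
    return total - 0.05 * sum(nnzl)

pop = rng.integers(1, 9, size=(20, 2))             # NNZL chromosomes in 1..8
for _ in range(30):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-10:]]        # truncation selection
    children = parents[rng.integers(0, 10, size=20)].copy()
    pop = np.clip(children + rng.integers(-1, 2, children.shape), 1, 8)

print("best NNZL per component:", pop[np.argmax([fitness(c) for c in pop])])
```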
5.
  • Gajjar, Shriram, et al. (author)
  • Use of Sparse Principal Component Analysis (SPCA) for Fault Detection
  • 2016
  • In: IFAC-PapersOnLine. - : Elsevier BV. - 2405-8963. ; 49:7, pp. 693-698
  • Journal article (peer-reviewed). Abstract:
    • Principal component analysis (PCA) has been widely used for data dimension reduction and process fault detection. However, interpreting the principal components and the outcomes of PCA-based monitoring techniques is a challenging task since each principal component is a linear combination of the original variables which can be numerous in most modern applications. To address this challenge, we first propose the use of sparse principal component analysis (SPCA) where the loadings of some variables in principal components are restricted to zero. This paper then describes a technique to determine the number of non-zero loadings in each principal component. Furthermore, we compare the performance of PCA and SPCA in fault detection. The validity and potential of SPCA are demonstrated through simulated data and a comparative study with the benchmark Tennessee Eastman process.
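Entry 5's PCA-versus-SPCA comparison can be mimicked on a toy bias fault. The control limits below are plain empirical percentiles and the data is synthetic, so this is a hedged illustration of the comparison, not a reproduction of the paper's study:

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(2)
X = rng.standard_normal((400, 10))
X[:, :4] += rng.standard_normal((400, 1))        # correlated block
X_fault = X + np.r_[2.5, np.zeros(9)]            # bias fault on variable 0

for name, model in [("PCA", PCA(n_components=3)),
                    ("SPCA", SparsePCA(n_components=3, random_state=2))]:
    model.fit(X)
    T = model.transform(X)
    sd = T.std(axis=0)
    limit = np.percentile(np.sum((T / sd) ** 2, axis=1), 99)
    t2_fault = np.sum((model.transform(X_fault) / sd) ** 2, axis=1)
    print(f"{name}: detection rate = {(t2_fault > limit).mean():.2f}")
```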
6.
  • Gao, Huihui, et al. (author)
  • Process Knowledge Discovery Using Sparse Principal Component Analysis
  • 2016
  • In: Industrial & Engineering Chemistry Research. - : American Chemical Society (ACS). - 0888-5885 .- 1520-5045. ; 55:46, pp. 12046-12059
  • Journal article (peer-reviewed). Abstract:
    • As the goals of ensuring process safety and energy efficiency become ever more challenging, engineers increasingly rely on data collected from such processes for informed decision making. During recent decades, extracting and interpreting valuable process information from large historical data sets have been an active area of research. Among the methods used, principal component analysis (PCA) is a well-established technique that allows for dimensionality reduction for large data sets by finding new uncorrelated variables, namely principal components (PCs). However, it is difficult to interpret the derived PCs, as each PC is a linear combination of all of the original variables and the loadings are typically nonzero. Sparse principal component analysis (SPCA) is a relatively recent technique proposed for producing PCs with sparse loadings via the variance–sparsity trade-off. We propose a forward SPCA approach that helps uncover the underlying process knowledge regarding variable relations. This approach systematically determines the optimal sparse loadings for each sparse PC while improving interpretability and minimizing information loss. The salient features of the proposed approach are demonstrated through the Tennessee Eastman process simulation. The results indicate how knowledge and process insight can be discovered through a systematic analysis of sparse loadings.
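The knowledge-discovery idea in entry 6 reduces, at its simplest, to reading off which variables share non-zero loadings on each sparse component. The sketch below uses scikit-learn's SparsePCA as a stand-in for the paper's forward SPCA, on synthetic data with two known variable groups:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(3)
n = 400
base1, base2 = rng.standard_normal((2, n))       # two latent driving factors
X = np.column_stack([base1 + 0.1 * rng.standard_normal(n) for _ in range(3)]
                    + [base2 + 0.1 * rng.standard_normal(n) for _ in range(3)])

spca = SparsePCA(n_components=2, alpha=1.0, random_state=3).fit(X)
names = [f"var{i}" for i in range(X.shape[1])]
for j, comp in enumerate(spca.components_):
    group = [names[i] for i in np.flatnonzero(comp)]
    # Variables loading together on a sparse PC hint at a shared cause.
    print(f"sparse PC{j + 1} groups: {group}")
```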
7.
  • Kaya, Gizem Kuşoğlu, et al. (author)
  • A Study of Spectral Envelope Method for Multi-Cause Diagnosis using Industrial Data
  • 2021
  • In: 31st European Symposium on Computer Aided Process Engineering. - : Elsevier. ; pp. 1331-1337
  • Conference paper (peer-reviewed). Abstract:
    • Petroleum refineries are complex systems that consist of multiple integrated units. This situation makes it difficult to track down the root cause of abnormal situations that occur during production. It is noted that abnormal situations usually trigger plant-wide oscillations in a number of measured process variables. Therefore, root cause detection is often attempted to be carried out by examining these trends in process data. Observing multiple effects and underlying problems at the same time presents a challenge in determining the root cause by examining trends only. In this study, spectral envelope is used to detect oscillations by identifying the variables and categorizing them based on a statistical hypothesis test which produces Oscillation Contribution Index (OCI) in order to isolate potential root cause variables. Two distinct abnormal events in the hydrocracker unit that occur simultaneously were successfully isolated and the root causes could be assigned by using the spectral envelope analysis.
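A rough sketch of spectral-envelope-style oscillation detection, echoing entry 7: estimate a cross-spectral matrix from windowed FFTs, take its leading eigenvalue at each frequency as the envelope, and read the leading eigenvector's magnitudes as a crude stand-in for the paper's Oscillation Contribution Index (OCI). The data, window length, and statistic are simplifying assumptions, not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 2048, 5
t = np.arange(n)
X = rng.standard_normal((n, m))
X[:, 0] += np.sin(2 * np.pi * 0.05 * t)          # shared oscillation at f = 0.05
X[:, 2] += 0.7 * np.sin(2 * np.pi * 0.05 * t)    # appears in variables 0 and 2

# Averaged periodogram estimate of the m x m cross-spectral matrix.
seg = 256
F = np.stack([np.fft.rfft(X[i:i + seg] - X[i:i + seg].mean(0), axis=0)
              for i in range(0, n - seg + 1, seg)])       # (segments, freq, m)
S = np.einsum("sfi,sfj->fij", F, F.conj()) / F.shape[0]   # Hermitian per freq

freqs = np.fft.rfftfreq(seg)
env = np.array([np.linalg.eigvalsh(S[f])[-1] for f in range(len(freqs))])
k = env[1:].argmax() + 1                                  # skip the DC bin
_, vecs = np.linalg.eigh(S[k])
print(f"dominant oscillation near f = {freqs[k]:.3f}")
print("per-variable contributions:", np.abs(vecs[:, -1]).round(2))
```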
8.
  • Udugama, Isuru A., et al. (author)
  • The Role of Big Data in Industrial (Bio)chemical Process Operations
  • 2020
  • In: Industrial & Engineering Chemistry Research. - : American Chemical Society (ACS). - 0888-5885 .- 1520-5045. ; 59:34, pp. 15283-15297
  • Journal article (peer-reviewed). Abstract:
    • With the emergence of Industry 4.0 and Big Data initiatives, there is a renewed interest in leveraging the vast amounts of data collected in (bio)chemical processes to improve their operations. The objective of this article is to provide a perspective of the current status of Big-Data-based process control methodologies and the most effective path to further embed these methodologies in the control of (bio)chemical processes. Therefore, this article provides an overview of operational requirements, the availability and the nature of data, and the role of the control structure hierarchy in (bio)chemical processes and how they constrain this endeavor. The current state of the seemingly competing methodologies of statistical process monitoring and (engineering) process control is examined together with hybrid methodologies that are attempting to combine tools and techniques that belong to either camp. The technical and economic considerations of a deeper integration between the two approaches is then explored, and a path forward is proposed.
  • Results 1-8 of 8
