SwePub
Search the SwePub database



Search: WFRF:(Kulahci Murat)

  • Result 1-50 of 128
1.
  • Aakjær, Mia, et al. (author)
  • Surveillance of Antidepressant Safety (SADS) : Active Signal Detection of Serious Medical Events Following SSRI and SNRI Initiation Using Big Healthcare Data
  • 2021
  • In: Drug Safety. - : Springer. - 0114-5916 .- 1179-1942. ; 44, s. 1215-1230
  • Journal article (peer-reviewed)abstract
    • Introduction: The current process for generating evidence in pharmacovigilance has several limitations, which often lead to delays in the evaluation of drug-associated risks. Objectives: In this study, we proposed and tested a near real-time epidemiological surveillance system using sequential, cumulative analyses focusing on the detection and preliminary risk quantification of potential safety signals following initiation of selective serotonin reuptake inhibitors (SSRIs) and serotonin-norepinephrine reuptake inhibitors (SNRIs). Methods: We emulated an active surveillance system in an historical setting by conducting repeated annual cohort studies using nationwide Danish healthcare data (1996–2016). Outcomes were selected from the European Medicines Agency's Designated Medical Event list, summaries of product characteristics, and the literature. We followed patients for a maximum of 6 months from treatment initiation to the event of interest or censoring. We performed Cox regression analyses adjusted for standard sets of covariates. Potential safety signals were visualized using heat maps and cumulative hazard ratio (HR) plots over time. Results: In the total study population, 969,667 new users were included and followed for 461,506 person-years. We detected potential safety signals with incidence rates as low as 0.9 per 10,000 person-years. Having eight different exposure drugs and 51 medical events, we identified 31 unique combinations of potential safety signals with a positive association to the event of interest in the exposed group. We proposed that these signals were designated for further evaluation once they appeared in a prospective setting. In total, 21 (67.7%) of these were not present in the current summaries of product characteristics. Conclusion: The study demonstrated the feasibility of performing epidemiological surveillance using sequential, cumulative analyses. Larger populations are needed to evaluate rare events and infrequently used antidepressants.
  •  
2.
  • Almimi, Ashraf A., et al. (author)
  • Checking the adequacy of fit of models from split-plot designs
  • 2009
  • In: Journal of Quality Technology. - : Informa UK Limited. - 0022-4065. ; 41:3, s. 272-284
  • Journal article (peer-reviewed)abstract
    • One of the main features that distinguish split-plot experiments from other experiments is that they involve two types of experimental errors: the whole-plot (WP) error and the subplot (SP) error. Taking this into consideration is very important when computing measures of adequacy of fit for split-plot models. In this article, we propose the computation of two R², R²-adjusted, prediction error sums of squares (PRESS), and R²-prediction statistics to measure the adequacy of fit for the WP and the SP submodels in a split-plot design. This is complemented with the graphical analysis of the two types of errors to check for any violation of the underlying assumptions and the adequacy of fit of split-plot models. Using examples, we show how computing two measures of model adequacy of fit for each split-plot design model is appropriate and useful as they reveal whether the correct WP and SP effects have been included in the model and describe the predictive performance of each group of effects.
  •  
3.
  • Almimi, Ashraf A., et al. (author)
  • Follow-up designs to resolve confounding in split-plot experiments
  • 2008
  • In: Journal of Quality Technology. - : Informa UK Limited. - 0022-4065. ; 40:2, s. 154-166
  • Journal article (peer-reviewed)abstract
    • Split-plot designs are effective in industry due to time and/or cost constraints, restriction on randomization of the treatment combinations of the hard-to-change factors, and different sizes of experimental units. Some of the results of fractional factorial split-plot experiments can be ambiguous and a need may arise to conduct follow-up experiments to separate effects of potential interest by breaking their alias links with others. For completely randomized fractional factorial experiments, methods have been developed to construct follow-up experiments. In this article, we extend the foldover technique to break the alias chains of split-plot experiments. Because it is impractical or not economically possible to foldover the whole-plot factors, as their levels are often hard or expensive to change, the focus of this article is on folding over only one or more subplot factors in order to de-alias certain effects. Six rules are provided to develop foldovers for minimum aberration resolution III and resolution IV fractional factorial split-plot designs.
  •  
4.
  • Andersen, Emil B., et al. (author)
  • An easy to use GUI for simulating big data using Tennessee Eastman process
  • 2022
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 38:1, s. 264-282
  • Journal article (peer-reviewed)abstract
    • Data-driven process monitoring and control techniques and their application to industrial chemical processes are gaining popularity due to the current focus on Industry 4.0, digitalization and the Internet of Things. However, for the development of such techniques, there are significant barriers that must be overcome in obtaining sufficiently large and reliable datasets. As a result, the use of real plant and process data in developing and testing data-driven process monitoring and control tools can be difficult without investing significant efforts in acquiring, treating, and interpreting the data. Therefore, researchers need a tool that effortlessly generates large amounts of realistic and reliable process data without the requirement for additional data treatment or interpretation. In this work, we propose a data generation platform based on the Tennessee Eastman Process simulation benchmark. A graphical user interface (GUI) developed in MATLAB Simulink is presented that enables users to generate massive amounts of data for testing the applicability of big data concepts in the realm of process control for continuous time-dependent processes. An R-Shiny app that interacts with the data generation tool is also presented for illustration purposes. The app can visualize the results generated by the Tennessee Eastman Process and can carry out standard fault detection and diagnosis studies based on PCA. The data generator GUI is available free of charge for research purposes at https://github.com/dtuprodana/TEP.
  •  
5.
  • Andersen, Emil B., et al. (author)
  • Big Data Generation for Time Dependent Processes : The Tennessee Eastman Process for Generating Large Quantities of Process Data
  • 2020
  • In: 30th European Symposium on Computer Aided Process Engineering. - : Elsevier. ; , s. 1309-1314
  • Conference paper (peer-reviewed)abstract
    • The concept of applying data-driven process monitoring and control techniques on industrial chemical processes is well established. With concepts such as Industry 4.0, Big Data and the Internet of Things receiving attention in industrial chemical production, there is a renewed focus on data-driven process monitoring and control in chemical production applications. However, there are significant barriers that must be overcome in obtaining sufficiently large and reliable plant and process data from industrial chemical processes for the development of data-driven process monitoring and control concepts, specifically in obtaining plant and process data that are required to develop and test data-driven process monitoring and control tools without investing significant efforts in acquiring, treating and interpreting the data. In this manuscript, a big data generation tool is presented that is based on the Tennessee Eastman Process (TEP) simulation benchmark, which has been specifically designed to generate massive amounts of process data without spending significant effort in setting up. The tool can be configured to carry out a large number of data generation runs both using a graphical user interface (GUI) and through a .CSV file. The output from the tool is a file containing process data for all runs as well as process faults (deviations) that have been activated. This tool enables users to generate massive amounts of data for testing the applicability of big data concepts in the realm of process control for continuously operating, time-dependent processes. The tool is available for all researchers and other parties who are interested.
  •  
6.
  • Bach Andersen, Peter, et al. (author)
  • Added Value of Individual Flexibility Profiles of Electric Vehicle Users For Ancillary Services
  • 2018
  • In: 2018 IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (SmartGridComm). - : IEEE.
  • Conference paper (peer-reviewed)abstract
    • Vehicle-Grid Integration (VGI) research may serve to limit the self-induced adverse effects of electric vehicles (EVs) in terms of additional grid loading, but also to make the EV an active asset in supporting a stable, economic power system based on renewable energy. Any use of the vehicle for grid services requires an accurate understanding of the user's driving needs. This paper proposes the introduction of a user profile, describing the energy requirements for driving in terms of an energy deadline, target and minimum. To explore the use of such a profile, the paper analyses data from a Danish pilot project where the driving patterns of ten electric Nissan e-NV200 vans are investigated in terms of leave times and energy consumption. It is shown that the data can be fitted with a log-normal distribution that can be used to establish a per user profile which provides a certain statistical probability of fulfilling the driving needs while allowing an aggregator to optimize earnings. Initially, aggregators may apply similar driving assumptions across an entire fleet. Considering that the driving needs of individual EV owners are different, statistical representations of the individual behaviour may result in more flexibility, and thereby time, for providing grid services. The paper quantifies the value of such added flexibility based on the Danish market for frequency containment reserves.
  •  
7.
  • Bekki, Jennifer M., et al. (author)
  • Simulation-based cycle-time quantile estimation in manufacturing settings employing non-FIFO dispatching policies
  • 2009
  • In: Journal of Simulation. - : Informa UK Limited. - 1747-7778 .- 1747-7786. ; 3:2, s. 69-83
  • Journal article (peer-reviewed)abstract
    • Previous work shows that a combination of the Cornish-Fisher Expansion (CFE) with discrete-event simulation produces accurate and precise estimates of cycle-time quantiles with very little data storage, provided all workstations in the model are operating under the first-in-first-out (FIFO) dispatching rule. The accuracy of the approach degrades, however, as non-FIFO dispatching policies are employed in at least one workstation. This paper proposes the use of a power transformation for use in combination with the CFE to combat these accuracy problems. The suggested approach is detailed, and three methods for selecting the λ parameter of the power transformation are given. The results of a thorough empirical evaluation of each of the three approaches are given, and the advantages and drawbacks of each approach are discussed. Results show that the combination of the CFE with a power transformation generates cycle-time quantile estimates with high accuracy even for non-FIFO systems.
  •  
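Illustrative note (not taken from the paper above): the Cornish-Fisher Expansion (CFE) approximates a quantile from the first four moments of the data, which is why it needs very little data storage. The sketch below applies the standard third-order CFE to a batch of simulated cycle times; the function name, the log-normal toy data, and the 95% level are assumptions made purely for this example, and the paper's additional power transformation for non-FIFO systems is not included.

    # Minimal sketch: Cornish-Fisher quantile estimate from the first four sample
    # moments, so only a few summary statistics need to be stored per simulation run.
    import numpy as np
    from scipy import stats

    def cornish_fisher_quantile(x, p):
        """Approximate the p-quantile of x via the third-order Cornish-Fisher expansion."""
        x = np.asarray(x, dtype=float)
        mu, sigma = x.mean(), x.std(ddof=1)
        g1 = stats.skew(x)                    # sample skewness
        g2 = stats.kurtosis(x)                # sample excess kurtosis
        z = stats.norm.ppf(p)                 # standard normal quantile
        w = (z + (z**2 - 1) * g1 / 6
               + (z**3 - 3 * z) * g2 / 24
               - (2 * z**3 - 5 * z) * g1**2 / 36)
        return mu + sigma * w

    # Hypothetical example: 95th-percentile cycle time from simulated output.
    cycle_times = np.random.lognormal(mean=2.0, sigma=0.4, size=10_000)
    print(cornish_fisher_quantile(cycle_times, 0.95))
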
8.
  • Bisgaard, Søren, et al. (author)
  • Checking process stability with the variogram
  • 2005
  • In: Quality Engineering. - 0898-2112 .- 1532-4222. ; 17:2, s. 323-327
  • Journal article (peer-reviewed)abstract
    • Modern quality control methods are increasingly being used to monitor complex industrial processes. A key requirement for such methods is the derivation of long records. Once such records are obtained, the variogram becomes a simple and useful exploratory tool that can be used by quality professionals to investigate whether a process is stationary or not.
  •  
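Illustrative note (a generic sketch, not the authors' code): a standardized variogram compares the variance of differences m steps apart with the variance of one-step differences, G(m) = Var(x[t+m] − x[t]) / Var(x[t+1] − x[t]). For a stationary process G(m) levels off as m grows, while steadily increasing values point to nonstationarity. The random-walk example below is an assumption chosen only for illustration.

    # Standardized variogram as a quick stationarity check on a long process record.
    import numpy as np

    def variogram(x, max_lag=20):
        x = np.asarray(x, dtype=float)
        denom = np.var(np.diff(x), ddof=1)            # variance of one-step differences
        return np.array([np.var(x[m:] - x[:-m], ddof=1) / denom
                         for m in range(1, max_lag + 1)])

    # Example: a random walk (nonstationary), for which G(m) keeps growing with m.
    rng = np.random.default_rng(1)
    print(variogram(np.cumsum(rng.normal(size=500)))[:5])
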
9.
  • Bisgaard, Søren, et al. (author)
  • Finding assignable causes
  • 2000
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 12:4, s. 633-640
  • Journal article (peer-reviewed)
  •  
10.
  •  
11.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Box-Cox transformations and time series modeling - Part I
  • 2008
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 20:3, s. 376-388
  • Journal article (peer-reviewed)abstract
    • A demonstration of determining a Box-Cox transformation in the context of seasonal time series modeling is provided. The first step is postulating a general class of statistical models and transformations, then identifying a transformation and a model to be tentatively entertained, estimating parameters in the tentatively entertained and fitted model, checking the transformation, and so on. To start this iterative process, a number of graphical methods are typically applied. Graphical determination of an appropriate transformation may include the log transformation and the use of a range-mean chart. Following this step is the identification of an appropriate ARIMA time series model. The Box-Cox family of transformations is continuous in λ and contains the log transformation as a special case. It does require repeated model fittings, but these can be done relatively quickly with standard time series software.
  •  
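For reference, the Box-Cox family mentioned in the abstract above is conventionally written as follows (a standard textbook form, not quoted from the column):

    y_t^{(\lambda)} =
    \begin{cases}
      \dfrac{y_t^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\
      \ln y_t, & \lambda = 0,
    \end{cases}

which is continuous in λ and reduces to the log transformation in the limit λ → 0.
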
12.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Time series model selection and parsimony
  • 2009
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 21:3, s. 341-353
  • Journal article (peer-reviewed)abstract
    • Choosing an adequate model for a given set of data is considered one of the more difficult tasks in time series analysis; even experienced analysts can have a hard time selecting an appropriate model. One popular approach that has been discussed is the use of certain numerical criteria, which are believed to be a useful input to the decision-making process. However, relying solely on such criteria when choosing a model is not advisable; combining judgement with the information criteria is preferred. Specifically, a parsimonious mixed autoregressive moving average (ARMA) model is favored, as it takes into account the context of the model as well as what is being modeled and what the model is to be used for.
  •  
13.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Practical time series modeling
  • 2007
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 19:3, s. 253-262
  • Journal article (peer-reviewed)abstract
    • Time series analysis is important in modern quality monitoring and control. The analysis has no precise recipe and no single true, final answer. There are three general classes of stationary time series models: autoregressive (AR), moving average (MA), and the mixed autoregressive moving average (ARMA) model. If the data are nonstationary, differencing is necessary before fitting an ARMA model to the data. The formulations for AR(p), MA(q) and ARMA(p,q) have zero intercept, which is attained by subtracting the average from the stationary data before modeling the process. For nonstationary data that need to be differenced once or twice, adding a nonzero intercept term to the model implies that there is an underlying deterministic first- or second-order polynomial trend in the data. In reality, the type of model and the order necessary to adequately model a given process are not known. Hence, there is a need to determine the model that best fits the data by looking at the autocorrelation function (ACF) and the partial autocorrelation function (PACF). Since time series modeling requires judgment and experience, an iterative modeling approach is suggested. Once the model is fitted, diagnostic checks are conducted using the ACF and PACF. Series C, consisting of 226 observations of the temperature of a chemical pilot plant, is used as an example.
  •  
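Illustrative note (a generic sketch, not code from the column above): the iterative identify-fit-check cycle it describes can be mimicked with standard time series software. The AR(1) order and the simulated 226-point series below are assumptions chosen only to mirror the kind of data discussed.

    # Iterative ARMA modeling: inspect ACF/PACF, tentatively fit a model,
    # then check the residuals for remaining structure.
    import numpy as np
    from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    x = np.zeros(226)                         # simulated stand-in for a process record
    for t in range(1, 226):
        x[t] = 0.8 * x[t - 1] + rng.normal()

    plot_acf(x, lags=25)                      # slowly decaying ACF suggests an AR term
    plot_pacf(x, lags=25)                     # PACF cutting off after lag 1 suggests AR(1)

    model = ARIMA(x, order=(1, 0, 0)).fit()   # tentatively entertain an AR(1) model
    print(model.summary())
    plot_acf(model.resid, lags=25)            # residual ACF should show no structure left
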
14.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : The application of principal component analysis for process monitoring
  • 2006
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 18:1, s. 95-103
  • Journal article (peer-reviewed)abstract
    • An overview of graphical techniques that are useful when dealing with process monitoring is given. The focus is on contemporaneous correlation. Specifically, principal component analysis (PCA), a method akin to a Pareto analysis, is demonstrated, and the geometry of PCA is described to enhance intuition.
  •  
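Illustrative note (a generic sketch on assumed data, not the column's own example): PCA applied to standardized process variables gives a Pareto-like ranking of how much of the contemporaneous correlation a few components explain, and the leading scores give a low-dimensional view of the process that can then be monitored.

    # PCA on standardized process data: a Pareto-like decomposition of the
    # contemporaneous correlation structure.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(500, 2))                   # two hidden driving factors
    X = latent @ rng.normal(size=(2, 6)) + 0.3 * rng.normal(size=(500, 6))

    Z = StandardScaler().fit_transform(X)                # work on the correlation scale
    pca = PCA().fit(Z)
    print(pca.explained_variance_ratio_.cumsum())        # first few PCs dominate
    scores = pca.transform(Z)[:, :2]                     # low-dimensional view to monitor
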
15.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Forecasting with seasonal time series models
  • 2008
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 20:2, s. 250-260
  • Journal article (peer-reviewed)abstract
    • Forecasting is increasingly a part of the standard toolkit among quality engineers, specifically in the domain of Six Sigma and quality engineering that deals with operational problems in manufacturing and service organizations. One of the most versatile approaches is the so-called Box-Jenkins approach using regular and seasonal integrated autoregressive moving average models. The international airline data have been used with a seasonal autoregressive integrated moving average (ARIMA) time series model to demonstrate how seasonal ARIMA models can be used to model cyclic data and how the model can be used for short-term forecasting.
  •  
16.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Process regime changes
  • 2007
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 19:1, s. 83-87
  • Journal article (peer-reviewed)abstract
    • Gaining an understanding of process behavior and exploring the relationships between process variables are important prerequisites for quality improvement. In any diagnosis of a process, the quality engineer needs to try to understand and interpret relationships between inputs and outputs as well as between intermediate variables. Regime changes occasionally occur in the process engineering context. The telltale sign of a regime change is most easily seen in scatter plots. Graphical analysis proves useful in the early diagnostic phase of analyzing processes suspected of having undergone regime changes.
  •  
17.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Using a time series model for process adjustment and control
  • 2008
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 20:1, s. 134-141
  • Journal article (peer-reviewed)abstract
    • The behavior of a chemical manufacturing process can be characterized with a time series model, and such a model can be used to control and adjust the manufacturing process. Time series control of a process amounts to predicting whether the process will deviate excessively from the target in the next time period and using the predicted difference to make a compensatory adjustment in the opposite direction. A detailed example is provided of how a nonstationary time series model can be used to develop two types of charts: one for periodic adjustments of the process counteracting the naturally occurring common cause variability, and one more traditional control chart based on the residuals to look for special causes. The nonstationary time series model requires acknowledging that processes are inherently nonstationary and working from that more realistic assumption, rather than from the traditional Shewhart model of a fixed error distribution around a constant mean. Once the process drifts too far from a given target, it is adjusted by bringing its level back to target, while the data coming from the process continue to conform to the assumed nonstationary time series model.
  •  
18.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Box-Cox transformations and time series modeling - Part II
  • 2008
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 20:4, s. 516-523
  • Journal article (peer-reviewed)abstract
    • The sales data from the engineering firm called Company X pose a "Big Q" type of problem involving issues of time series analysis and the use of data transformations. According to Chatfield and Prothero (CP), a seasonal ARIMA model fitted to the log of the sales data produced unrealistic forecasts. The application of Box-Cox transformations to Company X's sales data provided a "useful" transformation of these data. CP tried to find "useful" models that characterize the dynamics in the particular data appropriately and thus produce sensible forecasts. The forecasting model proposed by CP and the alternative model proposed by Box and Jenkins are analyzed; both types of models prove to give quite reasonable forecasts.
  •  
19.
  •  
20.
  •  
21.
  •  
22.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Beware of autocorrelation in regression
  • 2007
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 19:2, s. 143-148
  • Journal article (peer-reviewed)abstract
    • In the quality engineering context, the problem of trying to assess whether there exists a relationship between several inputs and an output of a process is often encountered. The main reason for spurious relationships between time series is that two unrelated time series that are internally autocorrelated can, sometimes by chance, produce very large cross correlations. Perhaps the safest approach to assessing the relationship between the input and the output of a process when the data are autocorrelated is to use prewhitening.
  •  
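Illustrative note (a generic sketch, not the column's code): prewhitening fits a time series model to the input, filters both the input and the output with that same model, and only then cross-correlates the two residual series. The AR(1) filter and the simulated series with a true lag-2 relationship are assumptions made for this example.

    # Prewhitening before cross-correlation, to avoid spurious correlations
    # between internally autocorrelated series.
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(3)
    x = np.zeros(500)
    for t in range(1, 500):                       # autocorrelated input series
        x[t] = 0.9 * x[t - 1] + rng.normal()
    y = np.roll(x, 2) + rng.normal(size=500)      # output related to the input at lag 2

    ar = AutoReg(x, lags=1, trend="c").fit()      # prewhitening filter estimated from x
    phi = ar.params[-1]                           # AR(1) coefficient
    xw = ar.resid                                 # prewhitened input
    yw = y[1:] - phi * y[:-1]                     # output passed through the same filter

    def ccf(a, b, lag):                           # cross-correlation at a positive lag
        a, b = a - a.mean(), b - b.mean()
        return np.corrcoef(a[:-lag], b[lag:])[0, 1]

    print([round(ccf(xw, yw, k), 2) for k in range(1, 5)])   # should peak near lag 2
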
23.
  • Bisgaard, Søren, et al. (author)
  • Quality quandaries : Studying input-output relationships, part II
  • 2006
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 18:3, s. 405-410
  • Journal article (peer-reviewed)abstract
    • When analyzing process data to see if there exists an input variable that can be used to control an output variable, one should be aware of the possibility of spurious relationships. One way to check for this possibility is to carefully analyze the residuals. If they show signs of autocorrelation, the apparent relationship may be spurious. An effective method for checking such a relationship is that of William S. Gosset.
  •  
24.
  •  
25.
  • Bisgaard, Søren, et al. (author)
  • Robust product design : Saving trials with split-plot confounding
  • 2001
  • In: Quality Engineering. - : Informa UK Limited. - 0898-2112 .- 1532-4222. ; 13:3, s. 525-530
  • Journal article (peer-reviewed)abstract
    • Robust product experimentation is described as an important quality engineering activity. A cake mix experiment is used to illustrate how split-plot confounding can eliminate the low resolution of standard inner and outer array designs. The split-plot design furnished the necessary amount of information owing to the switching between fractions. The robustness of the product was improved by identifying the interaction between environmental and design factors.
  •  
26.
  •  
27.
  • Box, George E.P., et al. (author)
  • Performance Evaluation of Dynamic Monitoring Systems : The Waterfall Chart
  • 2003
  • In: Quality Engineering. - 0898-2112 .- 1532-4222. ; 16:2, s. 183-191
  • Journal article (peer-reviewed)abstract
    • Computers are increasingly employed to monitor the performance of complex systems. An important issue is how to evaluate the performance of such monitors. In this article we introduce a three-dimensional representation that we call a "waterfall chart" of the probability of an alarm as a function of time and the condition of the system. It combines and shows the conceptual relationship between the cumulative distribution function of the run length and the power function. The value of this tool is illustrated with an application to Page's one-sided Cusum algorithm. However, it can be applied in general for any monitoring system.
  •  
28.
  • Cacciarelli, Davide, et al. (author)
  • A novel fault detection and diagnosis approach based on orthogonal autoencoders
  • 2022
  • In: Computers and Chemical Engineering. - : Elsevier. - 0098-1354 .- 1873-4375. ; 163
  • Journal article (peer-reviewed)abstract
    • In recent years, there have been studies focusing on the use of different types of autoencoders (AEs) for monitoring complex nonlinear data coming from industrial and chemical processes. However, in many cases the focus was placed on detection. As a result, practitioners are encountering problems in trying to interpret such complex models and obtaining candidate variables for root cause analysis once an alarm is raised. This paper proposes a novel statistical process control (SPC) framework based on orthogonal autoencoders (OAEs). OAEs regularize the loss function to ensure no correlation among the features of the latent variables. This is extremely beneficial in SPC tasks, as it allows for the invertibility of the covariance matrix when computing the Hotelling T2 statistic, significantly improving detection and diagnosis performance when the process variables are highly correlated. To support the fault diagnosis and identification analysis, we propose an adaptation of the integrated gradients (IG) method. Numerical simulations and the benchmark Tennessee Eastman Process are used to evaluate the performance of the proposed approach by comparing it to traditional approaches such as principal component analysis (PCA) and kernel PCA (KPCA). In the analysis, we explore how the information useful for fault detection and diagnosis is stored in the intermediate layers of the encoder network. We also investigate how the correlation structure of the data affects the detection and diagnosis of faulty variables. The results show how the combination of OAEs and IG represents a compelling and ready-to-use solution, offering improved detection and diagnosis performances over the traditional methods.
  •  
29.
  • Cacciarelli, Davide, et al. (author)
  • Active learning for data streams: a survey
  • 2024
  • In: Machine Learning. - : Springer Nature. - 0885-6125 .- 1573-0565. ; 113:1, s. 185-239
  • Research review (peer-reviewed)abstract
    • Online active learning is a paradigm in machine learning that aims to select the most informative data points to label from a data stream. The problem of minimizing the cost associated with collecting labeled observations has gained a lot of attention in recent years, particularly in real-world applications where data is only available in an unlabeled form. Annotating each observation can be time-consuming and costly, making it difficult to obtain large amounts of labeled data. To overcome this issue, many active learning strategies have been proposed in the last decades, aiming to select the most informative observations for labeling in order to improve the performance of machine learning models. These approaches can be broadly divided into two categories: static pool-based and stream-based active learning. Pool-based active learning involves selecting a subset of observations from a closed pool of unlabeled data, and it has been the focus of many surveys and literature reviews. However, the growing availability of data streams has led to an increase in the number of approaches that focus on online active learning, which involves continuously selecting and labeling observations as they arrive in a stream. This work aims to provide an overview of the most recently proposed approaches for selecting the most informative observations from data streams in real time. We review the various techniques that have been proposed and discuss their strengths and limitations, as well as the challenges and opportunities that exist in this area of research.
  •  
30.
  •  
31.
  • Cacciarelli, Davide, et al. (author)
  • Robust online active learning
  • 2024
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 40:1, s. 277-296
  • Journal article (peer-reviewed)abstract
    • In many industrial applications, obtaining labeled observations is not straightforward as it often requires the intervention of human experts or the use of expensive testing equipment. In these circumstances, active learning can be highly beneficial in suggesting the most informative data points to be used when fitting a model. Reducing the number of observations needed for model development alleviates both the computational burden required for training and the operational expenses related to labeling. Online active learning, in particular, is useful in high-volume production processes where the decision about the acquisition of the label for a data point needs to be taken within an extremely short time frame. However, despite the recent efforts to develop online active learning strategies, the behavior of these methods in the presence of outliers has not been thoroughly examined. In this work, we investigate the performance of online active linear regression in contaminated data streams. Our study shows that the currently available query strategies are prone to sample outliers, whose inclusion in the training set eventually degrades the predictive performance of the models. To address this issue, we propose a solution that bounds the search area of a conditional D-optimal algorithm and uses a robust estimator. Our approach strikes a balance between exploring unseen regions of the input space and protecting against outliers. Through numerical simulations, we show that the proposed method is effective in improving the performance of online active learning in the presence of outliers, thus expanding the potential applications of this powerful tool.
  •  
32.
  •  
33.
  • Cacciarelli, Davide, et al. (author)
  • Stream-based active learning with linear models
  • 2022
  • In: Knowledge-Based Systems. - : Elsevier. - 0950-7051 .- 1872-7409. ; 254
  • Journal article (peer-reviewed)abstract
    • The proliferation of automated data collection schemes and the advances in sensorics are increasing the amount of data we are able to monitor in real-time. However, given the high annotation costs and the time required by quality inspections, data is often available in an unlabeled form. This is fostering the use of active learning for the development of soft sensors and predictive models. In production, instead of performing random inspections to obtain product information, labels are collected by evaluating the information content of the unlabeled data. Several query strategy frameworks for regression have been proposed in the literature but most of the focus has been dedicated to the static pool-based scenario. In this work, we propose a new strategy for the stream-based scenario, where instances are sequentially offered to the learner, which must instantaneously decide whether to perform the quality check to obtain the label or discard the instance. The approach is inspired by the optimal experimental design theory and the iterative aspect of the decision-making process is tackled by setting a threshold on the informativeness of the unlabeled data points. The proposed approach is evaluated using numerical simulations and the Tennessee Eastman Process simulator. The results confirm that selecting the examples suggested by the proposed algorithm allows for a faster reduction in the prediction error.
  •  
34.
  • Capaci, Francesca, et al. (author)
  • A two-step procedure for fault detection in the Tennessee Eastman Process simulator
  • 2016
  • Conference paper (peer-reviewed)abstract
    • High-technological and complex production processes and high availability and sample frequencies of data in large scale industrial processes need the concurrent development of appropriate statistical control tools and monitoring techniques. Therefore, multivariate control charts based on latent variables are essential tools to detect and isolate process faults. Several Statistical Process Control (SPC) charts have been developed for multivariate and megavariate data, such as the Hotelling T2, MCUSUM and MEWMA control charts as well as charts based on principal component analysis (PCA) and dynamic PCA (DPCA). The ability of SPC procedures based on PCA (Kourti, MacGregor 1995) or DPCA (Ku et al. 1995) to detect and isolate process disturbances for a large number of highly correlated (and time-dependent in the case of DPCA) variables has been demonstrated in the literature. However, we argue that the fault isolation capability and the fault detection rate for processes can be improved further for processes operating under feedback control loops (in closed loop). The purpose of this presentation is to illustrate a two-step method where [1] the variables are pre-classified prior to the analysis and [2] the monitoring scheme based on latent variables is implemented. Step 1 involves a structured qualitative classification of the variables to guide the choice of which variables to monitor in Step 2. We argue that the proposed method will be useful for many practitioners of SPC based on latent variables techniques in processes operating in closed loop. It will allow clearer fault isolation and detection and an easier implementation of corrective actions. A case study based on the data available from the Tennessee Eastman Process simulator under feedback control loops (Matlab) will be presented. The results from the proposed method are compared with currently available methods through simulations in R statistics software.
  •  
35.
  • Capaci, Francesca (author)
  • Adapting Experimental and Monitoring Methods for Continuous Processes under Feedback Control : Challenges, Examples, and Tools
  • 2019
  • Doctoral thesis (other academic/artistic)abstract
    • Continuous production covers a significant part of today’s industrial manufacturing. Consumer goods purchased on a frequent basis, such as food, drugs, and cosmetics, and capital goods such as iron, chemicals, oil, and ore come through continuous processes. Statistical process control (SPC) and design of experiments (DoE) play important roles as quality control and product and process improvement methods. SPC reduces product and process variation by eliminating assignable causes, while DoE shows how products and processes may be improved through systematic experimentation and analysis. Special issues emerge when applying these methods to continuous process settings, such as the need to simultaneously analyze massive time series of autocorrelated and cross-correlated data. Another important characteristic of most continuous processes is that they operate under engineering process control (EPC), as in the case of feedback controllers. Feedback controllers transform processes into closed-loop systems and thereby increase the process and analysis complexity and application of SPC and DoE methods that need to be adapted accordingly. For example, the quality characteristics or process variables to be monitored in a control chart or the experimental factors in an experiment need to be chosen considering the presence of feedback controllers. The main objective of this thesis is to suggest adapted strategies for applying experimental and monitoring methods (namely, DoE and SPC) to continuous processes under feedback control. Specifically, this research aims to [1] identify, explore, and describe the potential challenges when applying SPC and DoE to continuous processes; [2] propose and illustrate new or adapted SPC and DoE methods to address some of the issues raised by the presence of feedback controllers; and [3] suggest potential simulation tools that may be instrumental in SPC and DoE methods development. The results are summarized in five appended papers. Through a literature review, Paper A outlines the SPC and DoE implementation challenges for managers, researchers, and practitioners. For example, the problems due to process transitions, the multivariate nature of data, serial correlation, and the presence of EPC are discussed. Paper B describes the issues and potential strategies in designing and analyzing experiments on processes operating under closed-loop control. Two simulated examples in the Tennessee Eastman (TE) process simulator show the benefits of using DoE methods to improve these industrial processes. Paper C provides guidelines on how to use the revised TE process simulator under a decentralized control strategy as a testbed for SPC and DoE methods development in continuous processes. Papers D and E discuss the concurrent use of SPC in processes under feedback control. Paper D further illustrates how step and ramp disturbances manifest themselves in single-input single-output processes controlled by variations in the proportional-integral-derivative control and discusses the implications for process monitoring. Paper E describes a two-step monitoring procedure for multivariate processes and explains the process and controller performance when out-of-control process conditions occur.
  •  
36.
  • Capaci, Francesca (author)
  • Contributions to the Use of Statistical Methods for Improving Continuous Production
  • 2017
  • Licentiate thesis (other academic/artistic)abstract
    • Complexity of production processes, high computing capabilities, and massive datasets characterize today’s manufacturing environments, such as those of continuous and batch production industries. Continuous production has spread gradually across different industries, covering a significant part of today’s production. Common consumer goods such as food, drugs, and cosmetics, and industrial goods such as iron, chemicals, oil, and ore come from continuous processes. To stay competitive in today’s market requires constant process improvements in terms of both effectiveness and efficiency. Statistical process control (SPC) and design of experiments (DoE) techniques can play an important role in this improvement strategy. SPC attempts to reduce process variation by eliminating assignable causes, while DoE is used to improve products and processes by systematic experimentation and analysis. However, special issues emerge when applying these methods in continuous process settings. Highly automated and computerized processes provide an exorbitant amount of serially dependent and cross-correlated data, which may be difficult to analyze simultaneously. Time series data, transition times, and closed-loop operation are examples of additional challenges that the analyst faces. The overall objective of this thesis is to contribute to the use of statistical methods, namely SPC and DoE methods, to improve continuous production. Specifically, this research serves two aims: [1] to explore, identify, and outline potential challenges when applying SPC and DoE in continuous processes, and [2] to propose simulation tools and new or adapted methods to overcome the identified challenges. The results are summarized in three appended papers. Through a literature review, Paper A outlines SPC and DoE implementation challenges for managers, researchers, and practitioners. For example, problems due to process transitions, the multivariate nature of data, serial correlation, and the presence of engineering process control (EPC) are discussed. Paper B further explores one of the DoE challenges identified in Paper A. Specifically, Paper B describes issues and potential strategies when designing and analyzing experiments in processes operating under closed-loop control. Two simulated examples in the Tennessee Eastman (TE) process simulator show the benefits of using DoE techniques to improve and optimize such industrial processes. Finally, Paper C provides guidelines, using flow charts, on how to use the continuous process simulator, “The revised TE process simulator,” run with a decentralized control strategy as a test bed for developing SPC and DoE methods in continuous processes. Simulated SPC and DoE examples are also discussed.
  •  
37.
  • Capaci, Francesca, et al. (author)
  • Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control
  • 2017
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 33:7, s. 1601-1614
  • Journal article (peer-reviewed)abstract
    • Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios
  •  
38.
  • Capaci, Francesca, et al. (author)
  • Managerial implications for improving continuous production processes
  • 2017
  • Conference paper (peer-reviewed)abstract
    • Data analytics remains essential for process improvement and optimization. Statistical process control and design of experiments are among the most powerful process and product improvement methods available. However, continuous process environments challenge the application of these methods. In this article, we highlight SPC and DoE implementation challenges described in the literature for managers, researchers and practitioners interested in continuous production process improvement. The results may help managers support the implementation of these methods and make researchers and practitioners aware of methodological challenges in continuous process environments.
  •  
39.
  • Capaci, Francesca, et al. (author)
  • On Monitoring Industrial Processes under Feedback Control
  • 2020
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 36:8, s. 2720-2737
  • Journal article (peer-reviewed)abstract
    • The concurrent use of statistical process control and engineering process control involves monitoring manipulated and controlled variables. One multivariate control chart may handle the statistical monitoring of all variables, but observing the manipulated and controlled variables in separate control charts may improve understanding of how disturbances and the controller performance affect the process. In this article, we illustrate how step and ramp disturbances manifest themselves in a single-input–single-output system by studying their resulting signatures in the controlled and manipulated variables. The system is controlled by variations of the widely used proportional-integral-derivative (PID) control scheme. Implications for applying control charts for these scenarios are discussed.
  •  
40.
  • Capaci, Francesca, et al. (author)
  • Simulating and Analyzing Experiments in the Tennessee Eastman Process Simulator
  • 2015
  • In: ENBIS-15.
  • Conference paper (peer-reviewed)abstract
    • In many of today’s continuous processes, the data collection is usually performed automatically, yielding an exorbitant amount of data on various quality characteristics and inputs to the system. Moreover, such data are usually collected at high frequency, introducing significant serial dependence in time. This violates the independent data assumption of many industrial statistics methods used in process improvement studies. These studies often involve controlled experiments to unearth the causal relationships to be used for robustness and optimization purposes. However, real production processes are not suitable for studying new experimental methodologies, partly because unknown disturbances/experimental settings may lead to erroneous conclusions. Moreover, large scale experimentation in production processes is frowned upon due to consequent disturbances and production delays. Hence, realistic simulation of such processes offers an excellent opportunity for experimentation and methodological development. One commonly used process simulator is the Tennessee Eastman (TE) challenge chemical process simulator (Downs & Vogel, 1993) [1]. The process produces two products from four reactants, containing 41 measured variables and 12 manipulated variables. In addition to the process description, the problem statement defines process constraints, 20 types of process disturbances, and six operating modes corresponding to different production rates and mass ratios in the product stream. The purpose of this paper is to illustrate the use of the TE process with an appropriate feedback control as a test-bed for the methodological developments of new experimental design and analysis techniques. The paper illustrates how two-level experimental designs can be used to identify how the input factors affect the outputs in a chemical process. Simulations using Matlab/Simulink software are used to study the impact of e.g. process disturbances, closed loop control and autocorrelated data on different experimental arrangements. The experiments are analysed using a time series analysis approach to identify input-output relationships in a process operating in closed-loop with multivariate responses. The dynamics of the process are explored and the necessary run lengths for stable effect estimates are discussed.
  •  
41.
  • Capaci, Francesca, et al. (author)
  • Simulating Experiments in Closed-Loop Control Systems
  • 2016
  • In: ENBIS-16 in Sheffield.
  • Conference paper (peer-reviewed)abstract
    • Design of Experiments (DoE) literature extensively discusses how to properly plan, conduct and analyze experiments for process and product improvement. However, it is typically assumed that the experiments are run on processes operating in open-loop: the changes in experimental factors are directly visible in process responses and are not hidden by (automatic) feedback control. Under this assumption, DoE methods have been successfully applied in process industries such as chemical, pharmaceutical and biological industries. However, the increasing instrumentation, automation and interconnectedness are changing how the processes are run. Processes often involve engineering process control as in the case of closed-loop systems. The closed-loop environment adds complexity to experimentation and analysis since the experimenter must account for the control actions that may aim to keep a response variable at its set-point value. The common approach to experimental design and analysis will likely need adjustments in the presence of closed-loop controls. Careful consideration is for instance needed when the experimental factors are chosen. Moreover, the impact of the experimental factors may not be directly visible as changes in the response variables (Hild, Sanders, & Cooper, 2001). Instead other variables may need to be used as proxies for the intended response variable(s). The purpose of this presentation is to illustrate how experiments in closed-loop systems can be planned and analyzed. A case study based on the Tennessee Eastman Process simulator run with a decentralized feedback control strategy (Matlab) (Lawrence Ricker, 1996) is discussed and presented.
  •  
42.
  • Capaci, Francesca, et al. (author)
  • The Revised Tennessee Eastman Process Simulator as Testbed for SPC and DoE Methods
  • 2019
  • In: Quality Engineering. - : Taylor & Francis. - 0898-2112 .- 1532-4222. ; 31:2, s. 212-229
  • Journal article (peer-reviewed)abstract
    • Engineering process control and high-dimensional, time-dependent data present great methodological challenges when applying statistical process control (SPC) and design of experiments (DoE) in continuous industrial processes. Process simulators with an ability to mimic these challenges are instrumental in research and education. This article focuses on the revised Tennessee Eastman process simulator providing guidelines for its use as a testbed for SPC and DoE methods. We provide flowcharts that can support new users to get started in the Simulink/Matlab framework, and illustrate how to run stochastic simulations for SPC and DoE applications using the Tennessee Eastman process.
  •  
43.
  • Capehart, Shay R., et al. (author)
  • Designing fractional factorial split-plot experiments using integer programming
  • 2011
  • In: International Journal of Experimental Design and Process Optimisation. - 2040-2252 .- 2040-2260. ; 2:1, s. 34-57
  • Journal article (peer-reviewed)abstract
    • Split-plot designs are commonly used in industrial experiments when there are hard-to-change and easy-to-change factors. Due to the number of factors and resource limitations, it is more practical to run a fractional factorial split-plot (FFSP) design. These designs are variations of the fractional factorial (FF) design, with the restricted randomisation structure to account for the whole plots and subplots. We discuss the formulation of FFSP designs using integer programming (IP) to achieve various design criteria. We specifically look at the maximum number of clear two-factor interactions and variations on this criterion.
  •  
44.
  • Conseil-Gudla, Helene, et al. (author)
  • Transient risk of water layer formation on PCBAs in different climates: Climate data analysis and experimental study
  • 2022
  • In: Microelectronics and reliability. - : Elsevier. - 0026-2714 .- 1872-941X. ; 136
  • Journal article (peer-reviewed)abstract
    • The reliability of electronic devices depends on the environmental loads at which they are exposed. Climatic conditions vary greatly from one geographical location to another (from hot and humid to cold and dry areas), and the temperature and humidity vary from season to season and from day to day. High levels of temperature and relative humidity mean high water content in the air, but saturated conditions (i.e. 100 % RH) can also be reached at low temperatures. This paper analyses the relationship between temperature, dew point temperature, their difference (here called ΔT), and occurrence and time period of dew point closeness to temperature on transient condensation effects on electronics. This paper has two parts: (i) Data analysis of typical climate profiles within the different zones of the Köppen-Geiger classification to pick up conditions where ΔT is very low (for example ≤0.4 °C). Various summary statistics of these events are calculated in order to assess the temperature at which these events happen, their durations and their frequency and (ii) Empirical investigation of the effect of ΔT ≤ 0.4 °C on the reliability of electronics by mimicking an electronic device, for which the time period of the ΔT is varied in one set of experiments, and the ambient temperature is varied in the other. The effect of the packaging of the electronics is also studied in this section. The statistical study of the climate profiles shows that the transient events (ΔT ≤ 0.4 °C) occur in almost every location, at different temperature levels, with a duration of at least one observation (where observations were hourly in the database). The experimental results show that presence of the enclosure, cleanliness and bigger pitch size reduce the levels of leakage current, while similar high levels of leakage current are observed for the different durations of the transient events, indicating that these climatic transient conditions can have a big impact on the electronics reliability.
  •  
45.
  • Dehlendorff, Christian, et al. (author)
  • Analysis of computer experiments with multiple noise sources
  • 2010
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 26:2, s. 137-146
  • Journal article (peer-reviewed)abstract
    • In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models.
  •  
46.
  • Dehlendorff, Christian, et al. (author)
  • Conditional Value at Risk as a Measure for Waiting Time in Simulations of Hospital Units
  • 2010
  • In: Quality Technology & Quantitative Management. - : Informa UK Limited. - 1684-3703 .- 1811-4857. ; 7:3, s. 321-336
  • Journal article (peer-reviewed)abstract
    • The utility of conditional value at risk (CVaR) of a sample of waiting times as a measure for reducing long waiting times is evaluated with special focus on patient waiting times in a hospital. CVaR is the average of the longest waiting times, i.e., a measure at the tail of the waiting time distribution. The presented results are based on a discrete event simulation (DES) model of an orthopedic surgical unit at a university hospital in Denmark. Our analysis shows that CVaR offers a highly reliable performance measure. The measure targets the longest waiting times and these are generally accepted to be the most problematic from the points of view of both the patients and the management. Moreover, CVaR can be seen as a compromise between the well known measures: average waiting time and the maximum waiting time
  •  
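Illustrative note (a generic sketch; the 95% tail level and the gamma-distributed waiting times are assumptions, not values from the paper above): CVaR here is simply the average of the waiting times beyond a chosen quantile, i.e. a tail average that sits between the overall mean and the maximum waiting time.

    # Conditional value at risk (CVaR) of waiting times: the average of the
    # longest waiting times, beyond a chosen quantile of the distribution.
    import numpy as np

    def cvar(waiting_times, alpha=0.95):
        w = np.asarray(waiting_times, dtype=float)
        threshold = np.quantile(w, alpha)
        return w[w >= threshold].mean()

    # Hypothetical example with simulated waiting times (minutes).
    rng = np.random.default_rng(4)
    waits = rng.gamma(shape=2.0, scale=15.0, size=5_000)
    print(f"mean = {waits.mean():.1f}, max = {waits.max():.1f}, CVaR(95%) = {cvar(waits):.1f}")
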
47.
  • Dehlendorff, Christian, et al. (author)
  • Designing simulation experiments with controllable and uncontrollable factors
  • 2008
  • In: 2008 Winter Simulation Conference. - Piscataway, NJ : IEEE Communications Society. - 9781424427079 ; , s. 2909-2915
  • Conference paper (peer-reviewed)abstract
    • In this study we propose a new method for designing computer experiments inspired by the split plot designs used in physical experimentation. The basic layout is that each set of controllable factor settings corresponds to a whole plot for which a number of subplots, each corresponding to one combination of settings of the uncontrollable factors, is employed. The caveat is a desire that the subplots within each whole plot cover the design space uniformly. A further desire is that in the combined design, where all experimental runs are considered at once, the uniformity of the design space coverage should be guaranteed. Our proposed method allows for a large number of uncontrollable and controllable settings to be run in a limited number of runs while uniformly covering the design space for the uncontrollable factors
  •  
48.
  • Dehlendorff, Christian, et al. (author)
  • Designing simulation experiments with controllable and uncontrollable factors for applications in healthcare
  • 2011
  • In: The Journal of the Royal Statistical Society, Series C. - : Oxford University Press (OUP). - 0035-9254 .- 1467-9876. ; 60:1, s. 31-49
  • Journal article (peer-reviewed)abstract
    • We propose a new methodology for designing computer experiments that was inspired by the split-plot designs that are often used in physical experimentation. The methodology has been developed for a simulation model of a surgical unit in a Danish hospital. We classify the factors as controllable and uncontrollable on the basis of their characteristics in the physical system. The experiments are designed so that, for a given setting of the controllable factors, the various settings of the uncontrollable factors cover the design space uniformly. Moreover the methodology allows for overall uniform coverage in the combined design when all settings of the uncontrollable factors are considered at once
  •  
49.
  • Elias, Russel J., et al. (author)
  • An overview of short-term statistical forecasting methods
  • 2006
  • In: International Journal of Management Science and Engineering Management. - : Informa UK Limited. - 1750-9653 .- 1750-9661. ; 1:1, s. 17-36
  • Journal article (peer-reviewed)abstract
    • An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques for evaluating and monitoring forecast performance are also summarized
  •  
50.
  • Elias, Russel J., et al. (author)
  • Demand signal modelling: A short-range panel forecasting algorithm for semiconductor firm device-level demand
  • 2008
  • In: European Journal of Industrial Engineering. - 1751-5254 .- 1751-5262. ; 2:3, s. 253-278
  • Journal article (peer-reviewed)abstract
    • A model-based approach to the forecasting of short-range product demand within the semiconductor industry is presented. Device-level forecast models are developed via a novel two-stage stochastic algorithm that permits leading indicators to be optimally blended with smoothed estimates of unit-level demand. Leading indicators include backlog, bookings, delinquencies, inventory positions, and distributor resales. Group level forecasts are easily obtained through upwards aggregation of the device level forecasts. The forecasting algorithm is demonstrated at two major US-based semiconductor manufacturers. The first application involves a product family consisting of 254 individual devices with a 26-month training dataset and eight-month ex situ validation dataset. A subsequent demonstration refines the approach, and is demonstrated across a panel of six high volume devices with a 29-month training dataset and a 13-month ex situ validation dataset. In both implementations, significant improvement is realised versus legacy forecasting systems
  •  
Type of publication
journal article (101)
conference paper (18)
research review (4)
book (2)
other publication (1)
doctoral thesis (1)
licentiate thesis (1)
Type of content
peer-reviewed (118)
other academic/artistic (9)
pop. science, debate, etc. (1)
Author/Editor
Kulahci, Murat (127)
Bisgaard, Søren (27)
Vanhatalo, Erik (15)
Bergquist, Bjarne (13)
Capaci, Francesca (11)
Montgomery, Douglas ... (9)
Palazoglu, Ahmet (8)
Cacciarelli, Davide (6)
Tyssedal, John Sølve (6)
Khan, Abdul Rauf (6)
Spooner, Max (5)
Frumosu, Flavia Dali ... (5)
Gajjar, Shriram (5)
Dehlendorff, Christi ... (4)
Andersen, Klaus Kaae ... (4)
Andersen, Morten (3)
Almimi, Ashraf A. (3)
Udugama, Isuru A. (3)
Gernaey, Krist V. (3)
Box, George E.P. (3)
Schioler, Henrik (3)
McClary, Daniel W. (3)
Li, Jing (2)
Vännman, Kerstin (2)
Andersen, Emil B. (2)
Bayer, Christoph (2)
Kauppila, Osmo (2)
Graves, Spencer B. (2)
Marko, Kenneth A. (2)
James, John V. (2)
van Gilder, John F. (2)
Ting, Tom (2)
Zatorski, Hal (2)
Wu, Cuiping (2)
De Ketelaere, Bart (2)
Liang, David (2)
Conseil-Gudla, Helen ... (2)
Ambat, Rajan (2)
Sessa, Maurizio (2)
Clemmensen, Line Har ... (2)
Elias, Russel J. (2)
Nielsen, Bo Friis (2)
Frumosu, Flavia D. (2)
Ørnskov Rønsch, Geor ... (2)
Shan, Shuo (2)
Gronskyte, Ruta (2)
Hviid, Marchen Sonja (2)
Guyonvarch, Estelle (2)
Ramin, Elham (2)
Spooner, Max Peter (2)
University
Luleå University of Technology (128)
Umeå University (1)
Language
English (127)
Swedish (1)
Research subject (UKÄ/SCB)
Engineering and Technology (119)
Natural sciences (12)
Medical and Health Sciences (1)
Social Sciences (1)
