SwePub

Results list for search "WFRF:(Vanhatalo Erik)"

  • Results 1-25 of 41
1.
  • Bergquist, Bjarne, et al. (author)
  • A Bayesian analysis of unreplicated two-level factorials using effects sparsity, hierarchy, and heredity
  • 2011
  • In: Quality Engineering. Informa UK Limited. ISSN 0898-2112, e-ISSN 1532-4222. 23:2, pp. 152-166
  • Journal article (peer-reviewed), abstract:
    • This article proposes a Bayesian procedure to calculate posterior probabilities of active effects for unreplicated two-level factorials. The results from a literature survey are used to specify individual prior probabilities for the activity of effects and the posterior probabilities are then calculated in a three-step procedure where the principles of effects sparsity, hierarchy, and heredity are successively considered. We illustrate our approach by reanalyzing experiments found in the literature.
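As an aside for readers who want to try the idea: below is a minimal Box–Meyer-style sketch in Python of computing posterior activity probabilities for effect estimates under a sparsity prior. It is illustrative only, not the authors' three-step procedure; the prior probability p, the variance-inflation factor k, and the example contrasts are assumed values.

```python
# Minimal Box-Meyer-style sketch (not the authors' exact three-step
# procedure): posterior probability that each effect is active, given
# a sparsity prior p and an inflation factor k for active effects.
import numpy as np
from scipy.stats import norm

def posterior_active(effects, p=0.2, k=10.0):
    """P(effect active | estimate) under a two-component normal mixture."""
    effects = np.asarray(effects, dtype=float)
    sigma = np.median(np.abs(effects)) / 0.6745  # robust scale of inactive effects
    like_active = norm.pdf(effects, scale=k * sigma)
    like_inactive = norm.pdf(effects, scale=sigma)
    return p * like_active / (p * like_active + (1 - p) * like_inactive)

# Hypothetical contrasts from an unreplicated 2^3 factorial (7 effects):
print(posterior_active([22.5, -1.1, 0.8, 19.0, -0.4, 1.6, 0.2]))
```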
2.
  • Bergquist, Bjarne, et al. (author)
  • Alive and kicking – but will Quality Management be around tomorrow? : A Swedish academia perspective
  • 2012
  • Conference paper (peer-reviewed), abstract:
    • Purpose: There is a lack of a recognized conception of what quality management (QM) comprises, as well as a clear roadmap of where QM is heading. The purpose of this article is to investigate how QM is perceived today by scholars at three Swedish universities, and into what QM is expected to develop in twenty years. Methodology: Data have been collected through three structured workshops using affinity diagrams with scholars teaching and performing research in the QM field, affiliated with three different Swedish universities. Findings: The results indicate that current QM is perceived similarly among the universities today, although the taxonomy differs slightly. QM is described as a fairly wide discipline consisting of a core set of principles that in turn guide the methods and tools that many currently perceive as the core of the discipline. The outlook for the future differs more, where three possible development directions for QM are seen: [1] searching for a “discipline X” where QM can contribute while keeping its toolbox, [2] focusing on a core based on the traditional quality technology toolbox with methods and tools, and [3] a risk that QM, as it is today, may cease to exist and be diffused into other disciplines. Originality/value: This article contributes a viewpoint on QM today and its future development from the academics’ perspective.
3.
  • Bergquist, Bjarne, et al. (author)
  • Alive and kicking – but will Quality Management be around tomorrow? A Swedish academia perspective
  • 2012
  • In: Quality Innovation Prosperity. Technical University of Kosice, Faculty of Materials, Metallurgy and Recycling. ISSN 1335-1745, e-ISSN 1338-984X. 16:2, pp. 1-18
  • Journal article (peer-reviewed), abstract:
    • The purpose of this article is to describe how Quality Management (QM) is perceived today by scholars at three Swedish universities, and into what QM is expected to develop in twenty years. Data were collected through structured workshops using affinity diagrams with scholars teaching and performing research in the QM field. The results show that QM is currently perceived as consisting of a core set of principles, methods and tools. The future outlook includes three possible development directions for QM: [1] searching for a “discipline X” where QM can contribute while keeping its toolbox, [2] focusing on a core based on the traditional quality technology toolbox with methods and tools, and [3] a risk that QM, as it is today, may cease to exist and be diffused into other disciplines.
4.
  • Bergquist, Bjarne, et al. (author)
  • Cleaning of Railway Track Measurement Data for Better Maintenance Decisions
  • 2019
  • In: Proceedings of the 5th International Workshop and Congress on eMaintenance. Luleå University of Technology. pp. 9-15
  • Conference paper (peer-reviewed), abstract:
    • Data of sufficient quality, quantity and validity constitute a sometimes overlooked basis for eMaintenance. Missing data, heterogeneous data types, calibration problems, or non-standard distributions are common issues in operation and maintenance data. Railway track geometry data used for maintenance planning exhibit all the above issues. They also have unique features stemming from their collection by measurement cars running along the railway network. As the track is a linear asset, measured geometry data need to be precisely located to be useful. However, since the sensors on the measurement car are moving along the track, the observations’ geographical sampling positions come with uncertainty. Another issue is that different seasons and other time restrictions (e.g. related to the timetable) prohibit regular sampling. Hence, prognostics related to remaining useful life (RUL) are challenging, since most forecasting methods require a fixed sampling frequency. This paper discusses methods for data cleaning, data condensation and data extraction from large datasets collected by measurement cars. We discuss missing data replacement, dealing with autocorrelation or cross-correlation, and the consequences of not fulfilling methodological pre-conditions, such as estimating probabilities of failures using data that do not follow the assumed distributions or data that are dependent. We also discuss outlier detection, dealing with data coming from multiple distributions and of unknown calibrations, and other issues seen in railway track geometry data. We also discuss the consequences of not addressing or mishandling quality issues of such data.
5.
  • Bergquist, Bjarne, et al. (author)
  • In-situ measurement in the iron ore pellet distribution chain using active RFID technology
  • 2020
  • In: Powder Technology. Elsevier. ISSN 0032-5910, e-ISSN 1873-328X. 361, pp. 791-802
  • Journal article (peer-reviewed), abstract:
    • The active radio frequency identification (RFID) technique is used for in-situ measurement of acceleration and temperature in the distribution chain of iron ore pellets. The results of this paper are based on two experiments, in which active RFID transponders were released into train wagons or product bins. RFID exciters and readers were installed downstream in a harbour storage silo to retrieve data from the active transponders. Acceleration peaks and temperatures were recorded. The results imply that in-situ data can aid the understanding of induced stresses along the distribution chain to, for example, reduce pellet breakage and dusting. In-situ data can also increase understanding of product mixing behaviour and product residence times in silos. Better knowledge of stresses, product mixing and residence times are beneficial to process and product quality improvement, to better understand the transportation process, and to reduce environmental impacts due to dusting.
6.
  • Bergquist, Bjarne, et al. (author)
  • Power Analysis of Methods for Analysing Unreplicated Factorial Experiments
  • 2013
  • Conference paper (peer-reviewed), abstract:
    • Several methods for the formal analysis of unreplicated factorial-type experiments have been proposed in the literature. Based on a simulation study, five formal methods from the literature based on the effect sparsity principle have been studied. The simulation included 2³ and 2⁴ factorials with one, two, or four active effects. The simulated signal-to-noise ratios for the effects were all between two and four, and the Type I and Type II errors of the analysis methods were analysed. Preliminary results show that Bayesian models are more powerful in these contexts, especially if informative priors based on the effect heredity and effect hierarchy principles are used.
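For comparison, one widely used effect-sparsity method that such power studies typically include is Lenth's method. Below is a minimal Python sketch of the textbook formulation (the entry does not specify the exact variants simulated; the example contrasts are hypothetical).

```python
# Lenth's method for unreplicated factorials: flag effects whose contrasts
# exceed the margin of error built from the pseudo standard error (PSE).
import numpy as np
from scipy.stats import t

def lenth_active(contrasts, alpha=0.05):
    c = np.asarray(contrasts, dtype=float)
    s0 = 1.5 * np.median(np.abs(c))
    pse = 1.5 * np.median(np.abs(c)[np.abs(c) < 2.5 * s0])  # pseudo standard error
    d = len(c) / 3.0                                        # Lenth's nominal df
    me = t.ppf(1 - alpha / 2, d) * pse                      # margin of error
    return np.abs(c) > me

# Hypothetical contrasts from a 2^4 design (15 effects), two truly active:
rng = np.random.default_rng(1)
contrasts = np.r_[3.0, -2.5, rng.normal(0, 0.5, 13)]
print(lenth_active(contrasts))
```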
7.
  • Capaci, Francesca, et al. (author)
  • A two-step procedure for fault detection in the Tennessee Eastman Process simulator
  • 2016
  • Conference paper (peer-reviewed), abstract:
    • Highly technological, complex production processes and the high availability and sampling frequencies of data in large-scale industrial processes call for the concurrent development of appropriate statistical control tools and monitoring techniques. Multivariate control charts based on latent variables are therefore essential tools to detect and isolate process faults. Several Statistical Process Control (SPC) charts have been developed for multivariate and megavariate data, such as the Hotelling T2, MCUSUM and MEWMA control charts, as well as charts based on principal component analysis (PCA) and dynamic PCA (DPCA). The ability of SPC procedures based on PCA (Kourti, MacGregor 1995) or DPCA (Ku et al. 1995) to detect and isolate process disturbances for a large number of highly correlated (and, in the case of DPCA, time-dependent) variables has been demonstrated in the literature. However, we argue that the fault isolation capability and the fault detection rate can be improved further for processes operating under feedback control loops (in closed loop). The purpose of this presentation is to illustrate a two-step method where [1] the variables are pre-classified prior to the analysis and [2] the monitoring scheme based on latent variables is implemented. Step 1 involves a structured qualitative classification of the variables to guide the choice of which variables to monitor in Step 2. We argue that the proposed method will be useful for many practitioners of SPC based on latent variable techniques in processes operating in closed loop. It will allow clearer fault isolation and detection and easier implementation of corrective actions. A case study based on the data available from the Tennessee Eastman Process simulator under feedback control loops (Matlab) will be presented. The results from the proposed method are compared with currently available methods through simulations in the R statistics software.
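Step 2 of such a scheme rests on latent-variable monitoring. The sketch below shows generic PCA-based Hotelling T² and SPE statistics in Python, not the authors' full two-step procedure; the data, the number of components, and any control limits are left as assumptions.

```python
# Generic PCA monitoring sketch: fit loadings on in-control data, then
# score new samples with Hotelling T^2 (model plane) and SPE (residual).
import numpy as np

def fit_pca_monitor(X, n_comp=2):
    mu, sd = X.mean(0), X.std(0, ddof=1)
    Z = (X - mu) / sd
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                        # loadings
    lam = (s[:n_comp] ** 2) / (len(X) - 1)   # score variances
    return mu, sd, P, lam

def t2_spe(x, mu, sd, P, lam):
    z = (x - mu) / sd
    t_scores = z @ P
    t2 = np.sum(t_scores**2 / lam)           # Hotelling T^2 in the model plane
    resid = z - P @ t_scores
    return t2, resid @ resid                 # SPE (squared residual)

X = np.random.default_rng(0).normal(size=(200, 5))  # stand-in in-control data
mu, sd, P, lam = fit_pca_monitor(X)
print(t2_spe(X[0], mu, sd, P, lam))
```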
8.
  • Capaci, Francesca (author)
  • Adapting Experimental and Monitoring Methods for Continuous Processes under Feedback Control : Challenges, Examples, and Tools
  • 2019
  • Doctoral thesis (other academic/artistic), abstract:
    • Continuous production covers a significant part of today’s industrial manufacturing. Consumer goods purchased on a frequent basis, such as food, drugs, and cosmetics, and capital goods such as iron, chemicals, oil, and ore come through continuous processes. Statistical process control (SPC) and design of experiments (DoE) play important roles as quality control and product and process improvement methods. SPC reduces product and process variation by eliminating assignable causes, while DoE shows how products and processes may be improved through systematic experimentation and analysis. Special issues emerge when applying these methods to continuous process settings, such as the need to simultaneously analyze massive time series of autocorrelated and cross-correlated data. Another important characteristic of most continuous processes is that they operate under engineering process control (EPC), as in the case of feedback controllers. Feedback controllers transform processes into closed-loop systems and thereby increase the process and analysis complexity, and SPC and DoE methods need to be adapted accordingly. For example, the quality characteristics or process variables to be monitored in a control chart, or the experimental factors in an experiment, need to be chosen considering the presence of feedback controllers. The main objective of this thesis is to suggest adapted strategies for applying experimental and monitoring methods (namely, DoE and SPC) to continuous processes under feedback control. Specifically, this research aims to [1] identify, explore, and describe the potential challenges when applying SPC and DoE to continuous processes; [2] propose and illustrate new or adapted SPC and DoE methods to address some of the issues raised by the presence of feedback controllers; and [3] suggest potential simulation tools that may be instrumental in SPC and DoE methods development. The results are summarized in five appended papers. Through a literature review, Paper A outlines the SPC and DoE implementation challenges for managers, researchers, and practitioners. For example, the problems due to process transitions, the multivariate nature of data, serial correlation, and the presence of EPC are discussed. Paper B describes the issues and potential strategies in designing and analyzing experiments on processes operating under closed-loop control. Two simulated examples in the Tennessee Eastman (TE) process simulator show the benefits of using DoE methods to improve these industrial processes. Paper C provides guidelines on how to use the revised TE process simulator under a decentralized control strategy as a testbed for SPC and DoE methods development in continuous processes. Papers D and E discuss the concurrent use of SPC in processes under feedback control. Paper D further illustrates how step and ramp disturbances manifest themselves in single-input single-output processes controlled by variations of the proportional-integral-derivative control scheme and discusses the implications for process monitoring. Paper E describes a two-step monitoring procedure for multivariate processes and explains the process and controller performance when out-of-control process conditions occur.
9.
  • Capaci, Francesca (author)
  • Contributions to the Use of Statistical Methods for Improving Continuous Production
  • 2017
  • Licentiate thesis (other academic/artistic), abstract:
    • Complexity of production processes, high computing capabilities, and massive datasets characterize today’s manufacturing environments, such as those of continuous and batch production industries. Continuous production has spread gradually across different industries, covering a significant part of today’s production. Common consumer goods such as food, drugs, and cosmetics, and industrial goods such as iron, chemicals, oil, and ore come from continuous processes. To stay competitive in today’s market requires constant process improvements in terms of both effectiveness and efficiency. Statistical process control (SPC) and design of experiments (DoE) techniques can play an important role in this improvement strategy. SPC attempts to reduce process variation by eliminating assignable causes, while DoE is used to improve products and processes by systematic experimentation and analysis. However, special issues emerge when applying these methods in continuous process settings. Highly automated and computerized processes provide an exorbitant amount of serially dependent and cross-correlated data, which may be difficult to analyze simultaneously. Time series data, transition times, and closed-loop operation are examples of additional challenges that the analyst faces. The overall objective of this thesis is to contribute to the use of statistical methods, namely SPC and DoE methods, to improve continuous production. Specifically, this research serves two aims: [1] to explore, identify, and outline potential challenges when applying SPC and DoE in continuous processes, and [2] to propose simulation tools and new or adapted methods to overcome the identified challenges. The results are summarized in three appended papers. Through a literature review, Paper A outlines SPC and DoE implementation challenges for managers, researchers, and practitioners. For example, problems due to process transitions, the multivariate nature of data, serial correlation, and the presence of engineering process control (EPC) are discussed. Paper B further explores one of the DoE challenges identified in Paper A. Specifically, Paper B describes issues and potential strategies when designing and analyzing experiments in processes operating under closed-loop control. Two simulated examples in the Tennessee Eastman (TE) process simulator show the benefits of using DoE techniques to improve and optimize such industrial processes. Finally, Paper C provides guidelines, using flow charts, on how to use the continuous process simulator, “The revised TE process simulator,” run with a decentralized control strategy as a test bed for developing SPC and DoE methods in continuous processes. Simulated SPC and DoE examples are also discussed.
10.
  • Capaci, Francesca, et al. (author)
  • Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control
  • 2017
  • In: Quality and Reliability Engineering International. John Wiley & Sons. ISSN 0748-8017, e-ISSN 1099-1638. 33:7, pp. 1601-1614
  • Journal article (peer-reviewed), abstract:
    • Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios.
11.
  • Capaci, Francesca, et al. (author)
  • Managerial implications for improving continuous production processes
  • 2017
  • Conference paper (peer-reviewed), abstract:
    • Data analytics remains essential for process improvement and optimization. Statistical process control (SPC) and design of experiments (DoE) are among the most powerful process and product improvement methods available. However, continuous process environments challenge the application of these methods. In this article, we highlight SPC and DoE implementation challenges described in the literature for managers, researchers and practitioners interested in continuous production process improvement. The results may help managers support the implementation of these methods and make researchers and practitioners aware of methodological challenges in continuous process environments.
12.
  • Capaci, Francesca, et al. (author)
  • On Monitoring Industrial Processes under Feedback Control
  • 2020
  • In: Quality and Reliability Engineering International. John Wiley & Sons. ISSN 0748-8017, e-ISSN 1099-1638. 36:8, pp. 2720-2737
  • Journal article (peer-reviewed), abstract:
    • The concurrent use of statistical process control and engineering process control involves monitoring manipulated and controlled variables. One multivariate control chart may handle the statistical monitoring of all variables, but observing the manipulated and controlled variables in separate control charts may improve understanding of how disturbances and the controller performance affect the process. In this article, we illustrate how step and ramp disturbances manifest themselves in a single-input–single-output system by studying their resulting signatures in the controlled and manipulated variables. The system is controlled by variations of the widely used proportional-integral-derivative (PID) control scheme. Implications for applying control charts for these scenarios are discussed.
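The signature effect described in this abstract is easy to reproduce in a few lines. The Python sketch below simulates an assumed first-order process under a textbook PI controller (illustrative gains, not the paper's simulation): a step disturbance is absorbed into the manipulated variable, leaving only a transient in the controlled variable.

```python
# Discrete-time sketch: step disturbance under PI control. The controlled
# variable y returns to its set-point while the manipulated variable u
# shifts to a new level -- the "signature" appears in u, not in y.
import numpy as np

a, b = 0.8, 0.5          # assumed process: y[t] = a*y[t-1] + b*u[t-1] + d[t]
kp, ki = 1.0, 0.3        # PI controller gains (illustrative values)
n, y, u, e_int = 200, 0.0, 0.0, 0.0
ys, us = [], []
for t in range(n):
    d = 1.0 if t >= 100 else 0.0           # step disturbance at t = 100
    y = a * y + b * u + d
    err = 0.0 - y                          # set-point at zero
    e_int += err
    u = kp * err + ki * e_int              # PI control action
    ys.append(y); us.append(u)

print(f"y settles near {ys[-1]:.3f}; u shifts to {us[-1]:.3f}")
```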
13.
  • Capaci, Francesca, et al. (author)
  • Simulating and Analyzing Experiments in the Tennessee Eastman Process Simulator
  • 2015
  • In: ENBIS-15.
  • Conference paper (peer-reviewed), abstract:
    • In many of today’s continuous processes, data collection is usually performed automatically, yielding an exorbitant amount of data on various quality characteristics and inputs to the system. Moreover, such data are usually collected at high frequency, introducing significant serial dependence in time. This violates the independent-data assumption of many industrial statistics methods used in process improvement studies. These studies often involve controlled experiments to unearth the causal relationships to be used for robustness and optimization purposes. However, real production processes are not suitable for studying new experimental methodologies, partly because unknown disturbances/experimental settings may lead to erroneous conclusions. Moreover, large-scale experimentation in production processes is frowned upon due to consequent disturbances and production delays. Hence, realistic simulation of such processes offers an excellent opportunity for experimentation and methodological development. One commonly used process simulator is the Tennessee Eastman (TE) challenge chemical process simulator (Downs & Vogel, 1993). The process produces two products from four reactants, containing 41 measured variables and 12 manipulated variables. In addition to the process description, the problem statement defines process constraints, 20 types of process disturbances, and six operating modes corresponding to different production rates and mass ratios in the product stream. The purpose of this paper is to illustrate the use of the TE process with an appropriate feedback control as a test-bed for the methodological development of new experimental design and analysis techniques. The paper illustrates how two-level experimental designs can be used to identify how the input factors affect the outputs in a chemical process. Simulations using Matlab/Simulink software are used to study the impact of e.g. process disturbances, closed-loop control and autocorrelated data on different experimental arrangements. The experiments are analysed using a time series analysis approach to identify input-output relationships in a process operating in closed loop with multivariate responses. The dynamics of the process are explored and the necessary run lengths for stable effect estimates are discussed.
14.
  • Capaci, Francesca, et al. (author)
  • Simulating Experiments in Closed-Loop Control Systems
  • 2016
  • In: ENBIS-16 in Sheffield.
  • Conference paper (peer-reviewed), abstract:
    • The Design of Experiments (DoE) literature extensively discusses how to properly plan, conduct and analyze experiments for process and product improvement. However, it is typically assumed that the experiments are run on processes operating in open loop: the changes in experimental factors are directly visible in the process responses and are not hidden by (automatic) feedback control. Under this assumption, DoE methods have been successfully applied in process industries such as the chemical, pharmaceutical and biological industries. However, increasing instrumentation, automation and interconnectedness are changing how processes are run. Processes often involve engineering process control, as in the case of closed-loop systems. The closed-loop environment adds complexity to experimentation and analysis, since the experimenter must account for the control actions that may aim to keep a response variable at its set-point value. The common approach to experimental design and analysis will likely need adjustments in the presence of closed-loop controls. Careful consideration is, for instance, needed when the experimental factors are chosen. Moreover, the impact of the experimental factors may not be directly visible as changes in the response variables (Hild, Sanders, & Cooper, 2001). Instead, other variables may need to be used as proxies for the intended response variable(s). The purpose of this presentation is to illustrate how experiments in closed-loop systems can be planned and analyzed. A case study based on the Tennessee Eastman Process simulator run with a decentralized feedback control strategy (Matlab) (Lawrence Ricker, 1996) is discussed and presented.
15.
  • Capaci, Francesca, et al. (author)
  • The Revised Tennessee Eastman Process Simulator as Testbed for SPC and DoE Methods
  • 2019
  • In: Quality Engineering. Taylor & Francis. ISSN 0898-2112, e-ISSN 1532-4222. 31:2, pp. 212-229
  • Journal article (peer-reviewed), abstract:
    • Engineering process control and high-dimensional, time-dependent data present great methodological challenges when applying statistical process control (SPC) and design of experiments (DoE) in continuous industrial processes. Process simulators with an ability to mimic these challenges are instrumental in research and education. This article focuses on the revised Tennessee Eastman process simulator providing guidelines for its use as a testbed for SPC and DoE methods. We provide flowcharts that can support new users to get started in the Simulink/Matlab framework, and illustrate how to run stochastic simulations for SPC and DoE applications using the Tennessee Eastman process.
16.
  • Englund, Stefan, et al. (author)
  • Granular Flow and Segregation Behavior
  • 2016
  • Conference paper (peer-reviewed), abstract:
    • 1. Purpose of the presentation. Granular materials such as grain, gravel, powder or pellets can be thought of as an intermediate state of matter: they can sustain shear like a solid up to a point, but they can also flow (Behringer 1995). However, differences in particulate sizes, shapes or densities have been known to cause segregation when granular materials are flowing. Surface segregation has often been studied, and the mechanisms of segregation on a surface are described in many articles (Makse 1999)(Gray, Gajjar et al. 2015)(Lumay, Boschini et al. 2013). Descriptions of the segregation behaviour of granular flow below surfaces are less common. Literature related to bulk flow mostly describes a bulk containing a variety of granular sizes (Engblom, Saxén et al. 2012)(Jaehyuk Choi and Arshad Kudrolli and Martin Z. Bazant 2005). Warehouses such as silos or bins constitute major segregation and mixing points in many granular material transport chains. Such warehouses also subject the granular media to flow- or impact-induced stresses. Traceability in these kinds of continuous or semi-continuous granular flow environments faces many challenges. Adding in-situ sensors, so-called PATs, is one way to trace material in a granular flow. It is, however, difficult to predict whether the sensors experience the same physical stresses as the average granules do if the PATs segregate. To contain the required electronics, these sensors with casings may need to be made larger than the bulk particles they are supposed to follow. It is therefore important to understand when larger particles segregate and how to design sensor casings to prevent segregation. However, segregation of larger-sized or differently shaped particles added as single objects to a homogeneously sized particle flow has, to our knowledge, not yet been studied, and that is the purpose of this study. 2. Method. Experiments have been performed using granules of different shapes and densities to study flow and segregation behaviour. The experiments have been performed in a transparent 2D model of a silo, designed to replicate warehouses along an iron ore pellets distribution chain. Bulk material consisting of granules representing iron ore has been discharged together with larger objects of different sizes representing sensors or RFID tags. Shape, size and density are modified on the larger objects while studying mixing, flow behaviour and segregation tendencies using video. Video analyses have been used to measure the flow speed and flow distribution of the bulk and of the larger objects. The video material and individual particles are then statistically analysed to clarify significant factors in segregation behaviour related to the size, form and density of the particles. The results are based on Design Expert, Minitab and customized Matlab software. 3. Results. We show the significant factors which affect segregation behaviour and how they modify it. Depending on the shape of the silo and the type of flow during discharge, we also show how the behaviour of individual grains in the granular flow depends on their shape, size and density, and on the flow velocity. 4. Research Limitations/Implications. The time-consuming method of manually retrieving data on each individual particle and the surrounding bulk material limits the volume of data that can be retrieved. Further research will implement Particle Image Velocimetry (PIV) technology and customised software to analyse metadata from experiments in a much more efficient way. 5. Practical implications. The practical outcome of this research is connected to the ability to trace batches in continuous and semi-continuous supply chains in time and space, the possibility to design a decision model for a specific supply chain for more customized quality control, and, as far as we know, completely new possibilities related to root cause analyses of quality issues in the production or supply chain. 6. Value of presentation. Although the research was made in relation to the local mining industry and the supply chain for iron ore pellets, the greatest value is expected in the pharmaceutical industry, or in any regulated industry where efficient traceability of any product on the market is essential.
17.
  • Holmbom, Martin, et al. (author)
  • Performance-based logistics – an illusive panacea or a concept for the future?
  • 2014
  • In: Journal of Manufacturing Technology Management. ISSN 1741-038X, e-ISSN 1758-7786. 25:7, pp. 958-979
  • Journal article (peer-reviewed), abstract:
    • Purpose: The purpose of this paper is to summarize previously reported benefits, drawbacks and important aspects for the implementation of performance-based logistics (PBL), and to identify knowledge gaps. Design/methodology/approach: This is a literature review based on 101 articles. The reviewed articles are relevant to PBL in particular, but also to performance contracting, product-service systems (PSS) and servitization in general. The research method involved database searches, filtering results and reviewing publications. Findings: PBL is a business concept that aims to reduce the customer's total costs for capital-intensive products and increase the supplier's profit. The design of the contract, performance measurements and payment models are important aspects for successful implementation. However, the authors find a reason for concern to be the lack of empirical evidence of the profitability of PBL for the customer and the supplier. Originality/value: This literature review of PBL also includes publications from the related research areas: performance contracting, PSS and servitization. Developing PBL can benefit from results in these research areas.
18.
19.
  • Kvarnström, Björn, et al. (author)
  • Using RFID to improve traceability in process industry : experiments in a distribution chain for iron ore pellets
  • 2010
  • In: Journal of Manufacturing Technology Management. Emerald. ISSN 1741-038X, e-ISSN 1758-7786. 21:1, pp. 139-154
  • Journal article (peer-reviewed), abstract:
    • Purpose: The purpose of the article is to explore the application of Radio Frequency Identification (RFID) to improve traceability in a flow of granular products and to illustrate examples of special issues that need to be considered when using the RFID technique in a process industry setting. Design/methodology/approach: The article outlines a case study at a Swedish mining company, including experiments to test the suitability of RFID for tracing iron ore pellets (a granular product) in parts of the distribution chain. Findings: The results show that the RFID technique can be used to improve traceability in granular product flows. A number of special issues concerning the use of RFID in process industries are also highlighted, for example, the problem of controlling the orientation of the transponder in the read area and the risk of product contamination in the supply chain. Research limitations/implications: Even though only a single case has been studied, the results are of general interest for industries that have granular product flows. However, future research in other industries should be performed to validate the results. Practical implications: The application of RFID described in this article makes it possible to increase productivity and product quality by improving traceability in product flows where traceability normally is problematic. Originality/value: Prior research has mainly focused on RFID applications in discontinuous processes. By contrast, this article presents a novel application of the RFID technique in a continuous process, together with specific issues connected to the use of RFID.
20.
  • Larsson Turtola, Simon, et al. (author)
  • Integrating mixture experiments and six sigma methodology to improve fibre-reinforced polymer composites
  • 2022
  • In: Quality and Reliability Engineering International. John Wiley & Sons. ISSN 0748-8017, e-ISSN 1099-1638. 38:4, pp. 2233-2254
  • Journal article (peer-reviewed), abstract:
    • This article illustrates a Six Sigma project aimed at reducing manufacturing-induced visual deviations for fibre-reinforced polymer (FRP) composites. For a European composites manufacturer, such visual deviations lead to scrapping of cylindrical composite bodies and subsequent environmental impact. The composite bodies are manufactured through vacuum infusion, where a resin mixture impregnates a fibreglass preform and cures, transforming from liquid to solid state. We illustrate the define-measure-analyse-improve-control (DMAIC) steps of the Six Sigma project. Specific emphasis is placed on the measure and analyse steps featuring a 36-run computer-generated mixture experiment with six resin mixture components and six responses. Experimental analysis establishes causal relationships between mixture components and correlated resin characteristics, which can be used to control resin characteristics. Two new resin mixtures were developed and tested in the improve step using the understanding developed in previous steps. Manufacturing-induced visual deviations were greatly reduced by adjusting the resin mixture to induce a slower curing process. Further refinement of the mixture was made in the control step. A production scrap rate of 5% due to visual deviations was measured during a monitoring period of 5 months after the resin mixture change. The scrap rate was substantially improved compared to the historical level (60%). The successful experimental investigation integrated in this Six Sigma project is expected to generate increased quality, competitiveness, and substantial savings.
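The 36-run design used in the paper was computer-generated, so the following is only a simpler illustration of the mixture-design idea: a Python generator for a {q, m} simplex-lattice over six components (hypothetical, not the authors' design).

```python
# Simplex-lattice mixture design: all q-component blends whose proportions
# lie in {0, 1/m, ..., 1} and sum to 1.
from itertools import combinations_with_replacement

def simplex_lattice(q, m):
    points = set()
    for combo in combinations_with_replacement(range(q), m):
        x = [0] * q
        for i in combo:
            x[i] += 1               # distribute m units among q components
        points.add(tuple(v / m for v in x))
    return sorted(points)

design = simplex_lattice(q=6, m=2)  # {6,2} lattice: 21 candidate blends
print(len(design), design[:3])
```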
21.
  • Lundkvist, Peder, et al. (author)
  • Identifying Process Dynamics through a Two-Level Factorial Experiment
  • 2014
  • In: Quality Engineering. Informa UK Limited. ISSN 0898-2112, e-ISSN 1532-4222. 26:2, pp. 154-167
  • Journal article (peer-reviewed), abstract:
    • Industrial experiments are often subjected to critical disturbances, and in a small design with few runs the loss of experimental runs may dramatically reduce the analysis power. This article considers a common situation in the process industry where the observed responses are represented by time series. A time series analysis approach to analyzing two-level factorial designs affected by disturbances is developed and illustrated by analyzing a blast furnace experiment. In particular, a method based on transfer function-noise modeling is compared with a ‘traditional’ analysis using the averages of the response in each run as the single response in an analysis of variance (ANOVA).
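The transfer function-noise idea can be sketched as an intervention-style regression with ARMA errors. The Python example below is a minimal stand-in (simulated AR(1) noise and a single step input, not the blast furnace data or the paper's model), using statsmodels.

```python
# Estimate a factor's effect from a time series response with autocorrelated
# noise: regression with an exogenous step and AR(1) errors.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
n = 300
x = (np.arange(n) >= 150).astype(float)   # factor switched high mid-series
noise = np.zeros(n)
for t in range(1, n):                     # AR(1) disturbance
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0, 1)
y = 5.0 + 2.0 * x + noise                 # true effect = 2.0

res = ARIMA(y, exog=x, order=(1, 0, 0)).fit()
print(res.params)                         # const, x effect (~2), AR coefficient (~0.7)
```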
22.
  • Lundkvist, Peder, et al. (author)
  • Statistical methods – still ignored? The testimony of Swedish alumni
  • 2020
  • In: Total Quality Management and Business Excellence. Taylor & Francis. ISSN 1478-3363, e-ISSN 1478-3371. 31:3-4, pp. 245-262
  • Journal article (peer-reviewed), abstract:
    • Researchers have promoted statistical improvement methods as essential for product and process improvement for decades. However, studies show that their use has been moderate at best. This study aims to assess the use of statistical process control (SPC), process capability analysis, and design of experiments (DoE) over time. The study also highlights important barriers for the wider use of these methods in Sweden as a follow-up study of a similar Swedish study performed in 2005 and of two Basque-based studies performed in 2009 and 2010. While the survey includes open-ended questions, the results are mainly descriptive and confirm results of previous studies. This study shows that the use of the methods has become more frequent compared to the 2005 study. Larger organisations (>250 employees) use the methods more frequently than smaller organisations, and the methods are more widely utilised in the industry than in the service sector. SPC is the most commonly used of the three methods while DoE is least used. Finally, the greatest barriers to increasing the use of statistical methods were: insufficient resources regarding time and money, low commitment of middle and senior managers, inadequate statistical knowledge, and lack of methods to guide the user through experimentations.
23.
  • Sedghi, Mahdieh, 1984-, et al. (author)
  • A Taxonomy of Railway Track Maintenance Planning and Scheduling : A Review and Research Trends
  • 2021
  • In: Reliability Engineering & System Safety. Elsevier. ISSN 0951-8320, e-ISSN 1879-0836. Vol. 215
  • Research review (peer-reviewed), abstract:
    • Railway track maintenance and renewal are vital for railway safety, train punctuality, and travel comfort. Therefore, having cost-effective maintenance is critical in managing railway infrastructure assets. There has been a considerable amount of research performed on mathematical and decision support models for improving the application of railway track maintenance planning and scheduling. This article reviews the literature in decision support models for railway track maintenance planning and scheduling and transforms the results into a problem taxonomy. Furthermore, the article discusses current approaches in optimising maintenance planning and scheduling, research trends, and possible gaps in the related decision-making models.
24.
  • Sedghi, Mahdieh, 1984- (author)
  • Data-driven predictive maintenance planning and scheduling
  • 2020
  • Licentiate thesis (other academic/artistic), abstract:
    • The railway track network is one of the major modes of transportation and among a country’s most valuable infrastructure assets. Maintenance and renewal of railway infrastructure have a vital role in safety performance, the quality of the ride, train punctuality, and the life cycle cost of assets. Due to the large proportion of maintenance costs, increasing the efficiency of maintenance through optimised planning can result in large cost savings. Moreover, from a safety perspective, late maintenance intervention can result in defective track and rolling stock components, which, in severe cases, can cause severe accidents such as derailments. An effective maintenance management system is required to ensure the availability of the infrastructure system and meet the increasing capacity demand. The recent rapid technological revolution and the increasing deployment of sensors and connected devices have created new possibilities to increase the effectiveness of maintenance strategies in the railway network. The purpose of this thesis is to expand the knowledge and methods for planning and scheduling of railway infrastructure maintenance. The research vision is to find quantitative approaches for integrated tactical planning and operational scheduling of predictive condition-based maintenance which can be put to practical use and improve the efficiency of the railway system. First, a thorough literature review is performed to identify improvement policies for maintenance planning and scheduling in the literature and to analyse the current approaches to optimising the maintenance planning and scheduling problem. Second, a novel data-driven multi-level decision-making framework to improve the efficiency of maintenance planning and scheduling is developed. The proposed framework aims to support the selection of track segments for maintenance by providing a practical degradation prediction model based on available condition measurement data. The framework considers the uncertainty of future predictions by using the probability of surpassing a maintenance limit instead of the predicted value itself. Moreover, an extensive total maintenance cost formulation is developed that includes both direct and indirect preventive and corrective costs, to observe the effect of using cost optimisation and grouping algorithms at the operational scheduling level. The performance of the proposed framework is evaluated through a case study based on data from a track section of the iron ore line between Boden and Luleå. The results indicate that the proposed approach can lead to cost savings in both optimal and grouping plans. This framework may be a useful decision support tool in the automated planning and scheduling of maintenance based on track geometry measurements.
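The core of the framework's prediction step, intervening on the probability of surpassing a maintenance limit rather than on the point forecast, can be sketched briefly. The Python example below assumes a linear degradation path with Gaussian prediction uncertainty; all numbers are illustrative, not from the thesis.

```python
# Probability that predicted degradation exceeds the maintenance limit,
# with prediction uncertainty growing over the forecast horizon.
import numpy as np
from scipy.stats import norm

def prob_exceeding(t, level0, rate, sigma_rate, limit):
    """P(degradation at time t > maintenance limit) under a linear model."""
    mean = level0 + rate * t
    sd = sigma_rate * t                  # uncertainty grows with horizon
    return 1.0 - norm.cdf(limit, loc=mean, scale=sd)

months = np.arange(1, 13)
p = prob_exceeding(months, level0=1.0, rate=0.15, sigma_rate=0.05, limit=2.5)
print(np.round(p, 3))    # intervene when p passes a chosen risk threshold
```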
25.