SwePub
Search the SwePub database


Results for the query "L773:0748 8017 OR L773:1099 1638"

Query: L773:0748 8017 OR L773:1099 1638

  • Results 1-50 of 60
1.
  • Deleryd, Mats, et al. (author)
  • Process capability plots—a quality improvement tool
  • 1999
  • In: Quality and Reliability Engineering International. - 0748-8017 .- 1099-1638. ; 15:3, pp. 213-227
  • Journal article (peer-reviewed), abstract:
    • We introduce the concept of process capability plots, which are powerful tools to monitor and improve the capability of industrial processes. An advantage of using a process capability plot, compared with using a traditional process capability index alone, when deciding whether a process can be considered capable or not, is that we will instantly get information about the location and spread of the studied characteristic. When the process is non-capable, the plots are helpful when trying to understand if it is the variability, the deviation from target or both that need to be reduced to improve the capability. In this way the proposed graphical methods give a clear direction of quality improvement. We evaluate two different process capability plots, the (δ*, γ*)-plot and the confidence rectangle plot, from a theoretical as well as a practical point of view. When studying them from a theoretical point of view, among other things, a simulation study is conducted to investigate the ability of each of the two methods to identify that a process is capable when it actually is. The comparison from a practical point of view is made by discussing the advantages and disadvantages of the two methods in different practical situations. Based on the above-mentioned comparisons, the recommendation is that the practitioner should use the (δ*, γ*)-plot.
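For illustration, the location and spread quantities that such a plot displays can be sketched as follows. This is a generic sketch, not the paper's code: it assumes delta = (mean − target)/d and gamma = s/d with d the half-width of the specification interval; the paper's δ*, γ* estimators may differ in detail.

```python
import numpy as np

def capability_summary(x, lsl, usl, target):
    """Location (delta) and spread (gamma) of a characteristic relative to
    the half-width d of the specification interval, plus Cpk for reference.
    Illustrative definitions: delta = (mean - target)/d, gamma = s/d."""
    x = np.asarray(x, dtype=float)
    d = (usl - lsl) / 2.0
    mu = x.mean()
    s = x.std(ddof=1)          # sample standard deviation
    delta = (mu - target) / d  # relative deviation from target
    gamma = s / d              # relative spread
    cpk = min(usl - mu, mu - lsl) / (3.0 * s)
    return delta, gamma, cpk

# A well-centred, low-spread sample plots near the origin of a (delta, gamma) plot
rng = np.random.default_rng(7)
sample = rng.normal(loc=10.02, scale=0.05, size=200)
delta, gamma, cpk = capability_summary(sample, lsl=9.7, usl=10.3, target=10.0)
```

Plotting (delta, gamma) for successive samples gives the at-a-glance location/spread information the abstract describes.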
2.
  • Sandvik Wiklund, Pia, et al. (author)
  • Finding active factors from unreplicated fractional factorials utilizing the total time on test (TTT) technique
  • 1999
  • In: Quality and Reliability Engineering International. - 0748-8017 .- 1099-1638. ; 15:3, pp. 191-203
  • Journal article (peer-reviewed), abstract:
    • Much research has been devoted to improving the process of identifying active factors from designed experiments. Generally, the proposed methods rely on an estimate of the experimental error. Here we present a method based on the TTT (total time on test) plot, where the scaled TTT transform enables an evaluation of the contrasts independently of the experimental error. The method can be separated into two parts. The first part consists of a transformed TTT plot for a visual evaluation of data. The second part is more formal and utilizes the cumulative TTT statistic for testing the significance of contrasts. A simulation study shows the power of the method compared with competing methods. Five data sets are used to show that the conclusions drawn are consistent with those obtained using other suggested methods.
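The scaled TTT transform underlying the visual part of the method can be sketched as follows; this is a generic implementation of the transform itself, not the paper's contrast-evaluation procedure.

```python
import numpy as np

def scaled_ttt(x):
    """Scaled total time on test (TTT) transform of a sample.

    For ordered data x_(1) <= ... <= x_(n), T_i = sum_{j<=i} x_(j) + (n-i)*x_(i);
    returns the points (i/n, T_i/T_n) whose plot is the TTT plot."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    t = np.array([x[:i + 1].sum() + (n - i - 1) * x[i] for i in range(n)])
    return np.arange(1, n + 1) / n, t / t[-1]

u, phi = scaled_ttt([2.0, 1.0, 3.0, 5.0, 4.0])
```

Each (u, phi) pair can be plotted against the diagonal; deviations from it are what a visual evaluation looks for.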
3.
  •  
4.
  • Albing, Malin (author)
  • Process capability indices for Weibull distributions and upper specification limits
  • 2009
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 25:3, pp. 317-334
  • Journal article (peer-reviewed), abstract:
    • We consider a previously proposed class of capability indices that are useful when the quality characteristic of interest has a skewed, zero-bound distribution with a long tail towards large values and there is an upper specification with a pre-specified target value, T = 0. We investigate this class of process capability indices when the underlying distribution is a Weibull distribution and focus on the situation when the Weibull distribution is highly skewed. We propose an estimator of the index in the studied class, based on the maximum likelihood estimators of the parameters in the Weibull distribution, and derive the asymptotic distribution for this estimator. Furthermore, we suggest a decision rule based on the estimated index and its asymptotic distribution, and present a power comparison between the proposed estimator and a previously studied estimator. A simulation study is also performed to investigate the true significance level when the sample size is small or moderate. An example from Swedish industry is presented.
5.
  • Allahkarami, Zeynab, et al. (author)
  • Mixed-effects model for reliability assessment of dump trucks in heterogeneous operating environment: A case study
  • 2022
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 38:5, pp. 2881-2898
  • Journal article (peer-reviewed), abstract:
    • Reliability of mining equipment is influenced by different operational and environmental risk factors. However, in practice, including all relevant factors in reliability analysis is not possible. Ignoring the risk factors leads to unobserved heterogeneity and biased estimation of reliability characteristics. This research proposes a semi-parametric mixed-effects model to assess the simultaneous effects of observed and unobserved risk factors on reliability. Furthermore, the model can be used for investigating inter-cluster dependency and variability. To illustrate the model, a comprehensive case study is presented using data from three mines in Iran. The results show that the model can address the effect of unobserved risk factors. Moreover, it is found that the hazard of dump trucks is significantly associated with operator skill, season and the elevation difference between dumping and loading points.
6.
  • Andersen, Emil B., et al. (author)
  • An easy to use GUI for simulating big data using Tennessee Eastman process
  • 2022
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 38:1, pp. 264-282
  • Journal article (peer-reviewed), abstract:
    • Data-driven process monitoring and control techniques and their application to industrial chemical processes are gaining popularity due to the current focus on Industry 4.0, digitalization and the Internet of Things. However, for the development of such techniques, there are significant barriers that must be overcome in obtaining sufficiently large and reliable datasets. As a result, the use of real plant and process data in developing and testing data-driven process monitoring and control tools can be difficult without investing significant efforts in acquiring, treating, and interpreting the data. Therefore, researchers need a tool that effortlessly generates large amounts of realistic and reliable process data without the requirement for additional data treatment or interpretation. In this work, we propose a data generation platform based on the Tennessee Eastman Process simulation benchmark. A graphical user interface (GUI) developed in MATLAB Simulink is presented that enables users to generate massive amounts of data for testing applicability of big data concepts in the realm of process control for continuous time-dependent processes. An R-Shiny app that interacts with the data generation tool is also presented for illustration purposes. The app can visualize the results generated by the Tennessee Eastman Process and can carry out standard fault detection and diagnosis studies based on PCA. The data generator GUI is available free of charge for research purposes at https://github.com/dtuprodana/TEP.
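A PCA-based fault detection study of the kind mentioned above typically rests on Hotelling's T² and Q (squared prediction error) statistics. A minimal generic sketch, not the TEP app's implementation, assuming in-control training data and standardized variables:

```python
import numpy as np

def pca_t2_q(train, new, k=2):
    """Hotelling's T^2 and Q (SPE) statistics for new observations,
    from a k-component PCA model fitted on in-control training data."""
    mu = train.mean(axis=0)
    sd = train.std(axis=0, ddof=1)
    z = (train - mu) / sd
    # PCA loadings via SVD of the standardized training data
    _, s, vt = np.linalg.svd(z, full_matrices=False)
    p = vt[:k].T                            # loadings (d x k)
    lam = (s[:k] ** 2) / (len(train) - 1)   # retained component variances
    zn = (new - mu) / sd
    scores = zn @ p
    t2 = np.sum(scores ** 2 / lam, axis=1)  # distance within the model plane
    resid = zn - scores @ p.T
    q = np.sum(resid ** 2, axis=1)          # distance off the model plane
    return t2, q

rng = np.random.default_rng(0)
train = rng.normal(size=(200, 5))
new = rng.normal(size=(10, 5))
t2, q = pca_t2_q(train, new, k=2)
```

In a monitoring study, T² and Q for each new sample are compared against control limits estimated from the training phase.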
7.
  • Bergquist, Bjarne, et al. (author)
  • Data Analysis for Condition-Based Railway Infrastructure Maintenance
  • 2015
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 31:5, pp. 773-781
  • Journal article (peer-reviewed), abstract:
    • Condition assessment is crucial to optimize condition-based maintenance actions of assets such as railway infrastructure, where a faulty state might have severe consequences. Hence, railways are regularly inspected to detect failure events and prevent the inspected item (e.g. rail) from reaching a faulty state with potentially safety critical consequences (e.g. derailment). However, the preventive measures (e.g. condition-based maintenance) initiated by the inspection results may cause traffic disturbances, especially if the expected time to a faulty state is short. The alarm limits are traditionally safety related and often based on geometrical properties of the inspected item. Maintenance limits would reduce the level of emergency, producing earlier alarms and increasing possibilities of planned preventive rather than acute maintenance. However, selecting these earlier maintenance limits in a systematic way while balancing the risk of undetected safety-critical faults and false alarms is challenging. Here, we propose a statistically based approach using condition data of linear railway infrastructure assets. The data were obtained from regular inspections done by a railway track measurement wagon. The condition data were analysed by a control chart approach to evaluate the possibility for earlier detection of derailment hazardous faults using both temporal and spatial information. The study indicates that the proposed approach could be used for condition assessment of tracks. Control charts led to earlier fault warnings compared to the traditional approach, facilitating planned condition-based maintenance actions and thereby a reduction of track downtime.
8.
  •  
9.
  • Cacciarelli, Davide, et al. (author)
  • Robust online active learning
  • 2024
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 40:1, pp. 277-296
  • Journal article (peer-reviewed), abstract:
    • In many industrial applications, obtaining labeled observations is not straightforward as it often requires the intervention of human experts or the use of expensive testing equipment. In these circumstances, active learning can be highly beneficial in suggesting the most informative data points to be used when fitting a model. Reducing the number of observations needed for model development alleviates both the computational burden required for training and the operational expenses related to labeling. Online active learning, in particular, is useful in high-volume production processes where the decision about the acquisition of the label for a data point needs to be taken within an extremely short time frame. However, despite the recent efforts to develop online active learning strategies, the behavior of these methods in the presence of outliers has not been thoroughly examined. In this work, we investigate the performance of online active linear regression in contaminated data streams. Our study shows that the currently available query strategies are prone to sample outliers, whose inclusion in the training set eventually degrades the predictive performance of the models. To address this issue, we propose a solution that bounds the search area of a conditional D-optimal algorithm and uses a robust estimator. Our approach strikes a balance between exploring unseen regions of the input space and protecting against outliers. Through numerical simulations, we show that the proposed method is effective in improving the performance of online active learning in the presence of outliers, thus expanding the potential applications of this powerful tool.
10.
  • Capaci, Francesca, et al. (author)
  • Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control
  • 2017
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 33:7, pp. 1601-1614
  • Journal article (peer-reviewed), abstract:
    • Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing such processes. The Tennessee Eastman challenge process simulator is used as a test-bed to highlight two experimental scenarios. The first scenario explores the impact of experimental factors that may be considered as disturbances in the closed-loop system. The second scenario exemplifies a screening design using the set-points of controllers as experimental factors. We provide examples of how to analyze the two scenarios.
11.
  • Capaci, Francesca, et al. (author)
  • On Monitoring Industrial Processes under Feedback Control
  • 2020
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 36:8, pp. 2720-2737
  • Journal article (peer-reviewed), abstract:
    • The concurrent use of statistical process control and engineering process control involves monitoring manipulated and controlled variables. One multivariate control chart may handle the statistical monitoring of all variables, but observing the manipulated and controlled variables in separate control charts may improve understanding of how disturbances and the controller performance affect the process. In this article, we illustrate how step and ramp disturbances manifest themselves in a single-input-single-output system by studying their resulting signatures in the controlled and manipulated variables. The system is controlled by variations of the widely used proportional-integral-derivative (PID) control scheme. Implications for applying control charts for these scenarios are discussed.
12.
  • Castagliola, Philippe, et al. (author)
  • Average run length when monitoring capability indices using EWMA
  • 2008
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 24:8, pp. 941-955
  • Journal article (peer-reviewed), abstract:
    • In order to monitor unstable but capable processes, Castagliola and Vännman have recently suggested a procedure based on an EWMA approach, called the EWMA capability chart, for monitoring Vännman's Cp(u,v) family of capability indices, and showed how their proposed approach efficiently monitors capable processes by detecting a decrease or increase in the capability level. The goal of this paper is to investigate the efficiency of this capability chart in terms of ARL. The procedure used for computing this ARL is presented, and simple guidelines for obtaining approximations to the optimal EWMA parameters are proposed.
13.
  • Castagliola, Philippe, et al. (author)
  • Monitoring capability indices using an EWMA approach
  • 2007
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 23:7, pp. 769-790
  • Journal article (peer-reviewed), abstract:
    • When performing a capability analysis it is recommended to first check that the process is stable, for example, by using control charts. However, there are occasions when a process cannot be stabilized, but it is nevertheless capable. Then the classical control charts fail to efficiently monitor the process position and variability. In this paper we propose a new strategy to solve this problem, where capability indices are monitored in place of the classical sample statistics such as the mean, median, standard deviation, or range. The proposed procedure uses the Cp(u,v) family of capability indices proposed by Vännman combined with a logarithmic transformation and an EWMA approach. One important property of the procedure presented here is that the control limits used for the monitoring of capability indices only depend on the capability level assumed for the process. The experimental results presented in this paper demonstrate how this new approach efficiently monitors capable processes by detecting changes in the capability level.
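The EWMA mechanism at the core of this approach can be sketched generically. This is a textbook EWMA chart with time-varying control limits, not the authors' capability-index transformation; the parameter values are illustrative.

```python
import numpy as np

def ewma_chart(x, lam=0.2, mu0=0.0, sigma0=1.0, L=3.0):
    """EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}, z_0 = mu0, with the
    standard time-varying control limits mu0 +/- L*sigma_z(t)."""
    x = np.asarray(x, dtype=float)
    z = np.empty_like(x)
    prev = mu0
    for i, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev
        z[i] = prev
    t = np.arange(1, len(x) + 1)
    # variance of z_t under independent observations with variance sigma0^2
    sigma_z = sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    return z, mu0 - L * sigma_z, mu0 + L * sigma_z

# Hypothetical stream of monitored (e.g. log-transformed) statistics
z, lcl, ucl = ewma_chart([0.1, -0.2, 0.05, 0.3, 0.15])
```

An alarm is signalled when z leaves the (lcl, ucl) band; in the paper's setting, x would be the log-transformed estimated capability index.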
14.
  • Chakhunashvili, Alexander, 1974, et al. (author)
  • An EWMA Solution to Detect Shifts in a Bernoulli Process in an Out-of-Control Environment
  • 2006
  • In: Quality and Reliability Engineering International. - : Wiley. - 1099-1638 .- 0748-8017. ; 22:4, pp. 419-428
  • Journal article (peer-reviewed), abstract:
    • The Exponentially Weighted Moving Average (EWMA) control chart has mainly been used to monitor continuous data, usually under the normality assumption. In addition, a number of EWMA control charts have been proposed for Poisson data. Here, however, we suggest applying the EWMA to hypergeometric data originating from a multivariate Bernoulli process. The problem studied in this paper concerns the wear-out of electronics testers resulting in unnecessary and costly repairs of electronic units. Assuming that the testing process is in statistical control, although the quality of the tested units is not, we can detect the wear-out of a tester by finding assignable causes of variation in that tester. This reasoning forms the basis of a new EWMA procedure designed to detect shifts in a Bernoulli process in an out-of-control environment. Copyright © 2005 John Wiley & Sons, Ltd.
15.
  • Dehlendorff, Christian, et al. (author)
  • Analysis of computer experiments with multiple noise sources
  • 2010
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 26:2, pp. 137-146
  • Journal article (peer-reviewed), abstract:
    • In this paper we present a modeling framework for analyzing computer models with two types of variations. The paper is based on a case study of an orthopedic surgical unit, which has both controllable and uncontrollable factors. Our results show that this structure of variation can be modeled effectively with linear mixed effects models and generalized additive models.
16.
  • Frumosu, Flavia D., et al. (author)
  • Big data analytics using semi‐supervised learning methods
  • 2018
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 34:7, pp. 1413-1423
  • Journal article (peer-reviewed), abstract:
    • The expanding availability of complex data structures requires development of new analysis methods for process understanding and monitoring. In manufacturing, this is primarily due to high‐frequency and high‐dimensional data available through automated data collection schemes and sensors. However, particularly for fast production rate situations, data on the quality characteristics of the process output tend to be scarcer than the available process data. There has been a considerable effort in incorporating latent structure–based methods in the context of complex data. The research question addressed in this paper is to make use of latent structure–based methods in the pursuit of better predictions using all available data including the process data for which there are no corresponding output measurements, i.e., unlabeled data. Inspiration for the research question comes from an industrial setting where there is a need for prediction with extremely low tolerances. A semi‐supervised principal component regression method is compared against benchmark latent structure–based methods, principal components regression, and partial least squares, on simulated and experimental data. In the analysis, we show the circumstances in which it becomes more advantageous to use the semi‐supervised principal component regression over these competing methods.
17.
  • Frumosu, Flavia D., et al. (author)
  • Outliers detection using an iterative strategy for semi‐supervised learning
  • 2019
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 35:5, pp. 1408-1423
  • Journal article (peer-reviewed), abstract:
    • As a direct consequence of production systems' digitalization, high‐frequency and high‐dimensional data has become more easily available. In terms of data analysis, latent structures‐based methods are often employed when analyzing multivariate and complex data. However, these methods are designed for supervised learning problems when sufficient labeled data are available. Particularly for fast production rates, quality characteristics data tend to be scarcer than available process data generated through multiple sensors and automated data collection schemes. One way to overcome the problem of scarce outputs is to employ semi‐supervised learning methods, which use both labeled and unlabeled data. It has been shown that it is advantageous to use a semi‐supervised approach in case of labeled data and unlabeled data coming from the same distribution. In real applications, there is a chance that unlabeled data contain outliers or even a drift in the process, which will affect the performance of the semi‐supervised methods. The research question addressed in this work is how to detect outliers in the unlabeled data set using the scarce labeled data set. An iterative strategy is proposed, combining Hotelling's T2 and Q statistics, and applied using a semi‐supervised principal component regression (SS‐PCR) approach on both simulated and real data sets.
18.
  • Garmabaki, Amir, et al. (author)
  • Reliability Modelling of Multiple Repairable Units
  • 2016
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 32:7, pp. 2329-2343
  • Journal article (peer-reviewed), abstract:
    • This paper proposes a model selection framework for analysing the failure data of multiple repairable units when they are working in different operational and environmental conditions. The paper provides an approach for splitting the non-homogeneous failure data set into homogeneous groups, based on their failure patterns and statistical trend tests. In addition, when the population includes units with an inadequate amount of failure data, the analysts tend to exclude those units from the analysis. A procedure is presented for modelling the reliability of multiple repairable units under the influence of such a group, to prevent parameter estimation errors. We illustrate the implementation of the proposed model by applying it to 12 frequency converters in the Swedish railway system. The results of the case study show that the reliability model of multiple repairable units within a large fleet may consist of a mixture of different stochastic models, i.e. the HPP/RP, TRP, NHPP and BPP. Therefore, relying only on a single model to represent the behaviour of the whole fleet may not be valid and may lead to incorrect parameter estimates.
19.
  • Graves, Spencer B., et al. (author)
  • Accelerated testing of on-board diagnostics
  • 2007
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 23:2, pp. 189-201
  • Journal article (peer-reviewed), abstract:
    • Modern products frequently feature monitors designed to detect actual or impending malfunctions. False alarms (Type I errors) or excessive delays in detecting real malfunctions (Type II errors) can seriously reduce monitor utility. Sound engineering practice includes physical evaluation of error rates. Type II error rates are relatively easy to evaluate empirically. However, adequate evaluation of a low Type I error rate is difficult without using accelerated testing concepts, inducing false alarms using artificially low thresholds and then selecting production thresholds by appropriate extrapolation, as outlined here. This acceleration methodology allows for informed determination of detection thresholds and confidence in monitor performance with substantial reductions over current alternatives in time and cost required for monitor development.
20.
  • Gremyr, Ida, 1975, et al. (author)
  • Principles of Robust Design Methodology
  • 2008
  • In: Quality and Reliability Engineering International. - : Wiley. - 1099-1638 .- 0748-8017. ; 24:1, pp. 23-35
  • Journal article (peer-reviewed), abstract:
    • The literature on robust design has focused chiefly on the development of methods for identifying robust design solutions. In this paper we present a literature review of conflicts and agreements on the principles of robust design. Through this review four central principles of robust design are identified: awareness of variation, insensitivity to noise factors, application of various methods, and application in all stages of a design process. These principles are condensed into the following definition of robust design methodology: Robust design methodology means systematic efforts to achieve insensitivity to noise factors. These efforts are founded on an awareness of variation and can be applied in all stages of product design.
21.
  • Gupta, Shilpa D., et al. (author)
  • Analysis of signal-response systems using generalized linear mixed models
  • 2010
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 26:4, pp. 375-385
  • Journal article (peer-reviewed), abstract:
    • Robust parameter design is one of the important tools used in Design for Six Sigma. In this article, we present an application of the generalized linear mixed model (GLMM) approach to robust design and analysis of signal-response systems. We propose a split-plot approach to the signal-response system characterized by two variance components: within-profile variance and between-profile variance. We demonstrate that explicit modeling of variance components using GLMMs leads to more precise point estimates of important model coefficients with shorter confidence intervals.
22.
  • Gustafson, Anna, et al. (author)
  • Reliability analysis and comparison between automatic and manual load haul dump machines
  • 2015
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 31:3, pp. 523-531
  • Journal article (peer-reviewed), abstract:
    • Today's trend of replacing manually operated vehicles with automated ones will have an impact not only on machine design, working environment and procedures but also on machine breakdowns and maintenance procedures. In the harsh environment of underground mines, the transition from manual to automatic operation is believed to fundamentally change the basis for breakdowns, maintenance and machine design. In this paper, differences and similarities between manual and automatic underground loading equipment are analysed from a reliability point of view. The analysis is based on a case study performed at a Swedish underground mine. Contrary to common belief, this paper shows that there is a difference between the manual and semi-automatic machines, in particular for the transmission, in favour of the manual one. This paper also shows a path for detailed reliability analysis, and the results may be used for improving maintenance programmes for other types of mobile equipment.
23.
  • Johannesson, Pär, 1969, et al. (author)
  • A Robustness Approach to Reliability
  • 2013
  • In: Quality and Reliability Engineering International. - : Wiley. - 1099-1638 .- 0748-8017. ; 29:1, pp. 17-32
  • Research review (peer-reviewed), abstract:
    • Reliability of products is here regarded with respect to failure avoidance rather than probability of failure. To avoid failures, we emphasize variation and suggest some powerful tools for handling failures due to variation. Thus, instead of technical calculation of probabilities from data that usually are too weak for correct results, we emphasize the statistical thinking that puts the designer's focus on the critical product functions. Making the design insensitive to unavoidable variation is called robust design and is handled by (i) identification and classification of variation, (ii) design of experiments to find robust solutions, and (iii) statistically based estimations of proper safety margins. Extensions of the classical failure mode and effect analysis (FMEA) are presented. The first extension consists of identifying failure modes caused by variation in the traditional bottom-up FMEA analysis. The second, variation mode and effect analysis (VMEA), is a top-down analysis, taking the product characteristics as a starting point and analyzing how sensitive these characteristics are to variation. In cases when there is sufficient detailed information of potential failure causes, the VMEA can be applied in its most advanced mode, the probabilistic VMEA. Variation is then measured as statistical standard deviations, and sensitivities are measured as partial derivatives. This method gives the opportunity to dimension tolerances and safety margins to avoid failures caused by both unavoidable variation and lack of knowledge regarding failure processes.
24.
  • Johannesson, Pär, 1969, et al. (author)
  • Variation mode and effect analysis: an application to fatigue life prediction
  • 2009
  • In: Quality and Reliability Engineering International. - : Wiley. - 0748-8017 .- 1099-1638. ; 25:2, pp. 167-179
  • Journal article (peer-reviewed), abstract:
    • We present an application of the probabilistic branch of variation mode and effect analysis (VMEA) implemented as a first-order, second-moment reliability method. First order means that the failure function is approximated to be linear around the nominal values with respect to the main influencing variables, while second moment means that only means and variances are taken into account in the statistical procedure. We study the fatigue life of a jet engine component and aim at a safety margin that takes all sources of prediction uncertainties into account. Scatter is defined as random variation due to natural causes, such as non-homogeneous material, geometry variation within tolerances, load variation in usage, and other uncontrolled variations. Other uncertainties are unknown systematic errors, such as model errors in the numerical calculation of fatigue life, statistical errors in estimates of parameters, and unknown usage profile. By treating also systematic errors as random variables, the whole safety margin problem is put into a common framework of second-order statistics. The final estimated prediction variance of the logarithmic life is obtained by summing the variance contributions of all sources of scatter and other uncertainties, and it represents the total uncertainty in the life prediction. Motivated by the central limit theorem, this logarithmic life random variable may be regarded as normally distributed, which gives possibilities to calculate relevant safety margins.
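The first-order, second-moment step described above amounts to summing squared sensitivity-weighted standard deviations. A minimal sketch, with purely hypothetical sensitivities and uncertainty sources (the numbers are not from the paper):

```python
import math

def prediction_variance(sensitivities, std_devs):
    """First-order, second-moment approximation:
    Var(f) ~= sum_i (df/dx_i)^2 * sigma_i^2,
    summing contributions from all scatter and uncertainty sources."""
    return sum((c * s) ** 2 for c, s in zip(sensitivities, std_devs))

# Hypothetical sensitivities (partial derivatives of log life) and
# standard deviations for three variation sources
var_log_life = prediction_variance([1.0, -2.5, 0.8], [0.10, 0.05, 0.20])
total_sd = math.sqrt(var_log_life)  # feeds the safety-margin calculation
```

With log life treated as approximately normal, a safety margin can then be set as a multiple of `total_sd`.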
25.
  •  
26.
  • Kansal, Y., et al. (author)
  • Coverage-based vulnerability discovery modeling to optimize disclosure time using multiattribute approach
  • 2019
  • In: Quality and Reliability Engineering International. - : John Wiley & Sons. - 0748-8017 .- 1099-1638. ; 35:1, pp. 62-73
  • Journal article (peer-reviewed), abstract:
    • Trends in software vulnerabilities over time have been modeled by various researchers in recent years, but none of these models consider an operational coverage function in vulnerability discovery modeling. In this research paper, we propose a generalized statistical model that determines the relationship between the operational coverage function and the number of expected vulnerabilities. During the operational phase, possible vulnerable sites are covered and vulnerabilities present at a particular site are discovered with some probability. We assume that the proposed model follows nonhomogeneous Poisson process properties; thus, different distributions are used to formulate the model. The numerical illustration shows that the proposed model performs better and fits the Google Chrome data well. The second focus of this research paper is to evaluate the total cost incurred by the developer after software release and to identify the optimal vulnerability disclosure time through a multiobjective utility function. The proposed vulnerability discovery model helps in this optimization. The optimal disclosure time depends on the combined effect of cost, risk, and effort.
  •  
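The abstract's core idea, that expected discoveries track operational coverage under NHPP assumptions, can be sketched with an exponential coverage function. The functional form and every parameter below are illustrative assumptions, not the model fitted in the paper.

```python
import math

def coverage(t, b=0.05):
    """Fraction of possible vulnerable sites covered by time t (assumed form)."""
    return 1.0 - math.exp(-b * t)

def expected_vulns(t, a=100.0, p=0.8, b=0.05):
    """Expected discoveries: a sites * coverage * per-site detection probability."""
    return a * p * coverage(t, b)

# The expected count grows toward the asymptote a*p = 80 as coverage -> 1.
counts = [expected_vulns(t) for t in (0, 10, 50, 200)]
```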
27.
  • Kulahci, Murat (author)
  • Blocking two-level factorial experiments
  • 2007
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 23:3, pp. 283-289
  • Journal article (peer-reviewed). Abstract:
    • Blocking is commonly used in experimental design to eliminate unwanted variation by creating more homogeneous conditions for experimental treatments within each block. While it has been standard practice in experimental design, blocking fractional factorials still presents many challenges due to differences between treatment and blocking variables. Lately, new design criteria such as the total number of clear effects and fractional resolution have been proposed for designing blocked two-level fractional factorial experiments. This article presents a flexible matrix representation for two-level fractional factorials that will allow experimenters and software developers to block such experiments based on any design criterion that is suitable for the experimental conditions.
  •  
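A minimal sketch of the classical construction behind such blocking schemes: assign runs to blocks by the sign of a high-order interaction column, so the block effect is confounded with that interaction rather than with any main effect. The paper's matrix representation generalizes this; the sketch below is just the textbook special case.

```python
from itertools import product

# Block a 2^3 full factorial on the ABC interaction contrast: runs with
# the same sign of the ABC column land in the same block, so the block
# effect is confounded with ABC, not with the main effects.
runs = list(product((-1, 1), repeat=3))
blocks = {-1: [], 1: []}
for a, b, c in runs:
    blocks[a * b * c].append((a, b, c))  # ABC defining contrast

# Each block is a half of 4 runs; within a block every main-effect
# column still has two +1s and two -1s, so the main effects stay clear.
sizes = {k: len(v) for k, v in blocks.items()}
```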
28.
  • Kulahci, Murat, et al. (author)
  • Partial confounding and projective properties of Plackett-Burman designs
  • 2007
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 23:7, pp. 791-800
  • Journal article (peer-reviewed). Abstract:
    • Screening experiments are typically used when attempting to identify a few active factors in a larger pool of potentially significant factors. In general, two-level regular factorial designs are used, but Plackett-Burman (PB) designs provide a useful alternative. Although PB designs are run-efficient, they confound the main effects with fractions of strings of two-factor interactions, making the analysis difficult. However, recent discoveries regarding the projective properties of PB designs suggest that if only a few factors are active, the original design can be reduced to a full factorial, with additional trials frequently forming attractive patterns. In this paper, we show that there is a close relationship between the partial confounding in certain PB designs and their projective properties. With the aid of examples, we demonstrate how this relationship may help experimenters better appreciate the use of PB designs.
  •  
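The projective property the abstract builds on can be checked directly: generate the 12-run Plackett-Burman design cyclically and project it onto any three columns. The sketch below uses the standard PB12 generator row.

```python
# The 12-run Plackett-Burman design: 11 cyclic shifts of the standard
# generator row plus a row of all -1s. Projected onto any 3 columns it
# covers all 8 corners of the 2^3 cube (a full factorial plus a
# half-replicate) -- the projective property the abstract refers to.
gen = [1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1]  # standard PB12 first row
rows = [gen[i:] + gen[:i] for i in range(11)] + [[-1] * 11]

proj = [(r[0], r[1], r[2]) for r in rows]  # project on three columns
distinct = set(proj)
# 8 distinct corners; 4 of them appear twice (12 = 8 + 4).
```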
29.
  • Kulahci, Murat, et al. (author)
  • Special Issue: Design for Six Sigma
  • 2010
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 26:4 (Special Issue), pp. 315-
  • Journal article (other scholarly/artistic)
  •  
30.
  • Kumar, Dhananjay (author)
  • Proportional hazards modelling of repairable systems
  • 1995
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 11:5, pp. 361-369
  • Journal article (peer-reviewed). Abstract:
    • The purpose of this paper is to illustrate some situations under which the proportional hazards model (PHM) and its extensions can be used for identification of the most important covariates influencing a repairable system. First of all an overview of the application of the PHM in engineering is presented. Then the concepts of the PHM and its extensions, such as stratified PHM, PHM in the case of nonhomogeneous Poisson processes and PHM in the case of jumps in the hazard rate or different intensity function at failures of a large number of copies of a repairable system, are presented. Selection of a suitable extension of the PHM for given data on the basis of residual plots is also discussed. Finally applications of the PHM and its extensions are illustrated with a suitable example. Only the semiparametric method has been considered. The assumptions made in the PHM for the analysis of repairable systems have been explained graphically as far as possible. Perfect, minimal or imperfect repairs carried out on repairable systems can be taken into consideration for the reliability analysis using the PHM.
  •  
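The PHM structure the abstract surveys can be sketched in a few lines: the hazard is a baseline function of time scaled by exp(β·z), so the ratio of hazards for two covariate profiles is constant in time. The baseline, covariates, and coefficients below are illustrative assumptions, not values from the paper.

```python
import math

# Proportional hazards model (PHM): baseline hazard scaled by exp(beta . z),
# where z holds covariates of the repairable system (e.g. load, environment).
def hazard(t, z, beta, baseline=lambda t: 0.01 * t):
    return baseline(t) * math.exp(sum(b * zi for b, zi in zip(beta, z)))

beta = [0.7, -0.3]                       # invented covariate effects
h_heavy = hazard(100.0, [1, 0], beta)    # heavy load, no monitoring
h_light = hazard(100.0, [0, 1], beta)    # light load, with monitoring

# The hazard ratio is free of t -- the defining PHM property:
ratio = h_heavy / h_light                # exp(0.7 - (-0.3)) = e
```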
31.
  • Larsson Turtola, Simon, et al. (author)
  • Integrating mixture experiments and Six Sigma methodology to improve fibre-reinforced polymer composites
  • 2022
  • In: Quality and Reliability Engineering International (John Wiley & Sons), ISSN 0748-8017, 1099-1638; 38:4, pp. 2233-2254
  • Journal article (peer-reviewed). Abstract:
    • This article illustrates a Six Sigma project aimed at reducing manufacturing-induced visual deviations for fibre-reinforced polymer (FRP) composites. For a European composites manufacturer, such visual deviations lead to scrapping of cylindrical composite bodies and subsequent environmental impact. The composite bodies are manufactured through vacuum infusion, where a resin mixture impregnates a fibreglass preform and cures, transforming from liquid to solid state. We illustrate the define-measure-analyse-improve-control (DMAIC) steps of the Six Sigma project. Specific emphasis is placed on the measure and analyse steps featuring a 36-run computer-generated mixture experiment with six resin mixture components and six responses. Experimental analysis establishes causal relationships between mixture components and correlated resin characteristics, which can be used to control resin characteristics. Two new resin mixtures were developed and tested in the improve step using the understanding developed in previous steps. Manufacturing-induced visual deviations were greatly reduced by adjusting the resin mixture to induce a slower curing process. Further refinement of the mixture was made in the control step. A production scrap rate of 5% due to visual deviations was measured during a monitoring period of 5 months after the resin mixture change. The scrap rate was substantially improved compared to the historical level (60%). The successful experimental investigation integrated in this Six Sigma project is expected to generate increased quality, competitiveness, and substantial savings.
  •  
32.
  •  
33.
  • Li, Jing, et al. (author)
  • Editorial: A Special Issue on Data Mining
  • 2014
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 30:6, pp. 813-
  • Journal article (other scholarly/artistic)
  •  
34.
  • Lin, Jing, et al. (author)
  • Reliability analysis for degradation of locomotive wheels using parametric Bayesian approach
  • 2014
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 30:5, pp. 657-667
  • Journal article (peer-reviewed). Abstract:
    • This paper undertakes a reliability study using a Bayesian survival analysis framework to explore the impact of a locomotive wheel's installed position on its service lifetime and to predict its reliability characteristics. The Bayesian Exponential Regression Model, Bayesian Weibull Regression Model and Bayesian Log-normal Regression Model are used to analyze the lifetime of locomotive wheels using degradation data and taking into account the position of the wheel. This position is described by three different discrete covariates: the bogie, the axle and the side of the locomotive where the wheel is mounted. The goal is to determine reliability, failure distribution and optimal maintenance strategies for the wheel. The results show that: (i) under specified assumptions and a given topography, the position of the locomotive wheel could influence its reliability and lifetime; (ii) the Bayesian Log-normal Regression Model is a useful tool.
  •  
35.
  • Lindgren, Mats, et al. (author)
  • Thermal and thermo-mechanical analysis for design evaluation of an automotive radar module
  • 2004
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 20:7, pp. 709-726
  • Journal article (peer-reviewed). Abstract:
    • The influence of the substrate technology, assembly method, and housing material on the thermal, thermo-mechanical and cost performance of a radar module for automotive applications has been studied to address the product reliability aspects during the design phase. Flip chip and wire bonding have been evaluated for Multi-Chip Module-Laminate/Deposition (MCM-L/D) and Multi-Chip Module-Deposition (MCM-D) substrate technologies used for electronic packaging solutions in a harsh environment. Solder ball and direct attachment have been investigated as second-level assembly. As a result of thermal and thermo-mechanical simulations and cost analysis, radar module designs combining MCM-D and MCM-L/D with wire bonding have been identified that are preferable for use in different temperature environments with respect to two performance criteria: the maximum junction temperature and the manufacturing cost. Simulation-based guidelines have been developed for designing radar modules used in automotive applications while satisfying the temperature and stress constraints provided for the module.
  •  
36.
  • Lorén, Sara, et al. (author)
  • Second moment reliability evaluation vs. Monte Carlo simulations for weld fatigue strength
  • 2012
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 28:8, pp. 887-896
  • Journal article (peer-reviewed). Abstract:
    • Monte Carlo simulations have become very popular in industrial applications as a tool to study variational influences on reliability assessments. The method is appealing because it can be done without any statistical knowledge and produces results that appear very informative. However, in most cases, the information gathered is no more than a complicated transformation of initial guesses because the statistical distributions of the dominating variational influences are unknown. The seemingly informative result may then be highly misleading, in particular, when the user lacks sufficient statistical knowledge. Instead, in cases where the input knowledge of the distributional properties is vague, it may be better to use a reliability method based on the actual knowledge, often not more than second moment characteristics. This can easily be done by using a method, based on variances, covariances, and sensitivity coefficients. Here, a specific problem of fatigue life of a welded structure is studied by (i) a Monte Carlo simulation method and (ii) a second moment method. Both methods are evaluated on a fatigue strain-life approach and use experimental data showing variation in weld geometry and material strength parameters. The two methods are compared and discussed in view of the engineering problem of reliability with respect to fatigue damage.
  •  
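The paper's comparison can be illustrated on a toy fatigue model log N = b0 − b1·log S: the second-moment method propagates only variances through sensitivities, while Monte Carlo samples assumed input distributions. All distributions and numbers below are invented for the sketch, not the paper's weld data.

```python
import math
import random
import statistics

# Compare (i) Monte Carlo and (ii) a second-moment evaluation of the
# scatter in a fatigue log-life model log N = b0 - b1 * log S.
random.seed(1)
b0_mean, b0_sd = 20.0, 0.5      # intercept with material scatter (invented)
b1 = 3.0                        # Basquin-type slope, taken as fixed
logS_mean, logS_sd = 5.0, 0.1   # load level on log scale (invented)

# (ii) Second-moment: variances add through the (linear) sensitivities.
sm_sd = math.sqrt(b0_sd ** 2 + (b1 * logS_sd) ** 2)

# (i) Monte Carlo: sample the inputs and look at the empirical spread.
draws = [random.gauss(b0_mean, b0_sd) - b1 * random.gauss(logS_mean, logS_sd)
         for _ in range(20000)]
mc_sd = statistics.stdev(draws)
# For a model linear in its inputs the two agree closely; the abstract's
# point is that MC adds little when the input distributions are guesses.
```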
37.
  • Mahmoudi, Jafar (author)
  • SIL analysis of subsea control system components based on a typical OREDA database
  • 2021
  • In: Quality and Reliability Engineering International (John Wiley and Sons Ltd), ISSN 0748-8017, 1099-1638; 37:8, pp. 3297-3313
  • Journal article (peer-reviewed). Abstract:
    • Proper performance evaluation of subsea system components is highly significant for reliable operation and for remote monitoring or replacement of components before any failure occurs. As part of a subsea system, the subsea control system (SCS) plays a key role in achieving reliable performance; knowledge of component failure rates is therefore essential in the safety analysis of an SCS. To the author's knowledge, little work has been done on the safety analysis of SCSs using failure rates for a multitude of components, and few published studies are based on industrial work. This paper aims to make a noticeable contribution towards filling that gap. For this purpose, a safety integrity level (SIL) analysis is proposed based on a typical OREDA database. In implementing the proposed SIL analysis, a failure mode classification table is provided for a selection of SCS components. This is followed by the estimation of several parameters, such as the total time in service, and by obtaining the values of critical failure rates. The analysis indicates that signal failure is the most frequently occurring failure mode and that the subsea electronic module yields the highest critical failure rate. In addition, parameter values are compared for two different versions of the utilized database.
  •  
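A minimal sketch of the failure-rate arithmetic underlying such a SIL analysis, assuming the standard low-demand formulas of IEC 61508; the service hours, failure count, and proof-test interval below are illustrative, not OREDA values.

```python
# Critical failure rate = critical failures / total time in service, and
# PFDavg = lambda_DU * tau / 2 for a single channel under periodic proof
# testing (low-demand mode). All numbers are invented for illustration.
hours_in_service = 3.5e6        # aggregated over the component fleet
critical_failures = 7           # e.g. dangerous undetected failures

lam_du = critical_failures / hours_in_service  # per hour
tau = 8760.0                                   # proof-test interval: 1 year
pfd_avg = lam_du * tau / 2

def sil_band(pfd):
    """Low-demand PFDavg bands of IEC 61508 (values below 1e-5 are
    shown as SIL 4 here for simplicity)."""
    if pfd < 1e-4:
        return 4
    if pfd < 1e-3:
        return 3
    if pfd < 1e-2:
        return 2
    if pfd < 1e-1:
        return 1
    return 0  # PFD too high for any SIL
```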
38.
  • Montgomery, Douglas C., et al. (author)
  • Estimation of missing observations in two-level split-plot designs
  • 2008
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 24:2, pp. 127-152
  • Journal article (peer-reviewed). Abstract:
    • Inserting estimates for the missing observations from split-plot designs restores their balanced or orthogonal structure and alleviates the difficulties in the statistical analysis. In this article, we extend a method due to Draper and Stoneman to estimate the missing observations from unreplicated two-level factorial and fractional factorial split-plot (FSP and FFSP) designs. The missing observations, which can be from the same whole plot, from different whole plots, or comprise entire whole plots, are estimated by equating to zero a number of specific contrast columns equal to the number of missing observations. These estimates are inserted into the design table, and the estimates for the remaining effects (or alias chains of effects, as is the case with FFSP designs) are plotted on two half-normal plots: one for the whole-plot effects and the other for the subplot effects. If the smaller effects do not point at the origin, contrast columns different from some or all of the initial ones should be used instead and the plots re-examined for bias. Using examples, we show how the method provides estimates for the missing observations that are very close to their actual values.
  •  
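The estimation idea can be sketched for a single missing run in a 2³ design: force the (assumed negligible) ABC contrast to zero and solve for the missing value. The response model below is simulated purely for illustration.

```python
from itertools import product

# Draper-Stoneman idea: a missing observation in an unreplicated
# two-level factorial is estimated by forcing a presumed-negligible
# high-order interaction contrast to zero and solving for it.
runs = list(product((-1, 1), repeat=3))

def true_y(a, b, c):  # simulated response with no ABC interaction
    return 10 + 2 * a - 1.5 * b + 0.5 * c

ys = [true_y(*r) for r in runs]
miss = 5                                  # pretend this run was lost
sign = [a * b * c for a, b, c in runs]    # ABC contrast column

# Set the ABC contrast to zero: sum(sign_i * y_i) = 0, solve for y_miss.
known = sum(s * y for i, (s, y) in enumerate(zip(sign, ys)) if i != miss)
y_hat = -known / sign[miss]
# Because the simulated ABC effect is exactly zero, y_hat recovers the
# lost observation exactly.
```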
39.
  • Pavasson, Jonas, et al. (author)
  • Reliability Prediction Based on Variation Mode and Effect Analysis
  • 2013
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 29:5, pp. 699-708
  • Journal article (peer-reviewed). Abstract:
    • The possibility of predicting the reliability of hardware for both components and systems is important in engineering design. Today, there are several methods for predicting the reliability of hardware systems and for identifying the causes of failure and failure modes, for example, fault tree analysis and failure mode and effect analysis. Many failures are caused by variations resulting in a substantial effect on safety or functional requirements. To identify, to assess and to manage unwanted sources of variation, a method called probabilistic variation mode and effect analysis (VMEA) has been developed. With a prescribed reliability, VMEA can be used to derive safety factors in different applications. However, there are few reports on how to derive the reliability based on probabilistic VMEA, especially for transmission clutch shafts. Hence, the objective of this article was to show how to derive system reliability based on probabilistic VMEA. In particular, wheel loader automatic transmission clutch shaft reliability is investigated to show how different sources of variation affect reliability. In this article, a new method for predicting system reliability based on probabilistic VMEA is proposed. The method is further verified by a case study on a clutch shaft. It is shown that the reliability of the clutch shaft was close to 1.0 and that the most significant variation contribution was due to mean radius of the friction surface and friction of the disc.
  •  
40.
  •  
41.
  • Raharjo, Hendry, 1978, et al. (author)
  • On integrating Kano's model dynamics into QFD for multiple product design
  • 2010
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 26:4, pp. 351-363
  • Journal article (peer-reviewed). Abstract:
    • The Design for Six Sigma (DFSS) framework places a strong emphasis on conforming to customer needs early in the design phase. The key problem in a rapidly changing environment, driven by the influx of new technology and innovation, is that things may become obsolete faster than ever: what now delights the customer will soon become an expected need. Such dynamics are, unfortunately, very often overlooked and have not been adequately addressed in the literature. To fill this niche, this paper proposes a methodology to advance the use of Quality Function Deployment (QFD), one of the most widely accepted tools in product and service design, with respect to the dynamics of Kano's model. Specifically, based on information from a Kano questionnaire, it provides a quantitative approach to observe and follow change over time. Not only can it show how strongly a certain Kano category changes over time, but it can also forecast future needs, which is useful for tackling changes in customer preferences during the product creation process. The forecasted customer needs can then be used within an optimization framework for multiple product design. An illustrative example is provided to give some practical insights.
  •  
42.
  • Rotari, Marta, et al. (author)
  • Variable selection wrapper in presence of correlated input variables for random forest models
  • 2024
  • In: Quality and Reliability Engineering International (John Wiley & Sons), ISSN 0748-8017, 1099-1638; 40:1, pp. 297-312
  • Journal article (peer-reviewed). Abstract:
    • In most data analytic applications in manufacturing, understanding the data-driven models plays a crucial role in complementing the engineering knowledge about the production process. Identifying relevant input variables, rather than only predicting the response through some “black-box” model, is of great interest in many applications. There is, therefore, a growing focus on describing the contributions of the input variables to the model in the form of “variable importance”, which is readily available in certain machine learning methods such as random forest (RF). Once a ranking based on the importance measure of the variables is established, the question arises of how many variables are truly relevant in predicting the output variable. In this study, we focus on the Boruta algorithm, which is a wrapper around the RF model. It is a variable selection tool that assesses the variable importance measure for the RF model. It has been previously shown in the literature that correlation among the input variables, which is a common occurrence in high-dimensional data, distorts and overestimates the importance of variables. The Boruta algorithm is also affected by this, resulting in a larger set of input variables deemed important. To overcome this issue, we propose an extension of the Boruta algorithm for correlated data by exploiting the conditional importance measure. This extension greatly improves the Boruta algorithm in the case of high correlation among variables and provides a more precise ranking of the variables that significantly contribute to the response. We believe this approach can be used in many industrial applications by providing more transparency and understanding of the process.
  •  
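The shadow-feature mechanism at the heart of Boruta can be sketched without a random forest: append shuffled copies of each input, score everything with an importance measure, and keep only inputs that beat the best shadow. Plain |correlation| stands in for the RF importance here purely to keep the sketch self-contained; it does not reproduce the conditional-importance extension the paper proposes.

```python
import random

# Boruta-style screening sketch: shuffled "shadow" copies of the inputs
# set the bar that an informative input must clear.
random.seed(7)
n = 400
x_signal = [random.gauss(0, 1) for _ in range(n)]
x_noise = [random.gauss(0, 1) for _ in range(n)]
y = [xs + random.gauss(0, 0.5) for xs in x_signal]

def corr(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

features = {"signal": x_signal, "noise": x_noise}
# Shadows: same values, randomly permuted, so any apparent importance
# they show is pure chance.
shadows = {name: random.sample(col, len(col)) for name, col in features.items()}

threshold = max(abs(corr(col, y)) for col in shadows.values())
kept = [name for name, col in features.items() if abs(corr(col, y)) > threshold]
```

A real Boruta run repeats this over many RF fits and uses a statistical test on the hit counts; this single pass only shows the shadow-thresholding idea.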
43.
  • Sabahno, Hamed, et al. (author)
  • A variable parameters multivariate control chart for simultaneous monitoring of the process mean and variability with measurement errors
  • 2020
  • In: Quality and Reliability Engineering International (John Wiley & Sons), ISSN 0748-8017, 1099-1638; 36:4, pp. 1161-1196
  • Journal article (peer-reviewed). Abstract:
    • Evaluating the effect of measurement errors on either adaptive or simultaneous control charts has been a topic of interest in recent years, but their effect on control charts that are both adaptive and simultaneous has not yet been considered. In this paper, through extensive numerical studies, we evaluate the effect of measurement errors on an adaptive (variable parameters) simultaneous multivariate control chart for the mean vector and the variance-covariance matrix of p quality characteristics assumed to follow a multivariate normal distribution. To do so, (a) we use eight performance measures computed with a Markov chain model, (b) we consider the effects of multiple measurements as well as the error model's parameters, and (c) we consider the overall performance of this adaptive simultaneous chart, including optimization of the chart parameter values, none of which has previously been considered for this scheme. Finally, a real case is presented to illustrate the proposed scheme.
  •  
44.
  • Sabahno, Hamed, et al. (author)
  • Evaluating the effect of measurement errors on the performance of the variable sampling intervals Hotelling's T2 control charts
  • 2018
  • In: Quality and Reliability Engineering International (John Wiley & Sons), ISSN 0748-8017, 1099-1638; 34:8, pp. 1785-1799
  • Journal article (peer-reviewed). Abstract:
    • The effect of measurement errors on the performance of multivariate adaptive control charts has not yet been considered. In this article, we investigate the effect of measurement errors on the performance of the variable sampling intervals (VSI) Hotelling's T2 control chart in the case of known parameters. A linearly covariate error model is used as the measurement error function. To measure the chart's performance, we use the average time to signal criterion, obtained from a Markov chain model. Through a numerical analysis, we evaluate the negative effect of measurement errors on the performance of the VSI Hotelling's T2 control chart, and we also investigate the effect of multiple measurements as well as the value of the linearly covariate error model's parameters on the main properties of the chart. Finally, we present an illustrative example.
  •  
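The average time to signal (ATS) used to evaluate VSI charts comes from a standard absorbing Markov-chain calculation: with Q the transition matrix over transient states and d the vector of sampling intervals, ATS = s'(I - Q)^{-1} d. A two-state sketch with invented transition probabilities:

```python
# Transient states = {central region, warning region}, each with its own
# sampling interval (the VSI rule); absorption = an out-of-control
# signal. All probabilities and intervals below are illustrative.
q = [[0.90, 0.07],   # from central: stay central / move to warning
     [0.60, 0.30]]   # from warning: back to central / stay warning
d = [1.0, 0.25]      # long interval in the center, short near the limits
start = [1.0, 0.0]   # begin in the central region

# ATS = start' (I - Q)^{-1} d, solved directly for the 2x2 case.
a, b = 1 - q[0][0], -q[0][1]
c, e = -q[1][0], 1 - q[1][1]
det = a * e - b * c
inv = [[e / det, -b / det], [-c / det, a / det]]  # (I - Q)^{-1}
n_visits = [sum(start[k] * inv[k][j] for k in range(2)) for j in range(2)]
ats = sum(n_visits[j] * d[j] for j in range(2))   # expected time to signal
```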
45.
  •  
46.
  • Sedghi, Mahdieh, 1984-, et al. (author)
  • Data-driven maintenance planning and scheduling based on predicted railway track condition
  • 2022
  • In: Quality and Reliability Engineering International (John Wiley & Sons), ISSN 0748-8017, 1099-1638; 38:7, pp. 3689-3709
  • Journal article (peer-reviewed). Abstract:
    • Timely planning and scheduling of railway infrastructure maintenance interventions are crucial for increased safety, improved availability, and reduced cost. We propose a data-driven decision-support framework integrating track condition predictions with tactical maintenance planning and operational scheduling. The framework acknowledges prediction uncertainties by using a Wiener process-based prediction model at the tactical level. We also develop planning and scheduling algorithms at the operational level. One algorithm focuses on cost-optimisation, and one algorithm considers the multi-component characteristics of the railway track by grouping track segments near each other for one maintenance activity. The proposed framework's performance is evaluated using track geometry measurement data from a 34 km railway section in northern Sweden, focusing on the tamping maintenance action. We analyse maintenance costs and demonstrate potential efficiency increases by applying the decision-support framework.
  •  
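The Wiener-process degradation prediction used at the tactical level in the abstract can be sketched as follows: track geometry degrades with drift μ and diffusion σ, and the probability of exceeding a maintenance limit L by time t follows from the normal marginal of the process. All parameters are illustrative, not fitted values.

```python
import math

# X(t) ~ Normal(x0 + mu*t, sigma^2 * t): Wiener degradation with drift.
x0, mu, sigma = 0.8, 0.05, 0.08  # current level, drift/month, diffusion
L = 1.4                          # intervention limit (e.g. mm std dev)

def p_exceed(t):
    """P(X(t) > L) for the normal marginal of the Wiener process."""
    z = (x0 + mu * t - L) / (sigma * math.sqrt(t))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

# Plan tamping at the first month where the exceedance risk passes 10%,
# i.e. act on predicted condition rather than on a fixed calendar.
due = next(t for t in range(1, 61) if p_exceed(t) > 0.10)
```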
47.
  •  
48.
  •  
49.
  • Tano, Ingrid, 1968-, et al. (author)
  • Comparing Confidence Intervals for Multivariate Process Capability Indices
  • 2012
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 28:4, pp. 481-495
  • Journal article (peer-reviewed). Abstract:
    • Multivariate process capability indices (MPCIs) are needed for process capability analysis when the quality of a process is determined by several univariate quality characteristics that are correlated. There are several different MPCIs described in the literature, but confidence intervals have been derived for only a handful of these. In practice, the conclusion about process capability must be drawn from a random sample. Hence, confidence intervals or tests for MPCIs are important. With a case study as a start and under the assumption of multivariate normality, we review and compare four different available methods for calculating confidence intervals of MPCIs that generalize the univariate index Cp. Two of the methods are based on the ratio of a tolerance region to a process region, and two are based on the principal component analysis. For two of the methods, we derive approximate confidence intervals, which are easy to calculate and can be used for moderate sample sizes. We discuss issues that need to be solved before the studied methods can be applied more generally in practice. For instance, three of the methods have approximate confidence levels only, but no investigation has been carried out on how good these approximations are. Furthermore, we highlight the problem with the correspondence between the index value and the probability of nonconformance. We also elucidate a major drawback with the existing MPCIs on the basis of the principal component analysis. Our investigation shows the need for more research to obtain an MPCI with confidence interval such that conclusions about the process capability can be drawn at a known confidence level and that a stated value of the MPCI limits the probability of nonconformance in a known way. 
  •  
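The univariate index Cp that the multivariate indices in the abstract generalize, together with its chi-square-based confidence interval, can be sketched with the standard library. The chi-square quantiles use the Wilson-Hilferty approximation so the sketch stays self-contained; the specification limits and sample values are illustrative.

```python
import math

# C_p = (USL - LSL) / (6 sigma); since (n-1)s^2/sigma^2 is chi-square,
# the sample index gets a chi-square-based confidence interval.
usl, lsl = 10.0, 4.0
n, s = 50, 0.85  # sample size and sample standard deviation (invented)

cp_hat = (usl - lsl) / (6 * s)

def chi2_quantile(p, df):
    """Wilson-Hilferty approximation to the chi-square quantile."""
    z = {0.025: -1.959964, 0.975: 1.959964}[p]
    h = 2.0 / (9.0 * df)
    return df * (1.0 - h + z * math.sqrt(h)) ** 3

df = n - 1
lower = cp_hat * math.sqrt(chi2_quantile(0.025, df) / df)
upper = cp_hat * math.sqrt(chi2_quantile(0.975, df) / df)
```

The abstract's point carries over: without such an interval, a point estimate of a capability index says nothing about the confidence level of a capability claim.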
50.
  • Tyssedal, John Sølve, et al. (author)
  • Analysis of split-plot designs with mirror image pairs as sub-plots
  • 2005
  • In: Quality and Reliability Engineering International (Wiley), ISSN 0748-8017, 1099-1638; 21:5, pp. 539-551
  • Journal article (peer-reviewed). Abstract:
    • In this article we present a procedure for analyzing split-plot experiments with mirror image pairs as sub-plots when third- and higher-order interactions can be assumed negligible. Although performing a design in a split-plot manner induces correlation among observations, we show that with such designs the essential search for potentially active factors can be done in two steps using ordinary least squares. The suggested procedure is tested on a real example and on two simulated screening examples: one with a split-plot design based on a geometric design and one with a split-plot design based on a non-geometric Plackett and Burman design. The examples also illustrate the advantage of using non-geometric designs, where the effects are partially aliased instead of being fully aliased as in highly fractionated fractional factorials.
  •  