SwePub
Search the SwePub database

Results for the search "WFRF:(Koning Arjan J.)"

  • Results 1-28 of 28
1.
  • Rochman, D., et al. (authors)
  • A Bayesian Monte Carlo method for fission yield covariance information
  • 2016
  • In: Annals of Nuclear Energy. - : Elsevier BV. - 0306-4549 .- 1873-2100. ; 95, p. 125-134
  • Journal article (peer-reviewed) abstract
    • The present work proposes a Bayesian method to combine theoretical fission yields with a set of reference data. These two sources of information are merged using a Monte Carlo process, leading to a so-called Bayesian Monte Carlo update. Examples are presented for the independent fission yields of four major actinides, using the GEF code as the source of theoretical calculations and an evaluated library of fission yields as the reference data. The impact of the updated fission yields and their covariances is shown for two distinct applications: a UO2 pincell with burn-up up to 40 GWd/tHM and decay heat calculations of a thermal neutron pulse on U-235 and Pu-239.
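The Bayesian Monte Carlo update outlined in the abstract above can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: the sample size, yield values and uncertainties below are invented stand-ins for GEF calculations and an evaluated reference library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 1000 random model calculations of two fission yields
# (stand-ins for GEF samples), and reference values with uncertainties
# (stand-ins for an evaluated library). All numbers are illustrative.
samples = rng.normal(loc=[0.060, 0.030], scale=[0.006, 0.004], size=(1000, 2))
ref = np.array([0.058, 0.032])
ref_unc = np.array([0.002, 0.002])

# Bayesian Monte Carlo update: weight each sample by its likelihood
# against the reference data, then form a weighted mean and covariance.
chi2 = np.sum(((samples - ref) / ref_unc) ** 2, axis=1)
w = np.exp(-0.5 * chi2)
w /= w.sum()

post_mean = w @ samples                      # updated fission yields
diff = samples - post_mean
post_cov = (w[:, None] * diff).T @ diff      # updated covariance matrix

print(post_mean)
print(post_cov)
```

The posterior covariance is narrower than the prior spread because the reference data act as a constraint on the theoretical samples.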
2.
  • Rochman, D., et al. (authors)
  • A Statistical Analysis of Evaluated Neutron Resonances with TARES for JEFF-3.3, JENDL-4.0, ENDF/B-VIII.0 and TENDL-2019
  • 2020
  • In: Nuclear Data Sheets. - : ACADEMIC PRESS INC ELSEVIER SCIENCE. - 0090-3752 .- 1095-9904. ; 163, p. 163-190
  • Journal article (peer-reviewed) abstract
    • In this paper, a statistical analysis of resonance evaluations from the most recent nuclear data libraries is performed with the code TARES. A description of the TARES framework is provided, as well as of its use in the production of resonance parameters for the entire TENDL library. Various observables are calculated (e.g., thermal cross sections, capture integrals, but also D-0 and other average resonance quantities) for different libraries and compared to experimental values when available. Finally, for isotopes with little or no information, details are provided on the production of statistical resonances, based on a variety of compound nucleus reaction models.
3.
  • Rochman, Dimitri, et al. (authors)
  • The TENDL library : Hope, reality and future
  • 2017
  • In: ND 2016 Bruges. - Les Ulis : EDP Sciences. - 9782759890200
  • Conference paper (peer-reviewed) abstract
    • The TALYS Evaluated Nuclear Data Library (TENDL) has had 8 releases since 2008. Considerable experience has been acquired in the production of such a general-purpose nuclear data library, based on feedback from users, evaluators and processing experts. The backbone of this achievement is simple and robust: completeness, quality and reproducibility. As TENDL is extensively used in many fields of application, it is necessary to understand its strong points and remaining weaknesses. Alternatively, the essential knowledge is not the TENDL library itself, but rather the necessary methods and tools, making the library a side product and focusing the efforts on the evaluation knowledge. The future of such an approach will be discussed, with the hope of greater success in the near future.
4.
  • Alhassan, Erwin, et al. (authors)
  • Benchmark selection methodology for reactor calculations and nuclear data uncertainty reduction
  • 2015
  • In: Annals of Nuclear Energy. - 0306-4549 .- 1873-2100.
  • Journal article (peer-reviewed) abstract
    • Criticality, reactor physics and shielding benchmarks are expected to play important roles in GEN-IV design, safety analysis and in the validation of analytical tools used to design these reactors. For existing reactor technology, benchmarks are used for validating computer codes and for testing nuclear data libraries. Given the large number of benchmarks available, selecting benchmarks for specific applications can be rather tedious and difficult. Until recently, the selection process has usually been based on expert judgement, which depends on the expertise and experience of the user, thereby introducing a user bias into the process. This approach is also not suitable for the Total Monte Carlo methodology, which lays strong emphasis on automation, reproducibility and quality assurance. In this paper, a method for selecting benchmarks for reactor calculations and for nuclear data uncertainty reduction based on the Total Monte Carlo (TMC) method is presented. For reactor code validation purposes, similarities between a real reactor application and one or several benchmarks are quantified using a similarity index, while the Pearson correlation coefficient is used to select benchmarks for nuclear data uncertainty reduction. Also, a correlation-based sensitivity method is used to identify the sensitivity of benchmarks to particular nuclear reactions. Based on the benchmark selection methodology, two approaches are presented for reducing nuclear data uncertainty using integral benchmark experiments as an additional constraint in the TMC method: a binary accept/reject method and a method of assigning file weights using the likelihood function. Finally, the methods are applied to a full lead-cooled fast reactor core and a set of criticality benchmarks. Significant reductions in Pu-239 and Pb-208 nuclear data uncertainties were obtained after implementing the two methods with some benchmarks.
5.
  • Alhassan, E., et al. (authors)
  • In search of the best nuclear data file for proton induced reactions : Varying both models and their parameters
  • 2020
  • In: ND 2019. - : EDP Sciences. - 9782759891061
  • Conference paper (peer-reviewed) abstract
    • A lot of research work has been carried out on fine-tuning model parameters to reproduce experimental data for neutron-induced reactions. This, however, is not the case for proton-induced reactions, where large deviations still exist between model calculations and experiments for some cross sections. In this work, we present a method for searching both the model and model parameter space in order to identify the 'best' nuclear reaction models, with their parameter sets, that reproduce carefully selected experimental data. Three sets of experimental data from EXFOR are used in this work: (1) cross sections of the target nucleus, (2) cross sections of the residual nuclei and (3) angular distributions. Selected models and their parameters were varied simultaneously to produce a large set of random nuclear data files. The goodness of fit between our adjustments and the experimental data was quantified by computing a global reduced chi-square which took into consideration the experimental data listed above. The method has been applied to the adjustment of proton-induced reactions on Co-59 between 1 and 100 MeV. The adjusted files obtained are compared with available experimental data and with evaluations from other nuclear data libraries.
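The global reduced chi-square used as the goodness-of-fit measure in the abstract above can be illustrated as follows. The function name and the toy numbers are hypothetical; a real implementation would sum over EXFOR data sets of all three types (reaction cross sections, residual-nuclide cross sections and angular distributions).

```python
import numpy as np

def reduced_chi2(datasets):
    """Global reduced chi-square over several experimental data types.

    datasets: list of (calc, exp, unc) array triples, one per data type.
    Returns total chi-square divided by the total number of points.
    """
    chi2, npts = 0.0, 0
    for calc, exp, unc in datasets:
        calc, exp, unc = map(np.asarray, (calc, exp, unc))
        chi2 += np.sum(((calc - exp) / unc) ** 2)
        npts += exp.size
    return chi2 / npts

# Illustrative numbers only: two cross-section points and one
# residual-production point, each one standard deviation off.
cross_sections = ([1.0, 2.0], [1.1, 1.9], [0.1, 0.1])
residual_prod = ([0.50], [0.45], [0.05])
print(reduced_chi2([cross_sections, residual_prod]))  # → 1.0
```

A random file minimizing this quantity would be the 'best' candidate in the sense used by the paper.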
6.
  • Alhassan, Erwin, et al. (authors)
  • Iterative Bayesian Monte Carlo for nuclear data evaluation
  • 2022
  • In: NUCLEAR SCIENCE AND TECHNIQUES. - : Springer Nature. - 1001-8042 .- 2210-3147. ; 33:4
  • Journal article (peer-reviewed) abstract
    • In this work, we explore the use of an iterative Bayesian Monte Carlo (iBMC) method for nuclear data evaluation within a TALYS Evaluated Nuclear Data Library (TENDL) framework. The goal is to probe the model and parameter space of the TALYS code system to find the optimal model and parameter sets that reproduce selected experimental data. The method involves the simultaneous variation of many nuclear reaction models as well as their parameters included in the TALYS code. The 'best' model set with its parameter set was obtained by comparing model calculations with selected experimental data. Three experimental data types were used: (1) reaction cross sections, (2) residual production cross sections, and (3) elastic angular distributions. To improve the fit to experimental data, we update our 'best' parameter set - the file that maximizes the likelihood function - in an iterative fashion. Convergence was determined by monitoring the evolution of the maximum likelihood estimate (MLE) values and was considered reached when the relative change in the MLE over the last two iterations was within 5%. Once the final 'best' file is identified, we infer parameter uncertainties and covariance information for this file by varying the model parameters around it. In this way, we ensure that the parameter distributions are centered on our evaluation. The proposed method was applied to the evaluation of p + Co-59 between 1 and 100 MeV. Finally, the adjusted files were compared with experimental data from the EXFOR database as well as with evaluations from the TENDL-2019, JENDL/He-2007 and JENDL-4.0/HE nuclear data libraries.
7.
  • Alhassan, Erwin, et al. (authors)
  • On the use of integral experiments for uncertainty reduction of reactor macroscopic parameters within the TMC methodology
  • 2016
  • In: Progress in nuclear energy (New series). - : Elsevier BV. - 0149-1970 .- 1878-4224. ; 88, p. 43-52
  • Journal article (peer-reviewed) abstract
    • The current nuclear data uncertainties observed in reactor safety parameters for some nuclides raise safety concerns, especially with respect to the design of GEN-IV reactors, and must therefore be reduced significantly. In this work, uncertainty reduction using criticality benchmark experiments within the Total Monte Carlo methodology is presented. The random nuclear data libraries generated are processed and used to analyze a set of criticality benchmarks. Since the calculated results differ for each random nuclear data library used, an algorithm was used to select (or assign weights to) the libraries which give a good description of experimental data for the analyses of the benchmarks. The selected or weighted libraries were then used to analyze the ELECTRA reactor. By using random nuclear data libraries constrained with only differential experimental data as our prior, the uncertainties observed were further reduced by constraining the files with integral experimental data to obtain a posteriori uncertainties on the k(eff). Two approaches are presented and compared: a binary accept/reject method and a method of assigning file weights based on the likelihood function. Significant reductions in the Pu-239 and Pb-208 nuclear data uncertainties in the k(eff) were observed after implementing the two methods with some criticality benchmarks for the ELECTRA reactor.
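A minimal sketch of the two uncertainty-reduction approaches compared in the abstract above (binary accept/reject versus likelihood-based file weights). The keff values, the benchmark measurement and its uncertainty are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical prior: keff computed with 500 random nuclear data files
# for one benchmark, plus the measured benchmark keff and its uncertainty.
keff = rng.normal(1.000, 0.008, size=500)
keff_exp, sigma_exp = 1.002, 0.003

# Method 1: binary accept/reject -- keep files within ~2 sigma of experiment.
accepted = keff[np.abs(keff - keff_exp) < 2 * sigma_exp]

# Method 2: assign each file a weight from the likelihood function.
w = np.exp(-0.5 * ((keff - keff_exp) / sigma_exp) ** 2)
w /= w.sum()
post_mean = w @ keff
post_std = np.sqrt(w @ (keff - post_mean) ** 2)

# Both posterior spreads are smaller than the prior spread.
print(keff.std(), accepted.std(), post_std)
```

In the papers, the same weights (or the accept/reject selection) are then applied to the keff distribution of the application reactor, reducing its a posteriori nuclear data uncertainty.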
8.
  • Alhassan, Erwin, et al. (authors)
  • Reducing A Priori 239Pu Nuclear Data Uncertainty In The Keff Using A Set Of Criticality Benchmarks With Different Nuclear Data Libraries
  • 2015
  • Conference paper (other academic) abstract
    • In the Total Monte Carlo (TMC) method [1], developed at the Nuclear Research and Consultancy Group for nuclear data uncertainty propagation, model calculations are compared with differential experimental data and a specific a priori uncertainty is assigned to each model parameter. By varying the model parameters all together within the model parameter uncertainties, a full covariance matrix is obtained, with its off-diagonal elements if desired [1]. In this way, differential experimental data serve as a constraint for the model parameters used in the TALYS nuclear reactions code for the production of random nuclear data files. These files are processed into usable formats and used in transport codes for reactor calculations and for uncertainty propagation to reactor macroscopic parameters of interest. Even though differential experimental data together with their uncertainties are included (implicitly) in the production of these random nuclear data files in the TMC method, wide spreads in parameter distributions have been observed, leading to large uncertainties in reactor parameters for some nuclides for the European Lead Cooled Training Reactor [2]. Due to safety concerns and the development of GEN-IV reactors with their challenging technological goals, the present uncertainties should be reduced significantly if the benefits from advances in modelling and simulations are to be utilized fully [3]. In Ref. [4], a binary accept/reject approach and a more rigorous method of assigning file weights based on the likelihood function were proposed and presented for reducing nuclear data uncertainties using a set of integral benchmarks obtained from the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP). These methods depend on the reference nuclear data library used, the combined benchmark uncertainty and the relevance of each benchmark for reducing nuclear data uncertainties for a particular reactor system.
Since each nuclear data library normally comes with its own nominal values and covariance matrices, reactor calculations and uncertainties computed with these libraries differ from library to library. In this work, we apply the binary accept/reject approach and the method of assigning file weights based on the likelihood function to reduce a priori 239Pu nuclear data uncertainties for the European Lead Cooled Training Reactor (ELECTRA) using a set of criticality benchmarks. Prior and posterior uncertainties computed for ELECTRA using ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 are compared after including experimental information from over 10 benchmarks.
[1] A.J. Koning and D. Rochman, Modern Nuclear Data Evaluation with the TALYS Code System. Nuclear Data Sheets 113 (2012) 2841-2934.
[2] E. Alhassan, H. Sjöstrand, P. Helgesson, A. J. Koning, M. Österlund, S. Pomp, D. Rochman, Uncertainty and correlation analysis of lead nuclear data on reactor parameters for the European Lead Cooled Training Reactor (ELECTRA). Annals of Nuclear Energy 75 (2015) 26-37.
[3] G. Palmiotti, M. Salvatores, G. Aliberti, H. Hiruta, R. McKnight, P. Oblozinsky, W. Yang, A global approach to the physics validation of simulation codes for future nuclear systems. Annals of Nuclear Energy 36 (3) (2009) 355-361.
[4] E. Alhassan, H. Sjöstrand, J. Duan, P. Helgesson, S. Pomp, M. Österlund, D. Rochman, A.J. Koning, Selecting benchmarks for reactor calculations. In proc. PHYSOR 2014 - The Role of Reactor Physics toward a Sustainable Future, Kyoto, Japan, Sep. 28 - Oct. 3 (2014).
9.
  • Alhassan, Erwin, et al. (authors)
  • Selecting benchmarks for reactor calculations
  • 2014
  • In: PHYSOR 2014 - The Role of Reactor Physics toward a Sustainable Future.
  • Conference paper (peer-reviewed) abstract
    • Criticality, reactor physics, fusion and shielding benchmarks are expected to play important roles in GEN-IV design, safety analysis and in the validation of analytical tools used to design these reactors. For existing reactor technology, benchmarks are used to validate computer codes and test nuclear data libraries. However, the selection of these benchmarks is usually done by visual inspection, which is dependent on the expertise and experience of the user, thereby introducing a user bias into the process. In this paper we present a method for the selection of these benchmarks for reactor applications based on the Total Monte Carlo (TMC) method. Similarities between an application case and one or several benchmarks are quantified using the correlation coefficient. Based on the method, we also propose an approach for reducing nuclear data uncertainty using integral benchmark experiments as an additional constraint on nuclear reaction models: a binary accept/reject criterion. Finally, the method was applied to a full lead fast reactor core and a set of criticality benchmarks.
10.
  • Alhassan, Erwin, et al. (authors)
  • Selecting benchmarks for reactor simulations : an application to a Lead Fast Reactor
  • 2016
  • In: Annals of Nuclear Energy. - : Elsevier BV. - 0306-4549 .- 1873-2100. ; 96, p. 158-169
  • Journal article (peer-reviewed) abstract
    • For several decades, reactor design has been supported by computer codes for the investigation of reactor behavior under both steady-state and transient conditions. The use of computer codes to simulate reactor behavior enables the investigation of various safety scenarios, saving time and cost. In recent times there has been an increase in the development of in-house (local) codes by various research groups for the preliminary design of specific or targeted nuclear reactor applications. These codes must be validated and calibrated against experimental benchmark data as they evolve and improve. Given the large number of benchmarks available, selecting benchmarks for reactor calculations and for the validation of simulation codes for specific or target applications can be rather tedious and difficult. In the past, the traditional approach based on expert judgement, using information provided in various handbooks, has been used for the selection of these benchmarks. This approach has been criticized because it introduces a user bias into the selection process. This paper presents a method for selecting benchmarks for reactor calculations for specific reactor applications based on the Total Monte Carlo (TMC) method. First, nuclear model parameters are randomly sampled within a given probability distribution and a large set of random nuclear data files is produced using the TALYS code system. These files are processed and used to analyze a target reactor system and a set of criticality benchmarks. Similarity between the target reactor system and one or several benchmarks is quantified using a similarity index. The method has been applied to the European Lead Cooled Training Reactor (ELECTRA) and a set of plutonium- and lead-sensitive criticality benchmarks using the effective multiplication factor (keff). From the study, strong similarities were observed in the keff between ELECTRA and some plutonium- and lead-sensitive criticality benchmarks.
Also, for validation purposes, simulation results for a list of selected criticality benchmarks simulated with the MCNPX and SERPENT codes using different nuclear data libraries have been compared with experimentally measured benchmark keff values.
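The similarity idea in the abstracts above can be illustrated with the Pearson correlation of keff values computed over the same random nuclear data files (the papers also use a dedicated similarity index; this sketch uses plain correlation, and all numbers are synthetic).

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical TMC output: keff computed with the same 300 random nuclear
# data files for a target system and for two benchmarks. A benchmark driven
# by the same nuclear data varies coherently with the target system.
shared = rng.normal(0.0, 0.005, size=300)                 # common ND effect
target = 1.000 + shared + rng.normal(0, 0.001, size=300)
bench_similar = 0.995 + shared + rng.normal(0, 0.001, size=300)
bench_other = 1.010 + rng.normal(0, 0.005, size=300)      # insensitive to it

def similarity(a, b):
    """Pearson correlation of keff values over the random files."""
    return np.corrcoef(a, b)[0, 1]

print(similarity(target, bench_similar))  # close to 1: good validation case
print(similarity(target, bench_other))    # near 0: weak similarity
```

A benchmark with a correlation near one responds to the same nuclear data perturbations as the target reactor, making it a relevant validation and adjustment case.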
11.
  • Alhassan, Erwin, et al. (authors)
  • Uncertainty analysis of Lead cross sections on reactor safety for ELECTRA
  • 2016
  • In: SNA + MC 2013 - Joint International Conference on Supercomputing in Nuclear Applications + Monte Carlo. - Les Ulis, France : EDP Sciences.
  • Conference paper (peer-reviewed) abstract
    • The Total Monte Carlo (TMC) method was used in this study to assess the impact of Pb-206, 207 and 208 nuclear data uncertainties on the k-eff, beta-eff, the coolant temperature coefficient and the coolant void worth for the ELECTRA reactor. Relatively large uncertainties were observed in the k-eff and the coolant void worth for all the isotopes, with a significant contribution coming from Pb-208 nuclear data. The large Pb-208 nuclear data uncertainty observed was further investigated by studying the impact of partial channels on the k-eff and beta-eff. Various sections of the ENDF file: elastic scattering (n,el), inelastic scattering (n,inl), neutron capture (n,gamma), (n,2n), resonance parameters and the angular distribution were varied randomly, and distributions in k-eff and beta-eff were obtained. The dominant contributions to the uncertainty in the k-eff from Pb-208 came from uncertainties in the resonance parameters; however, the elastic scattering cross section and the angular distribution also had a significant impact. The impact of nuclear data uncertainties on the beta-eff was observed to be small.
12.
  • Alhassan, Erwin, et al. (authors)
  • Uncertainty and correlation analysis of lead nuclear data on reactor parameters for the European Lead Cooled Training Reactor
  • 2015
  • In: Annals of Nuclear Energy. - : Elsevier BV. - 0306-4549 .- 1873-2100. ; 75, p. 26-37
  • Journal article (peer-reviewed) abstract
    • The Total Monte Carlo (TMC) method was used in this study to assess the impact of Pb-204, 206, 207, 208 nuclear data uncertainties on reactor safety parameters for the ELECTRA reactor. Relatively large uncertainties were observed in the k-eff and the coolant void worth (CVW) for all isotopes except for Pb-204, with a significant contribution coming from Pb-208 nuclear data; the dominant effect came from uncertainties in the resonance parameters; however, the elastic scattering cross section and the angular distributions also had a significant impact. It was also observed that the k-eff distribution for Pb-206, 207, 208 deviates from a Gaussian distribution, with tails in the high k-eff region. An uncertainty of 0.9% on the k-eff and 3.3% on the CVW due to lead nuclear data was obtained. As part of the work, cross section-reactor parameter correlations were also studied using a Monte Carlo sensitivity method. Strong correlations were observed between the k-eff and the (n,el) cross section for all the lead isotopes. The correlation between the (n,inl) and the k-eff was also found to be significant.
13.
  • Helgesson, Petter, 1986-, et al. (authors)
  • Combining Total Monte Carlo and Unified Monte Carlo : Bayesian nuclear data uncertainty quantification from auto-generated experimental covariances
  • 2017
  • In: Progress in nuclear energy (New series). - : Elsevier. - 0149-1970 .- 1878-4224. ; 96, p. 76-96
  • Journal article (peer-reviewed) abstract
    • The Total Monte Carlo methodology (TMC) for nuclear data (ND) uncertainty propagation has been subject to some critique because the nuclear reaction parameters are sampled from distributions which have not been rigorously determined from experimental data. In this study, it is thoroughly explained how TMC and Unified Monte Carlo-B (UMC-B) are combined to include experimental data in TMC. Random ND files are weighted with likelihood function values computed by comparing the ND files to experimental data, using experimental covariance matrices generated from information in the experimental database EXFOR and a set of simple rules. A proof that such weights give a consistent implementation of Bayes' theorem is provided. The impact of the weights is mainly studied for a set of integral systems/applications, e.g., a set of shielding fuel assemblies which shall prevent aging of the pressure vessels of the Swedish nuclear reactors Ringhals 3 and 4. In this implementation, the impact from the weighting is small for many of the applications. In some cases, this can be explained by the fact that the distributions used as priors are too narrow to be valid as such. Another possible explanation is that the integral systems are highly sensitive to resonance parameters, which effectively are not treated in this work. In other cases, only a very small number of files get significantly large weights, i.e., the region of interest is poorly resolved. This convergence issue can be due to the parameter distributions used as priors or to model defects, for example. Further, some parameters used in the rules for the EXFOR interpretation have been varied. The observed impact from varying one parameter at a time is not very strong. This can partially be due to the general insensitivity to the weights seen for many applications, and there can be strong interaction effects.
The automatic treatment of outliers has a quite large impact, however. To approach more justified ND uncertainties, the rules for the EXFOR interpretation shall be further discussed and developed, in particular the rules for rejecting outliers, and random ND files that are intended to describe prior distributions shall be generated. Further, model defects need to be treated.
15.
  • Helgesson, Petter, 1986-, et al. (authors)
  • New 59Ni data including uncertainties and consequences for gas production in steel in LWR spectra
  • 2015
  • Conference paper (other academic) abstract
    • With ageing reactor fleets, the importance of estimating material damage parameters in structural materials is increasing. 59Ni is not naturally abundant, but as noted in, e.g., Ref. [1], the two-step reaction 58Ni(n,γ)59Ni(n,α)56Fe gives a very important contribution to the helium production and damage energy in stainless steel in thermal spectra, because of the extraordinarily large thermal (n,α) cross section of 59Ni (for most other nuclides, the (n,α) reaction has a threshold). None of the evaluated data libraries contains uncertainty information for (n,α) and (n,p) for 59Ni for thermal energies and the resonance region. Therefore, new such data are produced in this work, including random data to be used with the Total Monte Carlo methodology [2] for nuclear data uncertainty propagation. The limited R-matrix format ("LRF = 7") of ENDF-6 is used, with the Reich-Moore approximation ("LRF = 3" is just a subset of Reich-Moore). The neutron and gamma widths are obtained from TARES [2], with uncertainties, and are translated into LRF = 7. The α and proton widths are obtained from the little information available in EXFOR [3] (assuming large uncertainties because of lacking documentation) or from sampling from unresolved resonance parameters from TALYS [2], and they are split into different channels (different excited states of the recoiling nuclide, etc.). Finally, the cross sections are adjusted to match the experiments at thermal energies, with uncertainties. The data are used to estimate the gas production rates for different systems, including the propagated nuclear data uncertainty. Preliminary results for SS304 in a typical thermal spectrum show that including 59Ni at its peak concentration increases the helium production rate by a factor of 4.93 ± 0.28, including a 5.7 ± 0.2 % uncertainty due to the 59Ni data. It is, however, likely that the uncertainty will increase substantially from including the uncertainty of other nuclides and from re-evaluating the experimental thermal cross sections.
17.
  • Helgesson, Petter, 1986-, et al. (authors)
  • Sampling of systematic errors to estimate likelihood weights in nuclear data uncertainty propagation
  • 2016
  • In: Nuclear Instruments and Methods in Physics Research Section A. - : Elsevier. - 0168-9002 .- 1872-9576. ; 807, p. 137-149
  • Journal article (peer-reviewed) abstract
    • In methodologies for nuclear data (ND) uncertainty assessment and propagation based on random sampling, likelihood weights can be used to infer experimental information into the distributions for the ND. As the included number of correlated experimental points grows large, the computational time for the matrix inversion involved in obtaining the likelihood can become a practical problem. There are also other problems related to the conventional computation of the likelihood, e.g., the assumption that all experimental uncertainties are Gaussian. In this study, a way to estimate the likelihood which avoids matrix inversion is investigated; instead, the experimental correlations are included by sampling of systematic errors. It is shown that the model underlying the sampling methodology (using univariate normal distributions for random and systematic errors) implies a multivariate Gaussian for the experimental points (i.e., the conventional model). It is also shown that the likelihood estimates obtained through sampling of systematic errors approach the likelihood obtained with matrix inversion as the sample size for the systematic errors grows large. In the practical cases studied, however, the estimates for the likelihood weights converge impractically slowly with the sample size, compared to matrix inversion, and the computational time is estimated to be greater than for matrix inversion also in cases with more experimental points. Hence, the sampling of systematic errors has little potential to compete with matrix inversion in cases where the latter is applicable. Nevertheless, the underlying model and the likelihood estimates can be easier to interpret intuitively than the conventional model and the likelihood function involving the inverted covariance matrix. Therefore, this work can both have pedagogical value and help motivate the conventional assumption of a multivariate Gaussian for experimental data. The sampling of systematic errors could also be used in cases where the experimental uncertainties are not Gaussian, and for other purposes than computing the likelihood, e.g., to produce random experimental data sets for more direct use in ND evaluation.
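The central point of the abstract above - that sampling systematic errors reproduces the matrix-inversion likelihood in the large-sample limit - can be demonstrated on a toy case with one fully correlated systematic error. All numbers are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: model prediction vs experiment at 4 points, with an
# uncorrelated random uncertainty and one fully correlated systematic error.
model = np.array([1.00, 1.10, 1.20, 1.30])
exp = np.array([1.02, 1.13, 1.21, 1.34])
sig_rand, sig_sys = 0.02, 0.03

# Conventional likelihood: multivariate Gaussian with matrix inversion.
# A fully correlated systematic error adds sig_sys**2 to every matrix element.
cov = np.diag(np.full(4, sig_rand**2)) + sig_sys**2
d = exp - model
L_matrix = np.exp(-0.5 * d @ np.linalg.solve(cov, d)) / np.sqrt(
    (2 * np.pi) ** 4 * np.linalg.det(cov))

# Estimate by sampling the systematic error: conditional on a sampled shift,
# the remaining errors are independent Gaussians; average over the samples.
shifts = rng.normal(0.0, sig_sys, size=200_000)
resid = d[None, :] - shifts[:, None]          # the shift applies to all points
L_each = np.prod(
    np.exp(-0.5 * (resid / sig_rand) ** 2) / (sig_rand * np.sqrt(2 * np.pi)),
    axis=1)
L_sampled = L_each.mean()

print(L_matrix, L_sampled)  # the two estimates agree as the sample grows
```

The paper's observation is that for realistic numbers of correlated points, this sampling estimator converges too slowly to be competitive, even though it is consistent.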
18.
  • Helgesson, Petter, 1986-, et al. (authors)
  • Towards Transparent, Reproducible and Justified Nuclear Data Uncertainty Propagation for LWR Applications
  • 2015
  • Conference paper (other academic) abstract
    • Any calculated quantity is practically meaningless without estimates of the uncertainty of the obtained results, not least when it comes to, e.g., safety parameters in a nuclear reactor. One of the sources of uncertainty in reactor physics computations or simulations is the uncertainty of the so-called nuclear data, i.e., cross sections, angular distributions, fission yields, etc. The currently dominating method for propagating nuclear data uncertainties (using covariance data and sensitivity analysis) suffers from several limitations, not least in how the covariance data is produced – the production relies to a large extent on the personal judgment of nuclear data evaluators, leading to results which are difficult to reproduce from fundamental principles. Further, such a method assumes linearity, in practice limits both input and output to be modeled as Gaussian distributions, and the covariance data in the established nuclear data libraries is incomplete. "Total Monte Carlo" (TMC) is a nuclear data uncertainty propagation method based on random sampling of nuclear reaction model parameters which aims to resolve these issues. The method has been applied to various applications, ranging from pin cells and criticality safety benchmarks to full-core neutronics as well as models including thermo-hydraulics and transients. However, TMC has been subject to some critique since the distributions of the nuclear model parameters, and hence of the nuclear data, have not been deduced from rigorous statistical theory. This presentation briefly discusses the ongoing work on how to use experimental data to approach justified results from TMC, including the effects of correlations between experimental data points and the assessment of such correlations.
In this study, the random nuclear data libraries are provided with likelihood weights based on their agreement with the experimental data, as a means to implement Bayes' theorem. Further, it is presented how TMC is applied to an MCNP-6 model of shielding fuel assemblies (SFA) at Ringhals 3 and 4. Since damage from the fast neutron flux may limit the lifetimes of these reactors, parts of the fuel adjacent to the pressure vessel are replaced by steel (the SFA) to protect the vessel, in particular the four points along the belt-line weld which have been exposed to the largest fluence over time. The 56Fe data uncertainties are considered, and the estimated relative uncertainty over a quarter of the pressure vessel is shown in Figure 1 (right), together with the flux pattern itself (left). The uncertainty in the flux reduction at a selected sensitive point is 2.5 ± 0.2 % (one standard deviation). Applying the likelihood weights does not have much impact in this case, which could indicate that the prior distribution for the 56Fe data is too "narrow" (the libraries used are not really intended to describe a prior distribution), and that the true uncertainty is substantially greater. Another explanation could be that the dominating source of uncertainty is the high-energy resonances, which are treated inefficiently by such weights. In either case, the efforts to approach justified, transparent, reproducible and highly automatized nuclear data uncertainties shall continue. On top of using libraries that are intended to describe prior distributions and treating the resonance region appropriately, the experimental correlations should be better motivated and the treatment of outliers improved. Finally, it is probably necessary to use experimental data in a more direct sense where a lot of experimental data is available, since the nuclear models are imperfect.
Figure 1. The high-energy neutron flux at the reactor pressure vessel in the SFA model, and the corresponding propagated 56Fe data uncertainty.
  •  
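The likelihood-weighting step described in the abstract above can be sketched in a few lines. This is a minimal illustration, assuming a chi-square measure of each random library's agreement with experiment; the function names are hypothetical, not from the paper:

```python
import numpy as np

def likelihood_weights(chi2):
    """Bayes-inspired weights w_i proportional to exp(-chi2_i / 2) for each
    random nuclear data library i, normalized to sum to one."""
    w = np.exp(-0.5 * (chi2 - chi2.min()))  # shift by the minimum for numerical stability
    return w / w.sum()

def weighted_mean_std(values, weights):
    """Weighted mean and standard deviation of a TMC output quantity."""
    mean = np.sum(weights * values)
    std = np.sqrt(np.sum(weights * (values - mean) ** 2))
    return mean, std
```

When all libraries agree with experiment equally well (equal chi-square), the weights reduce to 1/N and the plain unweighted TMC statistics are recovered, consistent with the observation above that the weights had little impact for the 56Fe case.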
19.
  • Koning, Arjan, et al. (författare)
  • TENDL : Complete Nuclear Data Library for Innovative Nuclear Science and Technology
  • 2019
  • Ingår i: Nuclear Data Sheets. - : ACADEMIC PRESS INC ELSEVIER SCIENCE. - 0090-3752 .- 1095-9904. ; 155, s. 1-55
  • Tidskriftsartikel (refereegranskat)abstract
    • The TENDL library is now established as one of the major nuclear data libraries in the world, striving for completeness and quality of nuclear data files for all isotopes, evaluation methods, processing and applied performance. To reach this status, some basic principles have been applied which set it apart from other libraries: reproducible dedicated evaluations when differential data are available, through determination of the nuclear models implemented in TALYS and their parameters; completeness (with or without experimental data); format and processing standardization; and automation of production and reproducibility. In this paper, we outline how such an approach has become a reality, and recall some of the past successes since the first TENDL release in 2008. Next, we demonstrate the performance of the latest TENDL releases for different application fields, as well as new approaches for uncertainty quantification based on Bayesian inference methods and possible differential and integral adjustments. Current limitations of the library's performance due to modelling, and needs for new and more precise experimental data, are also outlined.
  •  
20.
  • Neudecker, Denise, et al. (författare)
  • Templates of expected measurement uncertainties
  • 2023
  • Ingår i: EPJ NUCLEAR SCIENCES & TECHNOLOGIES. - : EDP Sciences. - 2491-9292. ; 9
  • Tidskriftsartikel (refereegranskat)abstract
    • The covariance committee of CSEWG (Cross Section Evaluation Working Group) established templates of expected measurement uncertainties for neutron-induced total, (n,gamma), neutron-induced charged-particle, and (n,xn) reaction cross sections as well as prompt fission neutron spectra, average prompt and total fission neutron multiplicities, and fission yields. Templates provide a list of what uncertainty sources are expected for each measurement type and observable, and suggest typical ranges of these uncertainties and correlations based on a survey of experimental data, associated literature, and feedback from experimenters. Information needed to faithfully include the experimental data in the nuclear-data evaluation process is also provided. These templates could assist (a) experimenters and EXFOR compilers in delivering more complete uncertainties and measurement information relevant for evaluations of new experimental data, and (b) evaluators in achieving a more comprehensive uncertainty quantification for evaluation purposes. This effort might ultimately lead to more realistic evaluated covariances for nuclear-data applications. In this topical issue, we cover the templates coming out of this CSEWG effort, typically one observable per paper. This paper prefaces the topical issue by introducing the concept and mathematical framework of templates, discussing potential use cases, giving an example of how they can be applied (estimating missing experimental uncertainties of 235U(n,f) average prompt fission neutron multiplicities), and discussing their impact on nuclear-data evaluations.
  •  
21.
  • Pomp, Stephan, et al. (författare)
  • Experiments and Theoretical Data for Studying the Impact of Fission Yield Uncertainties on the Nuclear Fuel Cycle with TALYS/GEF and the Total Monte Carlo Method
  • 2015
  • Ingår i: Nuclear Data Sheets. - : Elsevier BV. - 0090-3752 .- 1095-9904. ; 123:SI, s. 220-224
  • Tidskriftsartikel (refereegranskat)abstract
    • We describe the research program of the nuclear reactions research group at Uppsala University concerning experimental and theoretical efforts to quantify and reduce nuclear data uncertainties relevant for the nuclear fuel cycle. We briefly describe the Total Monte Carlo (TMC) methodology and how it can be used to study fuel cycle and accident scenarios, and summarize our relevant experimental activities. Input from the latter is to be used to guide the nuclear models and constrain parameter space for TMC. The TMC method relies on the availability of good nuclear models. For this we use the TALYS code which is currently being extended to include the GEF model for the fission channel. We present results from TALYS-1.6 using different versions of GEF with both default and randomized input parameters and compare calculations with experimental data for U-234(n,f) in the fast energy range. These preliminary studies reveal some systematic differences between experimental data and calculations but give overall good and promising results.
  •  
22.
  • Rochman, D., et al. (författare)
  • Nuclear data uncertainty for criticality-safety : Monte Carlo vs. linear perturbation
  • 2016
  • Ingår i: Annals of Nuclear Energy. - : Elsevier BV. - 0306-4549 .- 1873-2100. ; 92, s. 150-160
  • Tidskriftsartikel (refereegranskat)abstract
    • This work presents a comparison of results for different methods of uncertainty propagation due to nuclear data for 330 criticality-safety benchmarks. Covariance information is propagated to keff using either Monte Carlo methods (NUSS: based on existing nuclear data covariances, and TMC: based on reaction model parameters) or sensitivity calculations from MCNP6 coupled with nuclear data covariances. We show that all three methods are globally equivalent for criticality calculations considering the first two moments of a distribution (average and standard deviation), but the Monte Carlo methods lead to actual probability distributions, where the third moment (skewness) should not be ignored for safety assessments.
  •  
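The point about the third moment can be made concrete with a short sketch: given a sample of keff values from a Monte Carlo propagation (a toy array below, not benchmark data), the first two moments and the skewness are estimated as:

```python
import numpy as np

def sample_moments(keff):
    """Mean, standard deviation (ddof=1) and standardized sample skewness
    of a k_eff distribution obtained from Monte Carlo propagation."""
    mu = keff.mean()
    sigma = keff.std(ddof=1)
    skew = np.mean(((keff - mu) / sigma) ** 3)
    return mu, sigma, skew
```

A linearized (sandwich-rule) propagation reports only the mean and standard deviation; a nonzero skewness signals an asymmetric distribution that a Gaussian summary would miss.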
23.
  • Rochman, D., et al. (författare)
  • Radiative neutron capture : Hauser Feshbach vs. statistical resonances
  • 2017
  • Ingår i: Physics Letters B. - : ELSEVIER SCIENCE BV. - 0370-2693 .- 1873-2445. ; 764, s. 109-113
  • Tidskriftsartikel (refereegranskat)abstract
    • The radiative neutron capture rates for isotopes of astrophysical interest are commonly calculated on the basis of the statistical Hauser Feshbach (HF) reaction model, leading to smooth and monotonically varying temperature-dependent Maxwellian-averaged cross sections (MACS). The HF approximation is known to be valid if the number of resonances in the compound system is relatively high. However, such a condition is hardly fulfilled for keV neutrons captured on light or exotic neutron-rich nuclei. For this reason, a different procedure is proposed here, based on the generation of statistical resonances. This novel technique, called the "High Fidelity Resonance" (HFR) method, is shown to provide results similar to those of the HF approach for nuclei with a high level density, but to deviate from, and be more realistic than, HF predictions for light and neutron-rich nuclei or at relatively low sub-keV energies. The MACS derived with the HFR method are systematically compared with traditional HF calculations for some 3300 neutron-rich nuclei and shown to give significantly larger predictions than the HF approach at energies of astrophysical relevance. For this reason, the HF approach should not be applied to light or neutron-rich nuclei. The Doppler broadening of the generated resonances is also studied and found to have a negligible impact on the calculated MACS.
  •  
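The Maxwellian-averaged cross section discussed above is the standard quantity MACS(kT) = (2/√π) (kT)⁻² ∫ σ(E) E exp(−E/kT) dE. A minimal numerical sketch on a pointwise grid (illustrative only, not the HFR method itself):

```python
import numpy as np

def macs(energy, sigma, kT):
    """Maxwellian-averaged cross section on a pointwise (energy, sigma) grid:
    MACS(kT) = (2/sqrt(pi)) * (kT)**-2 * integral of sigma(E) * E * exp(-E/kT) dE.
    `energy` and `kT` share units; the result has the units of `sigma`."""
    integrand = sigma * energy * np.exp(-energy / kT)
    # trapezoidal rule written out explicitly, to avoid version-specific numpy helpers
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(energy))
    return (2.0 / np.sqrt(np.pi)) * integral / kT**2
```

For a constant cross section the integral evaluates to (kT)², so the MACS reduces to (2/√π)·σ, which is a convenient sanity check for any implementation.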
24.
  • Rochman, D., et al. (författare)
  • Re-evaluation of the thermal neutron capture cross section of Nd-147
  • 2016
  • Ingår i: Annals of Nuclear Energy. - : Elsevier BV. - 0306-4549 .- 1873-2100. ; 94, s. 612-617
  • Tidskriftsartikel (refereegranskat)abstract
    • In this paper we propose a re-evaluation of the thermal-neutron-induced capture cross section of Nd-147. A unique measurement exists, from which this cross section was calculated in 1974. That original calculation is based on an assumed value for a specific gamma-ray fraction (called F-2), taken from the neighboring nucleus Nd-145. With the availability of reaction codes such as TALYS, such a fraction can nowadays be calculated using specific reaction models and parameters. The new value of F-2 indicates a decrease of the thermal cross section by 45%, leading to 243 barns instead of the 440 barns previously reported. This new cross section impacts the calculation of the number density for the well-known burn-up indicator Nd-148, but, as shown, the change is close to the usual experimental uncertainties for the Nd-148 number densities, thus having a limited impact on burn-up calculations.
  •  
25.
  •  
26.
  • Sjöstrand, Henrik, et al. (författare)
  • Propagation of nuclear data uncertainties for ELECTRA burn-up calculations
  • 2014
  • Ingår i: Nuclear Data Sheets. - : Elsevier BV. - 0090-3752 .- 1095-9904. ; 118, s. 527-530
  • Tidskriftsartikel (refereegranskat)abstract
    • The European Lead-Cooled Training Reactor (ELECTRA) has been proposed as a training reactor for fast systems within the Swedish nuclear program. It is a low-power fast reactor cooled by pure liquid lead. In this work, we propagate the uncertainties in 239Pu transport data to uncertainties in the fuel inventory of ELECTRA during the reactor life using the Total Monte Carlo approach (TMC). Within the TENDL project the nuclear model input parameters were randomized within their uncertainties and 740 239Pu nuclear data libraries were generated. These libraries are used as inputs to reactor codes, in our case SERPENT, to perform uncertainty analysis of the nuclear reactor inventory during burn-up. The uncertainty in the inventory determines uncertainties in: the long-term radiotoxicity, the decay heat, the evolution of reactivity parameters, gas pressure and volatile fission product content. In this work, a methodology called fast TMC is utilized, which reduces the overall calculation time. The uncertainties in the long-term radiotoxicity, decay heat, gas pressure and volatile fission products were found to be insignificant. However, the uncertainties of some minor actinides were observed to be rather large, and therefore their impact on multiple recycling should be investigated further. It was also found that criticality benchmarks can be used to reduce inventory uncertainties due to nuclear data. Further studies are needed to include fission yield uncertainties, more isotopes, and a larger set of benchmarks.
  •  
27.
  • Sjöstrand, Henrik, 1978-, et al. (författare)
  • Propagation Of Nuclear Data Uncertainties For Fusion Power Measurements
  • 2016
  • Konferensbidrag (refereegranskat)abstract
    • Fusion plasmas produce neutrons, and by measuring the neutron emission the fusion power can be inferred. Accurate neutron yield measurements are paramount for the safe and efficient operation of fusion experiments and, eventually, fusion power plants. Neutron measurements are an essential part of the diagnostic system at large fusion machines such as JET and ITER. At JET, a system of activation foils provides the absolute calibration for the neutron yield determination. The activation system uses the property of certain nuclei to emit radiation after being excited by neutron reactions. A sample of suitable nuclei is placed in the neutron flux close to the plasma, and after irradiation the induced radiation is measured. Knowing the neutron activation cross section, one can calculate the time-integrated neutron flux at the sample position. To relate the local flux to the total neutron yield, the spatial flux response has to be identified; this describes how the local neutron emission affects the flux at the detector. The required spatial flux response is commonly determined using neutron transport codes, e.g., MCNP. Nuclear data is used as input both in the calculation of the spatial flux response and when the flux at the irradiation site is inferred. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are generally not fully taken into account in today's uncertainty analysis for neutron yield calibrations using activation foils. In this paper, the neutron yield uncertainty due to nuclear data is investigated using the so-called Total Monte Carlo method. The work is performed using a detailed MCNP model of the JET fusion machine. The uncertainties due to the cross sections and angular distributions in JET structural materials, as well as the activation cross sections, are analyzed.
It is shown that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
  •  
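The flux-inference step described in the abstract above (measured activity → time-integrated flux) can be sketched under a constant-flux irradiation assumption; the function names and numbers are illustrative, not from the paper:

```python
import numpy as np

def end_of_irradiation_activity(phi, n_atoms, xs_cm2, decay_const, t_irr):
    """Constant-flux activation: A(t_irr) = N * sigma * phi * (1 - exp(-lambda * t_irr)),
    with N target atoms, sigma in cm^2, phi in n/cm^2/s, lambda in 1/s."""
    return n_atoms * xs_cm2 * phi * (1.0 - np.exp(-decay_const * t_irr))

def inferred_flux(activity, n_atoms, xs_cm2, decay_const, t_irr):
    """Invert the activation equation to recover the (constant) flux from
    the measured end-of-irradiation activity."""
    return activity / (n_atoms * xs_cm2 * (1.0 - np.exp(-decay_const * t_irr)))
```

Because the activation cross section enters the denominator, its relative uncertainty maps directly into the inferred flux, which is one reason the nuclear data uncertainties discussed above matter for the yield calibration.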
28.
  • Sjöstrand, Henrik, et al. (författare)
  • Total Monte Carlo evaluation for dose calculations
  • 2014
  • Ingår i: Radiation Protection Dosimetry. - : Oxford University Press (OUP). - 0144-8420 .- 1742-3406. ; 161:1-4, s. 312-315
  • Tidskriftsartikel (refereegranskat)abstract
    • Total Monte Carlo (TMC) is a method to propagate nuclear data (ND) uncertainties in transport codes, by using a large set of ND files, which covers the ND uncertainty. The transport code is run multiple times, each time with a unique ND file, and the result is a distribution of the investigated parameter, e.g. dose, where the width of the distribution is interpreted as the uncertainty due to ND. Until recently, this was computer intensive, but with a new development, fast TMC, more applications are accessible. The aim of this work is to test the fast TMC methodology on a dosimetry application and to propagate the 56Fe uncertainties on the predictions of the dose outside a proposed 14-MeV neutron facility. The uncertainty was found to be 4.2 %. This can be considered small; however, this cannot be generalised to all dosimetry applications and so ND uncertainties should routinely be included in most dosimetry modelling.
  •  
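The TMC procedure summarized above (one transport run per random nuclear data file, with the spread of the output read as the ND uncertainty) can be sketched with a toy stand-in for the transport code. The 4.2 % figure in the abstract comes from the real MCNP model; the numbers below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2014)

def run_transport(library_index):
    """Stand-in for one transport-code run (e.g. MCNP) using random nuclear
    data library `library_index`; returns a toy dose value."""
    return 1.0 + 0.042 * rng.standard_normal()

# One run per random library; the width of the resulting distribution is
# interpreted as the uncertainty due to nuclear data.
doses = np.array([run_transport(i) for i in range(1000)])
rel_unc = doses.std(ddof=1) / doses.mean()
```

In fast TMC the statistical (transport) noise of each individual run is additionally separated from the spread caused by the nuclear data, which is what makes the method affordable for applications like this one.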