SwePub
Search the SwePub database


Result list for the search "WFRF:(Uhlmann Eric)"


  • Result 1-10 of 18
1.
  • Huntington-Klein, Nick, et al. (author)
  • Subjective evidence evaluation survey for many-analysts studies
  • 2024
  • In: Royal Society Open Science. - : The Royal Society. - 2054-5703. ; 11:7
  • Journal article (peer-reviewed)
    • Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g. effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous many-analysts study.
2.
  • Maccari, Maria Elena, et al. (author)
  • Activated phosphoinositide 3-kinase δ syndrome: Update from the ESID Registry and comparison with other autoimmune-lymphoproliferative inborn errors of immunity.
  • 2023
  • In: The Journal of allergy and clinical immunology. - 1097-6825. ; 152:4
  • Journal article (peer-reviewed)
    • Activated phosphoinositide-3-kinase δ syndrome (APDS) is an inborn error of immunity (IEI) with infection susceptibility and immune dysregulation, clinically overlapping with other conditions. Management depends on disease evolution, but predictors of severe disease are lacking. This study sought to report the extended spectrum of disease manifestations in APDS1 versus APDS2; compare these to CTLA4 deficiency, NFKB1 deficiency, and STAT3 gain-of-function (GOF) disease; and identify predictors of severity in APDS. Data were collected from the ESID (European Society for Immunodeficiencies) APDS registry and compared with published cohorts of the other IEIs. The analysis of 170 patients with APDS outlines high penetrance and early onset of APDS compared to the other IEIs. The large clinical heterogeneity even in individuals with the same PIK3CD variant E1021K illustrates how poorly the genotype predicts the disease phenotype and course. The high clinical overlap between APDS and the other investigated IEIs suggests relevant pathophysiological convergence of the affected pathways. Preferentially affected organ systems indicate specific pathophysiology: bronchiectasis is typical of APDS1; interstitial lung disease and enteropathy are more common in STAT3 GOF and CTLA4 deficiency. Endocrinopathies are most frequent in STAT3 GOF, but growth impairment is also common, particularly in APDS2. Early clinical presentation is a risk factor for severe disease in APDS. APDS illustrates how a single genetic variant can result in a diverse autoimmune-lymphoproliferative phenotype. Overlap with other IEIs is substantial. Some specific features distinguish APDS1 from APDS2. Early onset is a risk factor for a severe disease course, calling for specific treatment studies in younger patients.
3.
  • Silberzahn, Raphael, et al. (author)
  • Many analysts, one dataset : Making transparent how variations in analytical choices affect results
  • 2018
  • In: Advances in Methods and Practices in Psychological Science. - : Sage Publications. - 2515-2459 .- 2515-2467. ; 1:3, s. 337-356
  • Journal article (peer-reviewed)
    • Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark skin toned players than light skin toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged from 0.89 to 2.93 in odds ratio units, with a median of 1.31. Twenty teams (69%) found a statistically significant positive effect and nine teams (31%) observed a non-significant relationship. Overall, the 29 different analyses used 21 unique combinations of covariates. We found that neither analysts' prior beliefs about the effect, nor their level of expertise, nor peer-reviewed quality of analysis readily explained variation in analysis outcomes. This suggests that significant variation in the results of analyses of complex data may be difficult to avoid, even by experts with honest intentions. Crowdsourcing data analysis, a strategy by which numerous research teams are recruited to simultaneously investigate the same research question, makes transparent how defensible, yet subjective analytic choices influence research results.
4.
  • Uhlmann, Eric L., et al. (author)
  • Subjective Evidence Evaluation Survey For Multi-Analyst Studies
  • 2024
  • Other publication (other academic/artistic)
    • Multi-analyst studies explore how well an empirical claim withstands plausible alternative analyses of the same data set by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g., effect size) provided by each analysis team. Although informative about the range of plausible effects in a data set, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item Subjective Evidence Evaluation Survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous multi-analyst study.
5.
  • Aczel, Balazs, et al. (author)
  • Consensus-based guidance for conducting and reporting multi-analyst studies
  • 2021
  • In: eLIFE. - : eLife Sciences Publications. - 2050-084X. ; 10
  • Journal article (peer-reviewed)
    • Any large dataset can be analyzed in a number of ways, and it is possible that the use of different analysis strategies will lead to different results and conclusions. One way to assess whether the results obtained depend on the analysis strategy chosen is to employ multiple analysts and leave each of them free to follow their own approach. Here, we present consensus-based guidance for conducting and reporting such multi-analyst studies, and we discuss how broader adoption of the multi-analyst approach has the potential to strengthen the robustness of results and conclusions obtained from analyses of datasets in basic and applied research.
6.
  • Graham, Jesse R., et al. (author)
  • The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline
  • 2016
  • In: Journal of Experimental Social Psychology. - : Elsevier. - 1096-0465 .- 0022-1031. ; 66, s. 55-67
  • Journal article (peer-reviewed)
    • This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects which the last author and his collaborators had “in the pipeline” as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to find support. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.
7.
  • Schweinsberg, Martin, et al. (author)
  • Same data, different conclusions : Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis
  • 2021
  • In: Organizational Behavior and Human Decision Processes. - : Elsevier BV. - 0749-5978 .- 1095-9920. ; 165, s. 228-249
  • Journal article (peer-reviewed)
    • In this crowdsourced initiative, independent analysts used the same dataset to test two hypotheses regarding the effects of scientists' gender and professional status on verbosity during group meetings. Not only the analytic approach but also the operationalizations of key variables were left unconstrained and up to individual analysts. For instance, analysts could choose to operationalize status as job title, institutional ranking, citation counts, or some combination. To maximize transparency regarding the process by which analytic choices are made, the analysts used a platform we developed called DataExplained to justify both preferred and rejected analytic paths in real time. Analyses lacking sufficient detail, reproducible code, or with statistical errors were excluded, resulting in 29 analyses in the final sample. Researchers reported radically different analyses and dispersed empirical outcomes, in a number of cases obtaining significant effects in opposite directions for the same research question. A Boba multiverse analysis demonstrates that decisions about how to operationalize variables explain variability in outcomes above and beyond statistical choices (e.g., covariates). Subjective researcher decisions play a critical role in driving the reported empirical results, underscoring the need for open data, systematic robustness checks, and transparency regarding both analytic paths taken and not taken. Implications for organizations and leaders, whose decision making relies in part on scientific findings, consulting reports, and internal analyses by data scientists, are discussed.
8.
  • Washburn, Anthony N., et al. (author)
  • Data from a pre-publication independent replication initiative examining ten moral judgement effects
  • 2016
  • In: Scientific Data. - : Nature Research / Nature Publishing Group. - 2052-4463. ; 3
  • Journal article (peer-reviewed)
    • We present the data from a crowdsourced project seeking to replicate findings in independent laboratories before (rather than after) they are published. In this Pre-Publication Independent Replication (PPIR) initiative, 25 research groups attempted to replicate 10 moral judgment effects from a single laboratory's research pipeline of unpublished findings. The 10 effects were investigated using online/lab surveys containing psychological manipulations (vignettes) followed by questionnaires.
9.
  • Cyrus-Lai, Wilson, et al. (author)
  • Avoiding Bias in the Search for Implicit Bias
  • 2022
  • In: Psychological Inquiry. - : Taylor & Francis Ltd. - 1532-7965 .- 1047-840X. ; 33:3, s. 203-212
  • Journal article (peer-reviewed)
    • To revitalize the study of unconscious bias, Gawronski, Ledgerwood, and Eastwick (this issue) propose a paradigm shift away from implicit measures of inter-group attitudes and beliefs. Specifically, researchers should capture discriminatory biases and demonstrate that participants are unaware of the influence of social category cues on their judgments and actions. Individual differences in scores on implicit measures will be useful to predict and better understand implicitly prejudiced behaviors, but the latter should be the collective focus of researchers interested in unconscious biases against social groups.
10.
  • Dreber Almenberg, Anna, et al. (author)
  • Is research in social psychology politically biased? Systematic empirical tests and a forecasting survey to address the controversy
  • 2018
  • In: Journal of Experimental Social Psychology. - : Elsevier. - 1096-0465 .- 0022-1031. ; 79:November, s. 188-199
  • Journal article (peer-reviewed)
    • The present investigation provides the first systematic empirical tests for the role of politics in academic research. In a large sample of scientific abstracts from the field of social psychology, we find both evaluative differences, such that conservatives are described more negatively than liberals, and explanatory differences, such that conservatism is more likely to be the focus of explanation than liberalism. In light of the ongoing debate about politicized science, a forecasting survey permitted scientists to state a priori empirical predictions about the results, and then change their beliefs in light of the evidence. Participating scientists accurately predicted the direction of both the evaluative and explanatory differences, but at the same time significantly overestimated both effect sizes. Scientists also updated their broader beliefs about political bias in response to the empirical results, providing a model for addressing divisive scientific controversies across fields.
Type of publication
journal article (14)
other publication (4)
Type of content
peer-reviewed (14)
other academic/artistic (4)
Author/Editor
Uhlmann, Eric Luis (11)
Johannesson, Magnus (9)
Dreber Almenberg, An ... (7)
Viganola, Domenico (7)
Inbar, Yoel (7)
Schweinsberg, Martin (6)
Pfeiffer, Thomas (6)
Nilsonne, Gustav (5)
Eitan, Orly (5)
Thau, Stefan (5)
Silberzahn, Raphael (4)
Wagenmakers, Eric-Ja ... (4)
Schaerer, Michael (4)
Aczel, Balazs (3)
Szaszi, Barnabas (3)
Albers, Casper J. (3)
Botvinik-Nezer, Rote ... (3)
Busch, Niko A. (3)
Cataldo, Andrea M. (3)
van Dongen, Noah N. ... (3)
Hoekstra, Rink (3)
Holzmeister, Felix (3)
Kirchler, Michael (3)
Matzke, Dora (3)
van Ravenzwaaij, Don (3)
Sarafoglou, Alexandr ... (3)
Simons, Daniel J. (3)
Spellman, Barbara A. (3)
Wicherts, Jelte (3)
Cheung, Felix (3)
Vianello, Michelange ... (3)
Dreber, Anna (3)
Warren, Tierney (3)
Plessis, Christilene ... (3)
van den Akker, Olmo ... (2)
Hoffmann, Sabine (2)
Mangin, Jean-Francoi ... (2)
Bahník, Štěpán (2)
Vanpaemel, Wolf (2)
Carlsson, Rickard, 1 ... (2)
Van Bavel, Jay J. (2)
Wetter, Erik (2)
Wagenmakers, Eric Ja ... (2)
Huber, Jürgen (2)
Qureshi, Israr (2)
Sokolova, Tatiana (2)
Cyrus-Lai, Wilson (2)
Clemente, Elena Giul ... (2)
Cushman, Fiery A. (2)
Storage, Daniel (2)
University
Stockholm School of Economics (15)
Stockholm University (4)
Karolinska Institutet (4)
Linköping University (2)
Linnaeus University (2)
University of Gothenburg (1)
Royal Institute of Technology (1)
Swedish University of Agricultural Sciences (1)
Language
English (18)
Research subject (UKÄ/SCB)
Social Sciences (15)
Natural sciences (2)
Medical and Health Sciences (2)
