SwePub
Search the SwePub database


Search results for "WFRF:(Wagenmakers Eric Jan M.)"

  • Result 1-7 of 7
1.
  • Graham, Jesse R., et al. (author)
  • The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline
  • 2016
  • In: Journal of Experimental Social Psychology (Elsevier). ISSN 1096-0465, 0022-1031. Vol. 66, pp. 55-67
  • Journal article (peer-reviewed), abstract:
    • This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects which the last author and his collaborators had “in the pipeline” as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to find support. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.
2.
  • Washburn, Anthony N., et al. (author)
  • Data from a pre-publication independent replication initiative examining ten moral judgement effects
  • 2016
  • In: Scientific Data (Nature Research, part of Springer Nature; fully open access). ISSN 2052-4463. Vol. 3
  • Journal article (peer-reviewed), abstract:
    • We present the data from a crowdsourced project seeking to replicate findings in independent laboratories before (rather than after) they are published. In this Pre-Publication Independent Replication (PPIR) initiative, 25 research groups attempted to replicate 10 moral judgment effects from a single laboratory's research pipeline of unpublished findings. The 10 effects were investigated using online/lab surveys containing psychological manipulations (vignettes) followed by questionnaires.
3.
  • Huntington-Klein, Nick, et al. (author)
  • Subjective evidence evaluation survey for many-analysts studies
  • 2024
  • In: Royal Society Open Science (The Royal Society). ISSN 2054-5703. Vol. 11, no. 7
  • Journal article (peer-reviewed), abstract:
    • Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g. effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous many-analysts study.
4.
  • Silberzahn, Raphael, et al. (author)
  • Many analysts, one dataset : Making transparent how variations in analytical choices affect results
  • 2018
  • In: Advances in Methods and Practices in Psychological Science (Sage Publications). ISSN 2515-2459, 2515-2467. Vol. 1, no. 3, pp. 337-356
  • Journal article (peer-reviewed), abstract:
    • Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark-skin-toned players than to light-skin-toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged from 0.89 to 2.93 in odds ratio units, with a median of 1.31. Twenty teams (69%) found a statistically significant positive effect and nine teams (31%) observed a non-significant relationship. Overall, the 29 different analyses used 21 unique combinations of covariates. We found that neither analysts' prior beliefs about the effect, nor their level of expertise, nor the peer-rated quality of their analyses readily explained variation in analysis outcomes. This suggests that significant variation in the results of analyses of complex data may be difficult to avoid, even by experts with honest intentions. Crowdsourcing data analysis, a strategy by which numerous research teams are recruited to simultaneously investigate the same research question, makes transparent how defensible, yet subjective, analytic choices influence research results.
5.
  • Uhlmann, Eric L., et al. (author)
  • Subjective Evidence Evaluation Survey For Multi-Analyst Studies
  • 2024
  • Other publication (other academic/artistic), abstract:
    • Multi-analyst studies explore how well an empirical claim withstands plausible alternative analyses of the same data set by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g., effect size) provided by each analysis team. Although informative about the range of plausible effects in a data set, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item Subjective Evidence Evaluation Survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous multi-analyst study.
6.
  • Aczel, Balazs, et al. (author)
  • Consensus-based guidance for conducting and reporting multi-analyst studies
  • 2021
  • In: eLife (eLife Sciences Publications). ISSN 2050-084X. Vol. 10
  • Journal article (peer-reviewed), abstract:
    • Any large dataset can be analyzed in a number of ways, and it is possible that the use of different analysis strategies will lead to different results and conclusions. One way to assess whether the results obtained depend on the analysis strategy chosen is to employ multiple analysts and leave each of them free to follow their own approach. Here, we present consensus-based guidance for conducting and reporting such multi-analyst studies, and we discuss how broader adoption of the multi-analyst approach has the potential to strengthen the robustness of results and conclusions obtained from analyses of datasets in basic and applied research.
7.
  • Benjamin, Daniel J., et al. (author)
  • Redefine statistical significance
  • 2018
  • In: Nature Human Behaviour (Nature Research, part of Springer Nature). ISSN 2397-3374. Vol. 2, no. 1, pp. 6-10
  • Journal article (other academic/artistic)
Type of publication
journal article (6)
other publication (1)
Type of content
peer-reviewed (5)
other academic/artistic (2)
Author/Editor
Schweinsberg, Martin (5)
Johannesson, Magnus (4)
Kirchler, Michael (4)
Wagenmakers, Eric-Ja ... (4)
Aczel, Balazs (3)
Szaszi, Barnabas (3)
Nilsonne, Gustav (3)
Albers, Casper J. (3)
Botvinik-Nezer, Rote ... (3)
Busch, Niko A. (3)
Cataldo, Andrea M. (3)
van Dongen, Noah N. ... (3)
Hoekstra, Rink (3)
Holzmeister, Felix (3)
Matzke, Dora (3)
van Ravenzwaaij, Don (3)
Sarafoglou, Alexandr ... (3)
Simons, Daniel J. (3)
Spellman, Barbara A. (3)
Uhlmann, Eric Luis (3)
Wicherts, Jelte (3)
Cheung, Felix (3)
Vianello, Michelange ... (3)
Wagenmakers, Eric Ja ... (3)
Gamez-Djokic, Monica (3)
Dreber Almenberg, An ... (2)
Hoffmann, Sabine (2)
Mangin, Jean-Francoi ... (2)
Munafò, Marcus R. (2)
Nosek, Brian A. (2)
Silberzahn, Raphael (2)
Vanpaemel, Wolf (2)
Van Bavel, Jay J. (2)
Wetter, Erik (2)
Dreber, Anna (2)
Schaerer, Michael (2)
Huber, Jürgen (2)
Qureshi, Israr (2)
Sokolova, Tatiana (2)
Warren, Tierney (2)
Plessis, Christilene ... (2)
Cushman, Fiery A. (2)
Storage, Daniel (2)
Tuerlinckx, Francis (2)
Inbar, Yoel (2)
Graham, Jesse R. (2)
Motyl, Matt (2)
Chandler, Jesse J. (2)
Wong, Lynn (2)
Hofstein Grady, Rebe ... (2)
University
Stockholm School of Economics (7)
Stockholm University (2)
Linnaeus University (1)
Karolinska Institutet (1)
Language
English (7)
Research subject (UKÄ/SCB)
Social Sciences (5)
Natural sciences (2)
