SwePub
Search the SwePub database


Hit list for search "WFRF:(Corker Katherine S.)"


  • Results 1-4 of 4
1.
  • Ebersole, Charles R., et al. (author)
  • Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability
  • 2020
  • In: Advances in Methods and Practices in Psychological Science. - : Sage. - 2515-2467 .- 2515-2459. ; 3:3, pp. 309-331
  • Journal article (peer-reviewed). Abstract:
    • Replication studies in psychological science sometimes fail to reproduce prior findings. If these studies use methods that are unfaithful to the original study or ineffective in eliciting the phenomenon of interest, then a failure to replicate may be a failure of the protocol rather than a challenge to the original finding. Formal pre-data-collection peer review by experts may address shortcomings and increase replicability rates. We selected 10 replication studies from the Reproducibility Project: Psychology (RP:P; Open Science Collaboration, 2015) for which the original authors had expressed concerns about the replication designs before data collection; only one of these studies had yielded a statistically significant effect (p < .05). Commenters suggested that lack of adherence to expert review and low-powered tests were the reasons that most of these RP:P studies failed to replicate the original effects. We revised the replication protocols and received formal peer review prior to conducting new replication studies. We administered the RP:P and revised protocols in multiple laboratories (median number of laboratories per original study = 6.5, range = 3-9; median total sample = 1,279.5, range = 276-3,512) for high-powered tests of each original finding with both protocols. Overall, following the preregistered analysis plan, we found that the revised protocols produced effect sizes similar to those of the RP:P protocols (Delta r = .002 or .014, depending on analytic approach). The median effect size for the revised protocols (r = .05) was similar to that of the RP:P protocols (r = .04) and the original RP:P replications (r = .11), and smaller than that of the original studies (r = .37). Analysis of the cumulative evidence across the original studies and the corresponding three replication attempts provided very precise estimates of the 10 tested effects and indicated that their effect sizes (median r = .07, range = .00-.15) were 78% smaller, on average, than the original effect sizes (median r = .37, range = .19-.50).
2.
  • Moshontz, Hannah, et al. (author)
  • The Psychological Science Accelerator: Advancing Psychology Through a Distributed Collaborative Network
  • 2018
  • In: Advances in Methods and Practices in Psychological Science. - : SAGE Publications. - 2515-2459 .- 2515-2467. ; 1:4, pp. 501-515
  • Journal article (peer-reviewed). Abstract:
    • Concerns about the veracity of psychological research have been growing. Many findings in psychological science are based on studies with insufficient statistical power and nonrepresentative samples, or may otherwise be limited to specific, ungeneralizable settings or populations. Crowdsourced research, a type of large-scale collaboration in which one or more research projects are conducted across multiple lab sites, offers a pragmatic solution to these and other current methodological challenges. The Psychological Science Accelerator (PSA) is a distributed network of laboratories designed to enable and support crowdsourced research projects. These projects can focus on novel research questions or replicate prior research in large, diverse samples. The PSA’s mission is to accelerate the accumulation of reliable and generalizable evidence in psychological science. Here, we describe the background, structure, principles, procedures, benefits, and challenges of the PSA. In contrast to other crowdsourced research networks, the PSA is ongoing (as opposed to time limited), efficient (in that structures and principles are reused for different projects), decentralized, diverse (in both subjects and researchers), and inclusive (of proposals, contributions, and other relevant input from anyone inside or outside the network). The PSA and other approaches to crowdsourced psychological science will advance understanding of mental processes and behaviors by enabling rigorous research and systematic examination of its generalizability.
3.
  • van den Akker, Olmo R., et al. (author)
  • Increasing the transparency of systematic reviews : presenting a generalized registration form
  • 2023
  • In: Systematic Reviews. - : BioMed Central (BMC). - 2046-4053. ; 12
  • Journal article (peer-reviewed). Abstract:
    • This paper presents a generalized registration form for systematic reviews that can be used when currently available forms are not adequate. The form is designed to be applicable across disciplines (i.e., psychology, economics, law, physics, or any other field) and across review types (i.e., scoping review, review of qualitative studies, meta-analysis, or any other type of review). That means that the reviewed records may include research reports as well as archive documents, case law, books, poems, etc. Items were selected and formulated to optimize broad applicability instead of specificity, forgoing some benefits afforded by a tighter focus. This PRISMA 2020 compliant form is a fallback for more specialized forms and can be used if no specialized form or registration platform is available. When accessing this form on the Open Science Framework website, users will therefore first be guided to specialized forms when they exist. In addition to this use case, the form can also serve as a starting point for creating registration forms that cater to specific fields or review types.
4.
  • Schoenbrodt, Felix D., et al. (author)
  • Replicability, Robustness, and Reproducibility in Psychological Science
  • 2022
  • In: Annual Review of Psychology. - : Annual Reviews. - 1545-2085 .- 0066-4308. ; 73, pp. 719-748
  • Journal article (peer-reviewed). Abstract:
    • Replication, an important, uncommon, and misunderstood practice, is gaining appreciation in psychology. Achieving replicability is important for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledge. Assessing replicability can be productive for generating and testing hypotheses by actively confronting current understandings to identify weaknesses and spur innovation. For psychology, the 2010s might be characterized as a decade of active confrontation. Systematic and multi-site replication projects assessed current understandings and observed surprising failures to replicate many published findings. Replication efforts highlighted sociocultural challenges such as disincentives to conduct replications and a tendency to frame replication as a personal attack rather than a healthy scientific practice, and they raised awareness that replication contributes to self-correction. Nevertheless, innovation in doing and understanding replication and its cousins, reproducibility and robustness, has positioned psychology to improve research practices and accelerate progress. © 2022 Annual Reviews Inc. All rights reserved.


 