1.
- Huntington-Klein, Nick, et al. (authors)
- Subjective evidence evaluation survey for many-analysts studies
- 2024
- In: Royal Society Open Science. - The Royal Society. - ISSN 2054-5703. ; 11:7
- Journal article (peer-reviewed), abstract:
- Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g. effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous many-analysts study.
2.
- Menkveld, Albert J., et al. (authors)
- Nonstandard Errors
- 2024
- In: Journal of Finance. - Wiley-Blackwell. - ISSN 0022-1082 (print), 1540-6261 (online). ; 79:3, pp. 2339-2390
- Journal article (peer-reviewed), abstract:
- In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: nonstandard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. We further find that this type of uncertainty is underestimated by participants.
3.
- Uhlmann, Eric L., et al. (authors)
- Subjective Evidence Evaluation Survey for Multi-Analyst Studies
- 2024
- Other publication (other academic/artistic), abstract:
- Multi-analyst studies explore how well an empirical claim withstands plausible alternative analyses of the same data set by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g., effect size) provided by each analysis team. Although informative about the range of plausible effects in a data set, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item Subjective Evidence Evaluation Survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous multi-analyst study.