SwePub
Search the SwePub database


Result list for search "WFRF:(Menkveld Albert J.)"

Search: WFRF:(Menkveld Albert J.)

  • Result 1-9 of 9
1.
  • Menkveld, Albert J., et al. (author)
  • Nonstandard Errors
  • 2024
  • In: Journal of Finance. - Wiley-Blackwell. - ISSN 0022-1082, E-ISSN 1540-6261; 79:3, pp. 2339-2390
  • Journal article (peer-reviewed)
    • Abstract: In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: nonstandard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. We further find that this type of uncertainty is underestimated by participants.
2.
  • Huntington-Klein, Nick, et al. (author)
  • Subjective evidence evaluation survey for many-analysts studies
  • 2024
  • In: Royal Society Open Science. - The Royal Society. - ISSN 2054-5703; 11:7
  • Journal article (peer-reviewed)
    • Abstract: Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g., effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous many-analysts study.
3.
  • Uhlmann, Eric L., et al. (author)
  • Subjective Evidence Evaluation Survey For Multi-Analyst Studies
  • 2024
  • Other publication (other academic/artistic)
    • Abstract: Multi-analyst studies explore how well an empirical claim withstands plausible alternative analyses of the same data set by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g., effect size) provided by each analysis team. Although informative about the range of plausible effects in a data set, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item Subjective Evidence Evaluation Survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous multi-analyst study.
4.
  • Hagströmer, Björn, 1981-, et al. (author)
  • Information Revelation in Decentralized Markets
  • 2019
  • In: Journal of Finance. - Wiley. - ISSN 0022-1082, E-ISSN 1540-6261; 74:5, pp. 2751-2787
  • Journal article (peer-reviewed)
    • Abstract: How does information get revealed in decentralized markets? We test several hypotheses inspired by recent dealer-network theory. To do so we construct an empirical map of information revelation where two dealers are connected based on the synchronicity of their quote changes. The tests, based on EUR/CHF quote data including the 2015 crash, largely support theory: Strongly connected (i.e., central) dealers are more informed. Connections are weaker when there is less to be learned. The crash serves to identify how a network forms when dealers are transitioned from no-learning to learning, that is, from a fixed to a floating rate.
5.
  • Menkveld, Albert J., et al. (author)
  • Non-Standard Errors
  • 2021
  • Other publication (other academic/artistic)
    • Abstract: In statistics, samples are drawn from a population in a data generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find that non-standard errors are sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii) declines significantly after peer-feedback, and (iii) is underestimated by participants.
6.
  • Menkveld, Albert J., et al. (author)
  • Non-Standard Errors
  • 2024
  • In: Journal of Finance. - ISSN 0022-1082; 79:3, pp. 2339-2390
  • Journal article (peer-reviewed)
    • Abstract: In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: nonstandard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. We further find that this type of uncertainty is underestimated by participants.
7.
  • Menkveld, Albert J., et al. (author)
  • Nonstandard Errors
  • 2024
  • In: Journal of Finance. - ISSN 0022-1082, E-ISSN 1540-6261.
  • Journal article (peer-reviewed)
8.
  • Menkveld, Albert J., et al. (author)
  • Nonstandard Errors
  • 2024
  • In: Journal of Finance. - Wiley-Blackwell. - ISSN 0022-1082, E-ISSN 1540-6261; 79:3, pp. 2339-2390
  • Journal article (peer-reviewed)
    • Abstract: In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: nonstandard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. We further find that this type of uncertainty is underestimated by participants.
9.
  • Perignon, Christophe, et al. (author)
  • Computational Reproducibility in Finance: Evidence from 1,000 Tests
  • 2024
  • In: Review of Financial Studies. - Oxford University Press. - ISSN 0893-9454, E-ISSN 1465-7368.
  • Journal article (peer-reviewed)
    • Abstract: We analyze the computational reproducibility of more than 1,000 empirical answers to 6 research questions in finance provided by 168 research teams. Running the researchers' code on the same raw data regenerates exactly the same results only 52% of the time. Reproducibility is higher for researchers with better coding skills and those exerting more effort. It is lower for more technical research questions, more complex code, and results lying in the tails of the distribution. Researchers exhibit overconfidence when assessing the reproducibility of their own research. We provide guidelines for finance researchers and discuss implementable reproducibility policies for academic journals.
Type of publication
journal article (7)
other publication (2)
Type of content
peer-reviewed (7)
other academic/artistic (2)
Author/Editor
Menkveld, Albert J. (9)
Holzmeister, Felix (7)
Johannesson, Magnus (7)
Kirchler, Michael (7)
Weitzel, Utz (5)
Razen, Michael (5)
Dreber, Anna (4)
Huber, Jürgen (4)
Szaszi, Barnabas (3)
Dreber Almenberg, An ... (3)
Huber, Juergen (3)
Neusüss, Sebastian (3)
Zwinkels, Remco (3)
Aczel, Balazs (2)
Nilsonne, Gustav (2)
Albers, Casper J. (2)
Botvinik-Nezer, Rote ... (2)
Busch, Niko A. (2)
Cataldo, Andrea M. (2)
van Dongen, Noah N. ... (2)
Hoekstra, Rink (2)
Matzke, Dora (2)
van Ravenzwaaij, Don (2)
Sarafoglou, Alexandr ... (2)
Schweinsberg, Martin (2)
Simons, Daniel J. (2)
Spellman, Barbara A. (2)
Wicherts, Jelte (2)
Wagenmakers, Eric-Ja ... (2)
Vanpaemel, Wolf (2)
Wilhelmsson, Anders (2)
Longarela, Iñaki R. (2)
Hurlin, Christophe (2)
Pérignon, Christophe (2)
Akmansoy, Olivier (2)
Tuerlinckx, Francis (2)
Huntington-Klein, Ni ... (2)
Ioannidis, John (2)
Loken, Eric (2)
Schulz-Kuempel, Hann ... (2)
Shanks, David R. (2)
Stoevenbelt, Andrea ... (2)
Trübutschek, Darinka (2)
Uhlmann, Eric L. (2)
Hoogeveen, Suzanne (2)
van den Bergh, Don (2)
Althoff, Tim (2)
Devezer, Berna (2)
Fried, Eiko I. (2)
Wu, Zhen-Xing (2)
University
Stockholm School of Economics (6)
Stockholm University (4)
Lund University (2)
University of Gothenburg (1)
Language
English (9)
Research subject (UKÄ/SCB)
Social Sciences (7)
Natural Sciences (4)
