CQUniversity

File(s) not publicly available

Delusions of expertise: the high standard of proof needed to demonstrate skills at horserace handicapping

journal contribution
posted on 2017-12-06, 00:00 authored by Matthew Browne, Matthew Rockloff, A. Blaszczynski, C. Allcock, A. Windross
Gamblers who participate in skill-oriented games (such as poker and sports betting) are motivated to win over the long term, and some monitor their betting outcomes to evaluate their performance and proficiency. In this study of Australian off-track horserace betting, we investigated what level of sustained returns would be required to establish evidence of skill or expertise. We modelled a random strategy to simulate 'naïve' play, in which equal bets were placed on randomly selected horses across a representative sample of 211 weekend races. A Monte Carlo simulation yielded the distribution of return-on-investment for varying numbers of bets (N), which showed surprising volatility even after a large number of repeated bets. After adjusting for the house advantage, a gambler would have to place over 10,000 bets in individual races, with net returns exceeding 9%, to be reasonably considered an expert punter (α = .05). A record of fewer bets would require even greater returns to demonstrate expertise. Validated expertise is therefore likely to be rare among race bettors. We argue that this counter-intuitively high threshold for demonstrating expertise by tracking historical performance is likely to exacerbate known cognitive biases in the self-evaluation of expertise.
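
The Monte Carlo procedure described in the abstract can be sketched as follows. This is a minimal illustration only: the study's actual data (211 weekend races with real Australian off-track odds) are not reproduced here, so the race fields, win probabilities, synthetic odds, and the 15% takeout below are hypothetical stand-ins, and the printed thresholds will differ from the paper's figures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's sample of 211 weekend races.
# Each race gets a synthetic field of runners with win probabilities and
# decimal odds discounted by an assumed house takeout; the study used
# real race odds, which are not reproduced here.
N_RACES = 211
MAX_FIELD = 16
TAKEOUT = 0.15  # assumed house advantage (illustrative, not the paper's figure)

field = rng.integers(6, MAX_FIELD + 1, size=N_RACES)   # runners per race
p_win = np.zeros((N_RACES, MAX_FIELD))
odds = np.zeros((N_RACES, MAX_FIELD))
for r in range(N_RACES):
    strength = rng.gamma(2.0, 1.0, size=field[r])      # latent horse quality
    p = strength / strength.sum()                      # win probabilities
    p_win[r, :field[r]] = p
    odds[r, :field[r]] = (1.0 - TAKEOUT) / p           # odds net of takeout


def naive_roi(n_bets: int, n_sims: int = 2_000) -> np.ndarray:
    """ROI distribution for a 'naive' punter placing n_bets equal unit
    bets, each on a uniformly random horse in a uniformly random race."""
    rois = np.empty(n_sims)
    for s in range(n_sims):
        race = rng.integers(N_RACES, size=n_bets)
        horse = (rng.random(n_bets) * field[race]).astype(int)
        won = rng.random(n_bets) < p_win[race, horse]
        rois[s] = odds[race, horse][won].sum() / n_bets - 1.0
    return rois


# ROI a punter must sustain before chance alone becomes implausible at
# alpha = .05 (one-sided): the 95th percentile of the naive distribution.
for n in (100, 1_000, 10_000):
    threshold = np.quantile(naive_roi(n), 0.95)
    print(f"N = {n:>6}: 95th-percentile naive ROI = {threshold:+.3f}")
```

Note that the naive punter's expected ROI under this setup is roughly -TAKEOUT, so the printed quantiles sit above that negative baseline; this mirrors the abstract's point that the skill threshold must be judged after adjusting for the house advantage.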

History

Issue

2013

Start Page

1

End Page

17

Number of Pages

17

ISSN

1573-3602

Location

New York

Publisher

Springer

Language

en-aus

Peer Reviewed

  • Yes

Open Access

  • No

External Author Affiliations

Balmoral Consultancy Services, Sydney; Institute for Health and Social Science Research (IHSSR); School of Psychology; TBA Research Institute

Era Eligible

  • Yes

Journal

Journal of Gambling Studies
