Title: The Novelty Paradox and Bias for Normal Science: Evidence from Randomized Medical Grant Proposal Evaluations

Short summary: This experiment, set in the context of medical research grants, indicates that novelty in research proposals is discounted, possibly because evaluators internalise the lower average success rates of novel projects. This censoring of novel projects means that such experiments never get a chance to be run, curtailing the benefits of generating a greater diversity of experiments. This is of concern to policy makers and society because research funds are allocated towards more incremental research rather than high-variance, potentially breakthrough efforts.

Abstract: Central to any innovation process is the evaluation of proposed projects and allocation of resources. We investigate whether novel research projects, those deviating from existing research paradigms, are treated with a negative bias in expert evaluations. We analyze the results of a peer review process for medical research grant proposals at a leading medical research university, in which we recruited 142 expert university faculty members to evaluate 150 submissions, resulting in 2,130 randomly assigned proposal-evaluator pair observations. Our results confirm a systematic penalty for novel proposals; a standard deviation increase in novelty drops the expected rank of a proposal by 4.5 percentile points. This discounting is robust to various controls for unobserved proposal quality and alternative explanations. Additional tests suggest information effects rather than strategic effects account for the novelty penalty. Only a minority of the novelty penalty could be related to perceptions of lesser feasibility of novel proposals.

Links: http://dash.harvard.edu/handle/1/10001229

Authors: Kevin J. Boudreau, Eva C. Guinan, Karim R. Lakhani, Christoph Riedl

Affiliations: Harvard Business School

Year: 2012

Publication stage: Working Paper

Research theme: Innovation

Country: United States

Sample attributes

Hypotheses / research question: Is there a bias against novel ideas and research hypotheses? How are nascent scientific hypotheses evaluated in expert peer review? Are novel research ideas outside currently accepted scientific paradigms susceptible to being discounted, rejected, or ignored because of the evaluation institutions that society and organisations have developed?
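As an illustration of the randomised assignment these questions refer to, here is a minimal sketch, assuming a simple uniform random draw (the study's actual assignment procedure is not published and may have been balanced): each of the 142 evaluators receives 15 of the 150 proposals, yielding the 2,130 proposal-evaluator pairs reported in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

N_PROPOSALS, N_EVALUATORS, PER_EVALUATOR = 150, 142, 15

# Hypothetical sketch only: draw 15 distinct proposals per evaluator,
# uniformly at random. The paper does not publish its assignment code.
assignments = {
    evaluator: rng.choice(N_PROPOSALS, size=PER_EVALUATOR, replace=False)
    for evaluator in range(N_EVALUATORS)
}

pairs = sum(len(p) for p in assignments.values())
print(pairs)  # 142 evaluators x 15 proposals = 2,130 pairs
```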
Sample

Trial population and sample selection: Researchers worked closely with a research grant allocating body within a leading medical research university to study results from the first stage of a $1 million grant process related to a major disease in terms of its economic and health burden. The university gives out internal grants that allow investigators to bootstrap their research efforts and generate preliminary data for NIH grant applications. The call for proposals was promoted widely to boost participation and to encourage "out of the box" ideas. 142 evaluators were recruited from three distinct groups to ensure a large and varied pool of evaluators while maintaining representativeness. Each of the 142 evaluators was assigned 15 randomly selected proposals from the 150 submissions.

Size of treatment groups: N/A

Unit of analysis: Proposal-evaluator pair

Trial attributes

Treatment description: Following random assignment of proposals to evaluators, each proposal was evaluated. Proposals vary in their measured novelty relative to the existing published literature. Evaluators scored proposals on a 10-point scale summarising their assessment of the potential impact of the ideas, hypotheses, and research pathways contained in each of the 15 proposals they evaluated. Submitter and evaluator identities were mutually concealed (double-blind), and the overall evaluation can be regarded as triple-blind.

Baseline data collection and method: Data sources include evaluator score sheets, the database of prior academic publications and citations of submitting researchers, and detailed backgrounds and CVs of all evaluators.

Data collection method and data collected: Evaluation

Outcome variables:
- Score: the main score (out of 10) given by evaluators as an overall assessment of the potential impact of a research proposal.
- Points allocation: scores given to proposals by evaluators across various dimensions.
- Feasibility.

Results: The results confirm a systematic penalty for novel proposals: a standard deviation increase in novelty drops the expected rank of a proposal by 4.5 percentile points. The discounting is robust to various controls for unobserved proposal quality and to alternative explanations; additional tests suggest information effects rather than strategic effects account for the novelty penalty, and only a minority of the penalty could be related to perceptions of lesser feasibility.

Intervention costs: Not available.

Reference: Boudreau, K. J., Guinan, E. C., Lakhani, K. R., & Riedl, C. (2012). 'The Novelty Paradox and Bias for Normal Science: Evidence from Randomized Medical Grant Proposal Evaluations'. Harvard Business School Technology & Operations Mgt. Unit Working Paper, SSRN.
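To make the headline estimate in the Results concrete, here is a minimal, hypothetical simulation (not the study's data or code): percentile ranks are generated with a built-in penalty of 4.5 points per standard deviation of novelty, and a simple OLS fit recovers the coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data only: 2,130 proposal-evaluator pairs in which a
# 1 SD increase in novelty lowers the expected percentile rank by
# 4.5 points, the effect size reported in the study's abstract.
n_pairs = 2130
novelty = rng.standard_normal(n_pairs)               # standardized novelty
rank = 50.0 - 4.5 * novelty + rng.normal(0, 20, n_pairs)

# OLS slope via a degree-1 polynomial fit; should be close to -4.5.
slope, intercept = np.polyfit(novelty, rank, 1)
print(f"Estimated novelty penalty: {slope:.2f} percentile points per SD")
```

The study's actual specification includes controls for unobserved proposal quality and evaluator characteristics; this sketch only illustrates the magnitude of the reported coefficient.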