Assessing the risk of bias in non-randomised studies evaluating the effects of interventions – in-person workshop
Hosted by the Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University
Date and time: Friday 7 February 2025 from 9am to 4.30pm.
Location: School of Public Health and Preventive Medicine Conference Rooms, Monash University, 553 St Kilda Road, Melbourne VIC 3004
Registration: Register online here
Standard in-person: $490 | Student in-person: $350
Linked event: Evolving Methods for Evidence Synthesis of Health Research Symposium, Thursday 6 February
Any questions? For more information on this workshop, please email Matthew Page: matthew.page@monash.edu
About the workshop
Non-randomised studies of interventions (NRSI) can provide evidence, additional to that available from randomised trials, about long-term outcomes, rare events, adverse events and populations typical of real-world practice. However, limitations in the design, conduct, analysis and reporting of an NRSI may lead to underestimation or overestimation of the true effects of an intervention, which is known as bias. A critical aspect of interpreting results from an NRSI is therefore assessing whether features of the study may have placed the results at risk of bias, making them less trustworthy. The ROBINS-I (Risk Of Bias In Non-randomised Studies – of Interventions) tool provides a structured process for researchers to make risk-of-bias judgements. The tool is the gold-standard approach for assessing the risk of bias in NRSI, is widely used (the 2016 version has been cited more than 13,000 times) and has been endorsed by Cochrane for use in its systematic reviews. A new version of ROBINS-I that incorporates several improvements and innovations will be launched in early 2025, together with an online implementation that will facilitate its use.
Who is the workshop for?
This in-person workshop on how to assess the risk of bias in NRSI is designed for those undertaking systematic reviews, those synthesising evidence for guidelines, and anyone with a general interest in learning how to appraise studies.
The presenters will describe key features of version 2 of the ROBINS-I tool for cohort studies in which intervention groups are allocated during the course of usual treatment decisions. We will highlight the updates and improvements to the original (2016) version of the tool and show how it has been implemented in online software for use by review authors. The workshop involves a mix of presentations, interactive discussions and hands-on exercises. Electronic copies of the slides will be provided on the day of the workshop. An outline of what will be covered is below.
Outline:
Introduction to risk-of-bias assessment
Introduction to the ROBINS-I tool
Preliminary considerations, including specification of the causal effect of interest
Risk of bias due to confounding
Risk of bias in classification of interventions
Risk of bias in selection of participants into the study (or into the analysis)
Risk of bias due to deviations from intended interventions
Risk of bias due to missing data, measurement of the outcome and selection of the reported result
Software implementation of ROBINS-I version 2
Course facilitators
Jonathan Sterne, University of Bristol, UK: Jonathan is a Professor of Medical Statistics and Epidemiology at the University of Bristol, Director of the NIHR Bristol Biomedical Research Centre and co-Director of Health Data Research UK South-West. During the pandemic he led a large team that produced novel, high-impact research on COVID-19 vaccination and long COVID, based on analyses of up to 55 million people. He has a longstanding interest in methodology for systematic reviews and meta-analysis. He co-led development of the RoB 2 tool for assessing risk of bias in randomised trials and the ROBINS-I and ROBINS-E tools for assessing risk of bias in non-randomised studies of interventions and exposures, respectively. Jonathan is a former co-convenor of the Cochrane Bias Methods Group and has published influential papers on reporting bias in meta-analysis, meta-epidemiology, causal inference and statistical methodology. Read more about Jonathan’s research interests here.
Joanne McKenzie, Monash University: Jo is a Professor and Head of the Methods in Evidence Synthesis Unit. She leads a programme of research on methods for evidence synthesis, with key areas of interest including methods to present and synthesise results when meta-analysis is not possible, statistical methods for analysing and meta-analysing results from interrupted time series studies, and the development of reporting guidelines for different evidence synthesis products. She co-led the development of the PRISMA 2020 statement and contributed to the development of the ROB-ME tool for assessing risk of bias due to missing evidence in meta-analyses. She is an active contributor to Cochrane, including as a Co-convenor of the Statistical Methods Group and an author of several chapters of the Cochrane Handbook for Systematic Reviews of Interventions. Read more about Jo’s research interests here.
Matthew Page, Monash University: Matthew is a Senior Research Fellow and Deputy Head of the Methods in Evidence Synthesis Unit. His research aims to improve the quality of systematic reviews of health and medical research. He has led many studies investigating the transparency, reproducibility and risk of bias of systematic reviews and the studies they include, and has developed several methods to address these issues. For example, he co-led the development of the PRISMA 2020 statement, a highly cited reporting guideline for systematic reviews, was a member of the core group that developed the RoB 2 tool for assessing risk of bias in randomised trials, and led the development of the ROB-ME tool for assessing risk of bias due to missing evidence in meta-analyses. He is a member of Cochrane’s Methods Executive, the group responsible for directing the methods used within Cochrane Reviews. Read more about Matthew’s research interests here.
About the Methods in Evidence Synthesis Unit
The Methods in Evidence Synthesis Unit (MESU) sits within the School of Public Health and Preventive Medicine at Monash University. The Unit’s mission is to develop, evaluate and make accessible optimal statistical and research methodology for evidence synthesis. The MESU team has led and contributed to major methodological developments in evidence synthesis, including reporting guidelines (PRISMA 2020, PRIOR, SWiM and extensions to PRISMA 2020), risk-of-bias tools (ROB-ME, RoB 2), methods for synthesis when meta-analysis is not possible, methods for meta-analysing results from non-randomised studies, methods for overviews of systematic reviews, and research examining the reproducibility of systematic reviews and bias in the review process. MESU staff regularly provide training to researchers, nationally and internationally, and collaborate on systematic reviews. MESU is funded through nationally competitive NHMRC and ARC grants.