The Monash University Methods in Evidence Synthesis Unit invites you to Evolving Methods for Evidence Synthesis of Health Research: Symposium 2025.
Date and time: Thursday 6 February 2025, 9.30am to 5.00pm. Social function from 5.30pm onwards.
Location: Conference Rooms, Monash University School of Public Health and Preventive Medicine, 553 St Kilda Road, Melbourne VIC 3004
Registration: Register here
Standard in-person: $150
Student in-person: $125
Linked event: Assessing the risk of bias in non-randomized studies evaluating the effects of interventions, in-person workshop
Friday 7 February 2025, 9am to 4.30pm
Any questions? Please email Joanne McKenzie at joanne.mckenzie@monash.edu
Start the new year with an in-person symposium and learn about the latest developments in, and evaluations of, methods for evidence synthesis of health research across a range of review types and stages of the review process. The symposium is designed for anyone who undertakes evidence synthesis or has an interest in the development and evaluation of evidence synthesis methods.
National and international guest speakers will join us for a series of plenary presentations exploring:
the use of artificial intelligence (AI) in reviews
safeguarding the integrity of research when using individual participant data
rethinking which meta-analysis model is best for most systematic reviews
how we can improve the reporting of systematic reviews
how we can use assessments of the certainty of evidence to drive improvements in systematic reviews (see symposium presentations below for further details).
We will also be shining a light on the evidence synthesis methodological research underway in Australia, and offering networking opportunities with presenters and participants. Two speed networking sessions will offer participants the chance to discuss research interests across key themes, including:
search methods
living methods
statistics
reporting
fraud detection and predatory publishing.
All presentations will be pitched for an audience with general knowledge about evidence synthesis methods.
The symposium finishes at 5pm, but don’t miss our social event from 5.30pm onwards at the Commons Collective, just a few minutes’ walk from our conference venue.
Find out all the details about our speakers, plenaries, short talks and the Methods in Evidence Synthesis Unit below.
Plenary speakers
Sue Brennan
Sue Brennan is the founding Director of the Melbourne GRADE Centre, and Senior Researcher in evidence synthesis methods at Cochrane Australia, School of Public Health and Preventive Medicine, Monash University.
Sue is an elected member of the GRADE Guidance Group (2019-current), the executive that oversees the GRADE Working Group’s global network of 600 experts. She has authored foundational guidance, including four chapters of the Cochrane Handbook for Systematic Reviews of Interventions, the GRADE items for the PRISMA 2020 statement, and chapters of the GRADE Handbook, and co-led development of a tool for planning and reporting clinical questions (InSynq).
Justin Clark
Justin Clark is the Research Enhancement Manager at the Institute for Evidence-Based Healthcare (IEBH), Bond University, Australia. He is also the Cochrane Information Specialist for the Acute Respiratory Infections Group, and has served as a member of the Cochrane Information Specialists Executive and as Co-Lead of the search group of the Living Evidence Network.
Justin is one of the inventors of the Two-Week Systematic Review (2weekSR) method and a founding member of the International Collaboration for the Automation of Systematic Reviews (ICASR), and he leads the development of the Systematic Review Accelerator (SRA), a suite of automation tools that accelerate the production of evidence syntheses. His research focuses on improving evidence synthesis methods to reduce the resources needed to conduct reviews of the evidence.
Julian Higgins
Julian Higgins is a Professor of Evidence Synthesis at the University of Bristol. He has a long-standing interest in the methodology of systematic reviews and meta-analysis. He is the author of over 350 publications, collectively cited more than 350,000 times.
Julian’s contributions include: a Bayesian approach to network meta-analysis; the I-squared statistic to quantify inconsistency across studies in a meta-analysis; simple prediction intervals for random-effects meta-analysis; a general framework for individual participant data meta-analysis; and risk-of-bias assessment tools for clinical trials and other study designs.
Julian is a past President of the Society for Research Synthesis Methodology and has co-edited the Cochrane Handbook for Systematic Reviews of Interventions since 2003. He is also co-author of the Wiley textbook Introduction to Meta-analysis and co-editor of the 3rd edition of the Wiley textbook Systematic Reviews in Health Research: Meta-analysis in Context.
Kylie Hunter
Kylie Hunter is a Research Fellow and co-lead of the NextGen Evidence Synthesis Team at the NHMRC Clinical Trials Centre, University of Sydney. She also serves as Associate Convenor of the Cochrane Prospective Meta-Analysis Methods Group.
Kylie’s research centres on advancing evidence synthesis methods to address high-priority health research questions, with a particular focus on individual participant data, prospective meta-analysis, and research integrity.
Zachary Munn
Zachary Munn is an advocate for evidence-based healthcare and for ensuring that policy and practice are based on the best available evidence. He is the founding Director of Health Evidence Synthesis, Recommendations and Impact (HESRI) in the School of Public Health at the University of Adelaide.
He is also Head of the Evidence Synthesis Taxonomy Initiative (ESTI), Founding Director of the Adelaide GRADE Centre, past Chair of the Guidelines International Network (GIN) and a National Health and Medical Research Council (NHMRC) Investigator. He is a systematic review, evidence implementation and guideline development methodologist.
Evan Mayo-Wilson
Evan Mayo-Wilson is an Associate Professor of Epidemiology at the UNC Gillings School of Global Public Health. His research focuses on: evaluating the benefits and harms of health interventions; improving methods for clinical trials and systematic reviews; and developing methods and interventions to increase research transparency and openness.
He is the Scientific Director for peer review of PCORI Research Reports and Associate Editor for Systematic Reviews for the American Journal of Public Health. Evan serves on the American Psychological Association Open Science and Methodology Expert Panel. He has co-authored multiple guidelines for reporting clinical trials and systematic reviews.
Joanne McKenzie
Joanne McKenzie is a Professor and Head of the Methods in Evidence Synthesis Unit within the School of Public Health and Preventive Medicine at Monash University. She leads a programme of research on methods for evidence synthesis, with some key areas of interest being methods to present and synthesize results when meta-analysis is not possible, statistical methods for analysing and meta-analysing results from interrupted time series studies, and the development of reporting guidelines for different evidence synthesis products.
Joanne co-led the PRISMA 2020 statement and contributed to the development of the ROB-ME tool for assessing risk of bias due to missing evidence in meta-analyses. She is an active contributor to Cochrane, including being a Co-convenor of the Statistical Methods Group and an author of several chapters of the Cochrane Handbook for Systematic Reviews of Interventions.
Matthew Page
Matthew Page is a Senior Research Fellow and Deputy Head of the Methods in Evidence Synthesis Unit within the School of Public Health and Preventive Medicine at Monash University. His research aims to improve the quality of systematic reviews of health and medical research.
Matthew has led many studies investigating the transparency, reproducibility and risk of bias in systematic reviews and the studies they include and has developed several methods to address these issues. For example, he co-led the development of the PRISMA 2020 statement, a highly cited reporting guideline for systematic reviews, was a member of the core group who developed the RoB 2 tool for assessing risk of bias in randomized trials, and led the development of the ROB-ME tool for assessing risk of bias due to missing evidence in meta-analyses. He is a member of Cochrane’s Methods Executive, the group that is responsible for directing the methods used within Cochrane Reviews.
Jonathan Sterne
Jonathan Sterne is a Professor of Medical Statistics and Epidemiology at the University of Bristol, Director of the NIHR Bristol Biomedical Research Centre and co-Director of Health Data Research UK South-West. During the pandemic he led a large team that produced novel, high impact research on COVID-19 vaccination and long COVID, based on analyses of up to 55 million people. He has a longstanding interest in methodology for systematic reviews and meta-analysis. He co-led development of the RoB 2 tool for assessing risk of bias in randomised trials and the ROBINS-I and ROBINS-E tools for assessing risk of bias in non-randomized studies of interventions and exposures, respectively. Jonathan is a former co-convenor of the Cochrane Bias Methods Group and has published influential papers on reporting bias in meta-analysis, meta-epidemiology, causal inference and statistical methodology.
Plenary presentations
Fixed-effects (plural) meta-analyses are the best choice for most systematic reviews
Jonathan Sterne, University of Bristol
Safeguarding research integrity using the Individual Participant Data (IPD) Integrity Tool
Kylie Hunter, University of Sydney
Use of generative artificial intelligence for various evidence synthesis tasks: what’s the evidence?
Justin Clark, Bond University
Driving improvements in systematic reviews: the role of GRADE
Sue Brennan, Monash University &
Zachary Munn, University of Adelaide
Fixed-effects (plural) meta-analyses may not be the best choice for most systematic reviews
Julian Higgins, University of Bristol
Using artificial intelligence to detect changes in clinical trial outcomes
Evan Mayo-Wilson, UNC Gillings School of Global Public Health, USA
Improving reporting of systematic reviews and primary research: launch of the Melbourne Branch of the Australasian EQUATOR Centre
Joanne McKenzie and Matthew Page, Monash University, Australia
Short talks
The PRIMER checklist for helping peer-reviewers detect issues in meta-analyses in systematic reviews of interventions: results from the development process
Elizabeth Korevaar, Monash University
Data transformation in mixed methods systematic reviews: a starting point
Lucylynn Lizarondo, University of Adelaide
Do we really need that many evidence synthesis types? Results from the Evidence Synthesis Taxonomy Initiative Scoping Review
Danielle Pollock, University of Adelaide
Contrast- and arm-synthesis models for network meta-analysis: does the choice of method matter?
Emily Karahalios, Centre for Epidemiology and Biostatistics, The University of Melbourne
Semi-automated methods for living guideline maintenance: simulation case studies on the 2023 international PCOS (polycystic ovary syndrome) guidelines
Darren Rajit, Monash University
The role of evidence synthesis in developing diagnostic criteria for conditions without obvious tests, biomarkers or reference standards
Sam White, University of Adelaide
Methods and tools for equity in systematic reviews
Natalie Strobel, Edith Cowan University, Australia
About the symposium organisers & the Methods in Evidence Synthesis Unit
The organisational leads for this symposium are Professor Joanne McKenzie and Dr Matthew Page from the Methods in Evidence Synthesis Unit (MESU).
Members of the organising committee responsible for shaping the program include:
Edoardo Aromataris | JBI, University of Adelaide
Justin Clark | Institute for Evidence-Based Healthcare (IEBH), Bond University
Tammy Hoffmann | Institute for Evidence-Based Healthcare (IEBH), Bond University
Kylie Hunter | NHMRC Clinical Trials Centre, University of Sydney, Australia
Vanessa Jordan | Cochrane New Zealand, University of Auckland, New Zealand
Zachary Munn | Health Evidence Synthesis, Recommendations and Impact (HESRI), University of Adelaide
Steve McDonald | Cochrane Australia, Monash University
Joanne McKenzie | Methods in Evidence Synthesis Unit, Monash University
Matthew Page | Methods in Evidence Synthesis Unit, Monash University
Natalie Strobel | Maladjiny Research Centre, Edith Cowan University
Heath White | Australian Living Evidence Collaboration, Monash University
Cochrane Australia is pleased to provide administrative and event support for this symposium.
About the Methods in Evidence Synthesis Unit (MESU)
The MESU is based within the School of Public Health and Preventive Medicine at Monash University. The Unit’s mission is to develop, evaluate and make accessible optimal statistical and research methodology for evidence synthesis.
The MESU team has led and contributed to major methodological developments in evidence synthesis, including reporting guidelines (PRISMA 2020, PRIOR, SWiM, and extensions to PRISMA 2020), risk of bias tools (ROB-ME, RoB 2), methods for synthesis when meta-analysis is not possible, methods for meta-analysing results from non-randomized studies, methods for overviews of systematic reviews, and studies examining reproducibility in systematic reviews and bias in the review process.
MESU staff regularly provide training to researchers, nationally and internationally, and collaborate on systematic reviews. MESU is funded through National Health and Medical Research Council (NHMRC) grants.
Linked event:
Assessing the risk of bias in non-randomised studies evaluating the effects of interventions | In-person workshop
Date and time: Friday 7 February 2025 from 9am to 4.30pm
Location: Conference Rooms, Monash University School of Public Health and Preventive Medicine, 553 St Kilda Road, Melbourne VIC 3004
Linked event: Evolving Methods for Evidence Synthesis of Health Research, Thursday 6 February 2025
Registration: Register here
Standard in-person: $490
Student in-person: $350
Any questions? Please email Matthew Page at matthew.page@monash.edu
Background
Non-randomized studies of interventions (NRSI) can provide evidence additional to that available from randomized trials about long term outcomes, rare events, adverse events and populations that are typical of real-world practice. However, limitations in the design, conduct, analysis and reporting of an NRSI may lead to underestimation or overestimation of the true effects of an intervention, which is known as bias. Therefore, a critical aspect when interpreting results from an NRSI is assessing whether features of the study may have placed the results at risk of bias, thus making them less trustworthy. The ROBINS-I (Risk Of Bias In Non-randomized Studies – of Interventions) tool provides a structured process for researchers to make risk-of-bias judgements. The tool is the gold standard approach for assessing the risk of bias in NRSI, is widely used (the 2016 version has been cited >13,000 times), and has been endorsed by Cochrane for use in their systematic reviews. A new version of ROBINS-I that incorporates several improvements and innovations will be launched in early 2025, together with an online implementation that will facilitate its use.
Who is this workshop for and what does it cover?
This in-person workshop on how to assess the risk of bias in NRSI is designed for those undertaking systematic reviews, synthesizing evidence for guidelines, or generally interested in learning how to appraise studies.
The presenters will describe key features of version 2 of the ROBINS-I tool for cohort studies in which intervention groups are allocated during the course of usual treatment decisions. We will highlight the updates and improvements to the original (2016) version of the tool and show how it has been implemented in online software for use by review authors. The workshop involves a mix of presentations, interactive discussions and hands-on exercises. Electronic copies of the slides will be provided on the day of the workshop. Details of what will be covered are outlined below.
Outline:
• Introduction to risk-of-bias assessment
• Introduction to the ROBINS-I tool
• Preliminary considerations, including specification of the causal effect of interest
• Risk of bias due to confounding
• Risk of bias in classification of interventions
• Risk of bias in selection of participants into the study (or into the analysis)
• Risk of bias due to deviations from intended interventions
• Risk of bias due to missing data, measurement of the outcome and selection of the reported result
• Software implementation of ROBINS-I version 2.
Course facilitators
Jonathan Sterne, University of Bristol, UK: Jonathan is a Professor of Medical Statistics and Epidemiology at the University of Bristol, Director of the NIHR Bristol Biomedical Research Centre and co-Director of Health Data Research UK South-West. During the pandemic he led a large team that produced novel, high impact research on COVID-19 vaccination and long COVID, based on analyses of up to 55 million people. He has a longstanding interest in methodology for systematic reviews and meta-analysis. He co-led development of the RoB 2 tool for assessing risk of bias in randomised trials and the ROBINS-I and ROBINS-E tools for assessing risk of bias in non-randomized studies of interventions and exposures, respectively. Jonathan is a former co-convenor of the Cochrane Bias Methods Group and has published influential papers on reporting bias in meta-analysis, meta-epidemiology, causal inference and statistical methodology. Read more about Jonathan’s research interests here.
Julian Higgins, University of Bristol, UK: Julian is a Professor of Evidence Synthesis at the University of Bristol. He has a long-standing interest in the methodology of systematic reviews and meta-analysis. He is the author of over 350 publications, collectively cited more than 350,000 times. Julian’s contributions include: a Bayesian approach to network meta-analysis; the I-squared statistic to quantify inconsistency across studies in a meta-analysis; simple prediction intervals for random-effects meta-analysis; a general framework for individual participant data meta-analysis; and risk-of-bias assessment tools for clinical trials and other study designs. Julian is a past President of the Society for Research Synthesis Methodology and has co-edited the Cochrane Handbook for Systematic Reviews of Interventions since 2003. He is also co-author of the Wiley textbook Introduction to Meta-analysis and co-editor of the 3rd edition of the Wiley textbook Systematic Reviews in Health Research: Meta-analysis in Context. Read more about Julian’s research interests here.
Matthew Page, Monash University: Matthew is a Senior Research Fellow and Deputy Head of the Methods in Evidence Synthesis Unit. His research aims to improve the quality of systematic reviews of health and medical research. He has led many studies investigating the transparency, reproducibility and risk of bias in systematic reviews and the studies they include and has developed several methods to address these issues. For example, he co-led the development of the PRISMA 2020 statement, a highly cited reporting guideline for systematic reviews, was a member of the core group who developed the RoB 2 tool for assessing risk of bias in randomized trials, and led the development of the ROB-ME tool for assessing risk of bias due to missing evidence in meta-analyses. He is a member of Cochrane’s Methods Executive, the group that is responsible for directing the methods used within Cochrane Reviews. Read more about Matthew’s research interests here.
Joanne McKenzie, Monash University: Jo is a Professor and Head of the Methods in Evidence Synthesis Unit. She leads a programme of research on methods for evidence synthesis, with some key areas of interest being methods to present and synthesize results when meta-analysis is not possible, statistical methods for analysing and meta-analysing results from interrupted time series studies, and the development of reporting guidelines for different evidence synthesis products. She co-led the PRISMA 2020 statement and contributed to the development of the ROB-ME tool for assessing risk of bias due to missing evidence in meta-analyses. She is an active contributor to Cochrane, including being a Co-convenor of the Statistical Methods Group and an author of several chapters of the Cochrane Handbook for Systematic Reviews of Interventions. Read more about Jo’s research interests here.
Organisers
The workshop is organised by the Methods in Evidence Synthesis Unit (MESU), which sits within the School of Public Health and Preventive Medicine at Monash University; see ‘About the Methods in Evidence Synthesis Unit (MESU)’ above for details of the Unit’s mission, research program and training activities. MESU is funded through nationally competitive NHMRC and ARC grants.