Research Methods
Better conduct and reporting of studies
Chair: Seneca Fitch
Co-Chair: Paul Whaley
100+ members
A healthy evidence pipeline depends on an inflow of well-reported studies, conducted using valid methods.
Our Research Methods Working Group is developing guidance, checklists, and appraisal tools to help improve the quality of primary studies and reduce research waste.
Current Projects
- The FEAT framework for evaluating study appraisal tools. Choosing, modifying, or designing appraisal tools for use in systematic reviews (and other contexts!) can be very challenging. The FEAT (Focus, Extent, Application, and Transparency) framework is designed to help by making explicit the core elements an appraisal tool ought to cover. FEAT was introduced in a recent comprehensive guide to assessing risk of bias (here), but a short commentary on FEAT, with plenty of examples of its application, would be useful in toxicology and environmental health, where we find these issues particularly challenging.
- PECO statements for chemical exposures. PECO (Population, Exposure, Comparator, Outcome) statements are useful for both primary studies and systematic reviews. While a guidance paper on PECO development already exists, the examples used in that paper (relating to noise exposure) are not necessarily easy to translate to the toxicology context. It has therefore been suggested that we develop a follow-up paper that expands on the advantages of careful use of PECO statements, with additional case examples involving chemical exposures (a minimal illustrative sketch of the PECO structure appears after this list).
- Encouraging correspondence and post-publication peer review. There is often insufficient post-publication critique of published research, whether as correspondence to journals or via post-publication peer review websites such as PubPeer. At the same time, writing critiques of research for public reading can be a very good introduction to academic writing for early-career researchers, and a practical output of a journal club. There is also an opportunity to investigate empirically some of the barriers to post-publication critique, such as article processing charges (APCs) for correspondence in journals.
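To make the PECO idea concrete, here is a minimal sketch, in Python and purely for illustration, of how a review question about a chemical exposure can be broken into its four PECO elements. The substance, population, and outcome named here are invented placeholders, not examples drawn from the existing guidance paper or from any planned follow-up.

```python
# Hypothetical illustration only: a PECO statement for a chemical exposure,
# expressed as a simple data structure. The specific example is invented.
from dataclasses import dataclass


@dataclass
class PECO:
    population: str
    exposure: str
    comparator: str
    outcome: str

    def as_question(self) -> str:
        """Render the four PECO elements as a single review question."""
        return (
            f"In {self.population}, does {self.exposure}, "
            f"compared with {self.comparator}, affect {self.outcome}?"
        )


example = PECO(
    population="pregnant laboratory rodents and their offspring",
    exposure="oral exposure to bisphenol A at environmentally relevant doses",
    comparator="vehicle-only controls",
    outcome="male anogenital distance at birth",
)

print(example.as_question())
```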
Evidence Synthesis
Know what the evidence is saying
Acting Chair: Sebastian Hoffmann
100+ members
One study on its own may not mean very much. Many studies together give a fuller picture, but that picture can be difficult to make out in a noisy, confusing, and contradictory body of evidence.
Our Evidence Synthesis Working Group is refining methods for systematically reviewing and mapping research, to ensure we are able to make best use of the best evidence in policy-making.
Open Science
More accessible, reusable research
Chair: Emily Golden
Co-Chair: Paul Whaley
100+ members
Most toxicological research is still being published in a paper-based paradigm, with data trapped in inaccessible PDFs and behind paywalls. This makes it difficult to find and reuse data and to reproduce study findings.
Our Open Science Working Group is promoting a range of approaches to improving the efficiency with which scientific data can be reused, and to the use of AI in research, so that our ability to use evidence scales with the rate at which it is generated.
Current Projects
- Enhancing study manuscripts with machine-readable metadata. As many of you may be (painfully?) aware, it is really difficult and time-consuming to abstract data from manuscripts for reuse in systematic reviews and other projects. Some of you may be abstracting data for entry into systems like HAWC. A start-of-pipe solution is to improve the quality and breadth of the study metadata provided by researchers. This project is about describing what metadata is and what sort of metadata should be provided, and doing some blue-sky thinking on how it could be provided, so that data from a study can be ingested automatically into whatever data system you might be using (a rough illustrative sketch follows this list). This will revive a dormant manuscript from the precursor to this WG.
- Introductory course on open science for toxicology and environmental health. Open science is new and fancy, so not many people have yet had the opportunity to grasp what it involves and how it affects them. EBTC has already run an open science course at SOT 2024 and is running another at Eurotox 2024. This would be an opportunity for EBTC members to design a course that works for them, and that could potentially be distributed as educational material for others.
- Structured reading guide for developing comprehension skills. Reading scientific papers is difficult, but it is a skill we are too often expected to develop passively. This project seeks to provide structured reading guides to help scientists engage with manuscripts. It should be of value to everyone, but perhaps especially to early-career researchers. The guides could be tested in the context of EBTC’s plans for a journal club, and could form the basis of some education and training materials.
- Encouraging correspondence and post-publication peer review. There is often insufficient post-publication critique of published research, whether as correspondence to journals or via post-publication peer review websites such as PubPeer. At the same time, writing critiques of research for public reading can be a very good introduction to academic writing for early-career researchers, and a practical output of a journal club. There is also an opportunity to investigate empirically some of the barriers to post-publication critique, such as article processing charges (APCs) for correspondence in journals.
- Annual hackathons. We are discussing sponsoring the annual hackathons (more details soon), potentially for the next three years, to facilitate EBTC member involvement in developing open science and AI tools that help automate evidence synthesis, study evaluation, and other tasks involved in speeding up the reuse and evaluation of research data.
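As a rough illustration of what "machine-readable metadata" could mean in practice, the sketch below (Python, serializing to JSON) shows one hypothetical way that core study details might be expressed so that software can ingest them without manual abstraction from a PDF. Every field name and value is an invented placeholder, not a proposal, an existing standard, or the format used by HAWC or any other system.

```python
# Purely hypothetical sketch of machine-readable study metadata; all fields
# and values are invented placeholders for illustration only.
import json

study_metadata = {
    "study": {
        "title": "Example 90-day oral toxicity study",  # placeholder title
        "doi": "10.0000/example-doi",                    # placeholder identifier
    },
    "test_system": {
        "species": "rat",
        "strain": "Sprague-Dawley",
        "sex": ["male", "female"],
    },
    "exposure": {
        "substance": "example substance",
        "route": "oral gavage",
        "doses_mg_per_kg_bw_day": [0, 10, 100, 1000],
        "duration_days": 90,
    },
    "endpoints": [
        {"name": "body weight", "unit": "g"},
        {"name": "relative liver weight", "unit": "% body weight"},
    ],
}

# Serializing to JSON yields a structure that downstream tools could ingest
# automatically, rather than relying on manual data abstraction.
print(json.dumps(study_metadata, indent=2))
```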
Evidence & Decisions
Using evidence in policy-making
Chair: Emily Senerth
Co-Chair: Paul Whaley
100+ members
It is one thing to know what the evidence says about the health risks posed by an environmental exposure, but quite another to know how best to respond to this evidence.
Our Evidence & Decisions Working Group is focused on the systematic handling of decision-relevant information that goes beyond certainty about health effects, so that decisions and policy are transparent and fully informed by all relevant considerations.