The What Works Network helps policymakers and practitioners across the UK make informed decisions by providing toolkits that summarize evidence on various interventions. Despite their value, these toolkits are challenging to develop: interventions can affect multiple outcomes, impact different population segments, and vary in effect over time, while uncertainties in the evidence remain. My team conducted a study to evaluate how well the toolkits communicate evidence and meet user needs. Over 450 decision-makers from seven What Works Centres and Conservation Evidence participated, revealing that their top priorities are understanding the effectiveness of interventions and the quality of the supporting evidence. Policymakers additionally prioritized information on financial costs and potential harms, while practitioners valued detailed evidence-quality metrics, such as study type and number of studies.
To optimize toolkit design, we also surveyed 200 members of the general public, who shared similar preferences, suggesting they could be a helpful group for future toolkit testing. The study also explored the effectiveness of icons, with a microscope symbol emerging as the most recognizable for “evidence quality.” Users preferred detailed breakdowns of the evidence but cautioned against overly complex summaries that could reduce engagement. This feedback guides ongoing efforts to improve the toolkits, ensuring they are clear, accessible, and useful for informed decision-making.