5 Software Tools to Make Grad School in Epi Better

As epidemiology and public health graduate students, a good number of us spend almost as much time on computers crunching data as we do watching YouTube. We all have our favorite data analysis tools installed: R, Stata, SPSS, SAS, JMP, WinBUGS, MATLAB… we use Dropbox to sync and back up files, Google Docs to collaborate, EndNote or Papers to manage our PDFs and citations, and Evernote to manage our notes.

[gif] Friday afternoons in Purvis Hall

But aside from the famous tools we all know and love, there are a lot of awesome software tools and plugins out there that can make our lives just a little easier. You want to search for 50 different keywords in 50 different windows at the same time? There’s a plugin for that (Chrome). You want to download citations on the go? This button is indispensable (Chrome, Firefox). You want to force that window to stay on top so you don’t have to flip back and forth? Download a little utility (Win, Mac).

Here are 5 software tools that have made my life just a little bit better:


Operational research: bridging theory and practice

One of the things we are told as students in public health and epidemiology is that our work has real-life implications and will help in making better decisions in practice. In the first week of May, a group of us from very diverse backgrounds, spanning academia and field work, participated in a week-long course on operational research methods offered through McGill’s Global Health Programs and partners. This course gave us a chance to see how exactly that gap between academia and practice can be bridged. Operations research is a term with broad scope, used by the military, industry, and the public sector. The objective of this course was to give us insight into how analytic methods can be used to guide planning and decision-making in global health operations, particularly in low- and middle-income countries. The workshops were guided by Dr. Rony Zachariah and Dr. Tony Reid of MSF, Dr. Ajay Kumar of The Union, and Dr. Srinath Satyanarayana of McGill University. Below are some ideas worth sharing that participants in the course from our department picked up:

Ebola treatment unit (ETU) run by Médecins Sans Frontières (MSF). Photo: UNMEER/Simon Ruf, released under Creative Commons.

The simplicity of operational research: simple solutions for important issues – Vincent Lavallée (Public Health)

Like many others, I was new to the field of operational research when beginning the course. My greatest takeaway was the potential for simple solutions when tackling difficult questions. A common tendency among academics is a stubbornness that demands the perfect study design, the randomized controlled trial, often described as the holy grail of epidemiological research. While it is very important to identify potential biases and errors in reporting when conducting a study, gold-standard RCTs are unfortunately rarely feasible in the field.

What I enjoyed most about this course was how it highlighted the use of natural experiments and creative solutions to answer questions about health care implementation and utilization in low-resource settings. While I was finishing my public health degree, one class required us to write proposals for theoretical research projects. Many groups got caught up in trying to answer all the questions, resulting in increasingly complex study designs. It was refreshing to see how operational research teams from MSF take on one or two very pointed questions and develop simple yet elegant solutions to answer them. In doing so, they manage to change policy and current practice in these settings.

The importance and challenges of publication in operational research – Marzieh Ghiasi (Epidemiology)

One of the interesting topics covered in the course was the important role that publication can play in operational research. In academia, for better or worse, the mantra ‘publish or perish’ exists in part because publications are a measure of productivity. In implementation settings, the objectives and pressures are different, and publication is not a priority. In fact, projects are often implemented by governments and agencies without a strong empirical framework or post-hoc analysis, and the people doing the implementation may or may not be trained in writing scientific publications. The course instructors highlighted how conducting and publishing operational research can provide both an evidence-based road map and a dissemination tool. Consequently, the capacity to conduct operational research is built not only by training people to develop protocols and collect data, but also by training them to publish, and to do it well. The presenters gave the example of a course by The Union/MSF focused on developing these skills.

We had a hands-on overview of how to use EpiData, a free and open software package for systematic data entry that is well suited to resource-constrained settings. We also got an overview of how the publication process works: for example, the often overlooked but important task of actually reading and adhering to author guidelines before submitting a manuscript to a journal! One of the most interesting things I took away from the workshops was the idea of ‘inclusive authorship’ in operational research, which is critical in projects that involve dozens and dozens of people in design, implementation, data collection, and analysis. The instructors recalled their own experiences of chasing authors and contributors down by email versus bringing dozens of people into a room for a couple of days to write a paper together (the latter works better!). Bringing thirty-some people together to write a paper is, of course, an operational challenge in itself. But, as this paper showcases, it is possible and should be done to ensure fairness and engagement.

The untapped potential of operational research – Marc Messier-Peet (Public Health)

When I first glanced at the course outline for this operational research course, I felt a wave of relief come over me. Yes, people are researching implementation science, and yes, people acknowledge the potential gains it can bring to the field of public health. Delivered by an exceptional team of operational research experts, the course was an excellent crash review that would appeal to anyone interested in strengthening health systems. Among the many things I took away was how to improve routine data collection by streamlining it and making it as user-friendly as possible, ensuring benefits for researchers and decision makers alike. We were also shown that data collection is not justified in itself: in operational settings, there is an ethical imperative for publicly funded researchers to make sure that any data collected answers a relevant question and that the final work is disseminated to those best suited to use it.

Focusing on collaboration and partnership between stakeholders, the course underlined how important it is to build relationships all along the operational research trajectory. With the international development community placing greater emphasis on impact evaluation and donor accountability, operational research can help identify the adjustments needed to improve under-performing health systems. Perhaps we in Canada could benefit from turning the operational research lens inwards and developing our capacity to see how our own institutions could perform better? The questions raised by an operational research approach are ones that need to be asked, and they provide an opportunity for engaged researchers to bridge the ‘know-do gap’ and see their work make a real difference in people’s lives.