Escaping the Red Queen

Health care practice is as varied as the patients we care for. Not only do we engage in clinical practice, but some of us also conduct educational research, while others experiment and innovate as clinicians, researchers, and health policy makers. To keep up with evolving knowledge, we must also continually take up new care processes and technologies, learn new skills, and enhance efficiency. I’m often reminded of Lewis Carroll’s Through the Looking-Glass, where the Red Queen tells Alice: “it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!” The truth of the matter is that change is constant and we are always going somewhere else. Running twice as fast is not an option.

I see two implications for the Journal of Research in Interprofessional Practice and Education (JRIPE). One is that JRIPE must be as diversified as the daily work of its readers. So we will expand our readership and encourage publication, for a diverse audience, on the science of interprofessional education and what it can do to improve health care delivery: workforce policy, workplace and community-based education, interprofessional clinical and educational models, international issues in interprofessional education, the list goes on…

Second, JRIPE cannot stop at increasing access to refined versions of research. Published research is only the tip of an iceberg in an ocean of uncertainty. In the midst of all the humming and the buzzing of everyday practice, there is that complex and beautiful puzzle, if not mystery: somehow, somewhere, between people, there is a capillary force that attracts and sometimes holds participants together in spite of their varied disciplinary boundaries, professional self-interests, identities, goals, knowledge, values, cultures, and differential access to resources. Getting to the parts of that puzzle, and getting them right, takes more than what a polished manuscript can tell.

Much of the fun and not so fun parts of research are in the “war stories” that defy neat categorizations and simple linear equations: the false starts, the frustrations with messy, confusing problems of everyday practice that go into collecting data and getting it ready for analysis; the adjustments, the changes, the improvisations, the muddling through; all the juicy stuff that goes into negotiating with insufficient information, experimenting, failing, and trying again.

The best way to get at those “war stories”, and the practical know-how behind them, is through conversations and the sharing of narratives among professionals. And so we need to open discussions about what went into the making of research no matter how big or small its purview. This is an action we can take immediately through IPELOG.

IPELOG offers a space for dialogue, knowledge sharing, and learning. We invite you to share your comments on what JRIPE publishes. Select the articles that interest you and leave us a comment: tell us what you think, give us your opinions, share your success and failure stories, your solutions to problems, and your feedback about tactics and methods. Collectively, we can get our puzzles right without having to run twice as fast.

Hassan Soubhi.

One thought on “Escaping the Red Queen”

  1. The study by D’Eon, Proctor, Cassidy, McKee, & Trinder (2010), titled “Evaluation of an Interprofessional Problem-based Learning Module on Care of Persons Living with HIV/AIDS,” greatly contributes to the knowledge base of interprofessional education (IPE), as it is the first study to coordinate students of multiple disciplines to simulate real-life health care teams. Richardson et al. [1] were only able to place physiotherapy and occupational therapy students together in a community health setting, with casual interaction with medical and nursing students, due to scheduling and space difficulties.

    D’Eon et al. established trustworthiness through triangulation of methods, collecting data from many different sources (self-assessments, pre/post-tests, focus groups, and satisfaction surveys). Sources and researchers could have been triangulated further by including tutors in the focus groups and by having more than one researcher code the data [2]. The focus group samples appeared too small to achieve redundancy and were not representative of all disciplines (medical students, n = 3; nutrition students, n = 2), though the authors did include a comments section in the self-assessment and satisfaction surveys, which were obtained from all participants. Richardson et al. [1] obtained rich IPE data by analyzing students’ reflective journals in addition to focus groups and surveys.

    Results indicated that students learned more about HIV/AIDS than about interprofessional teams and other professionals’ roles. As the composition of each problem-based learning (PBL) group differed, the authors could have compared groups to see whether this finding changed depending on the type and number of professions involved.

    D’Eon et al. did not identify whether subjects had previous PBL experience. This may have negatively affected results, as novice groups may have spent more time establishing group function. Regardless, effect sizes were large. The authors did acknowledge that the study needs to be replicated at other institutions with longer-term groups.

    The pre/post-test measure was perhaps not the best tool to evaluate the students’ learning, as the authors acknowledged that they did not establish strong reliability and validity for it. Reliability values were reported without confidence intervals, making it difficult to determine the precision of the estimated reliability; test-retest reliability was not assessed, and internal consistency values were low. The authors did, however, have four experts develop the test, and the other data collection methods showed promising results.

    This study focused on the development of the learner. Future research could investigate whether learning was transferred to practice resulting in improved patient outcomes.

    1. Richardson, J., Letts, L., Childs, A., Semogas, D., Stavness, C., et al. (2010). Development of a community scholar program: An interprofessional initiative. Journal of Physical Therapy Education, 24, 37–43.
    2. Letts, L., Wilkins, S., Law, M., Stewart, D., Bosch, J., et al. (2007). Guidelines for critical review form: Qualitative studies (version 2.0). [Internet]. Hamilton, ON. URL: [September 13, 2010].
