JMIR Publications

JMIR Medical Education



Published on 17.10.17 in Vol 3, No 2 (2017): Jul-Dec


    Original Paper

    Systems-Based Training in Graduate Medical Education for Service Learning in the State Legislature in the United States: Pilot Study

    1College of Medicine, University of Illinois at Chicago, Chicago, IL, United States

    2Library of the Health Sciences, University Library, University of Illinois at Chicago, Chicago, IL, United States

    3Department of Medical Education, College of Medicine, University of Illinois at Chicago, Chicago, IL, United States

    4Department of Internal Medicine, College of Medicine, University of Illinois at Chicago, Chicago, IL, United States

    5Department of Pediatrics, College of Medicine, University of Illinois at Chicago, Chicago, IL, United States

    6Department of Business Management, Brooklyn College, City University of New York, New York, NY, United States

    7Health Policy Administration, School of Public Health, University of Illinois at Chicago, Chicago, IL, United States

    *these authors contributed equally

    Corresponding Author:

    James Ronayne, MD

    Department of Pediatrics

    College of Medicine

    University of Illinois at Chicago

    Office 1435, CSB, MC856

    840 S Wood Street

    Chicago, IL, 60612

    United States

    Phone: 1 914 602 1868

    Fax: 1 312 413 8778



    ABSTRACT

    Background: There is a dearth of advocacy training in graduate medical education in the United States. To address this void, the Legislative Education and Advocacy Development (LEAD) course was developed as an interprofessional experience, partnering a cohort of pediatrics residents, fourth-year medical students, and public health students to be trained in evidence-informed health policy making.

    Objective: The objective of our study was to evaluate the usefulness and acceptability of a service-based legislative advocacy course.

    Methods: We conducted a pilot study using a single-arm pre-post study design with 10 participants in the LEAD course. The course’s didactic portion taught learners how to define policy problems, research the background of the situation, brainstorm solutions, determine evaluation criteria, develop communication strategies, and formulate policy recommendations for state legislators. Learners worked in teams to create and present policy briefs addressing issues submitted by participating Illinois State legislators. We compared knowledge and attitudes of learners from pre- and postcourse surveys. We obtained qualitative feedback from legislators and pediatric residency directors.

    Results: Self-reported understanding of the health care system increased (mean score from 4 to 3.3, P=.01), with answers scored from 1=highly agree to 5=completely disagree. Mean knowledge-based scores improved (6.8/15 to 12.0/15 correct). Pediatric residency program directors and state legislators provided positive feedback about the LEAD course.

    Conclusions: The LEAD approach demonstrated promising results for incorporating advocacy training into graduate medical education.

    JMIR Med Educ 2017;3(2):e18

    doi:10.2196/mededu.7730


    Introduction

    Since the 1970s, both US legislators and the public have shown diminished confidence in physician leadership [1,2]. In contrast, national health care and policy leaders are calling upon physicians to be trained in policy and advocacy in order to provide optimal care for their patients [3-5]. This shift in physician practice is emphasized by the Accreditation Council for Graduate Medical Education, whose milestones were implemented in 2015 as evaluation criteria for graduate medical education. For example, pediatric residents are expected to develop the ability to “advocate for quality patient care and optimal patient care systems” [6] and to “work in interprofessional teams to enhance patient safety and improve patient care quality” throughout their course of training [7].

    There are very few published studies of curricula that train health care professionals in advocacy to provide optimal patient care [5,8,9]. Studies of these curricula conclude that involvement in an advocacy course increased the learner’s likelihood of pursuing future advocacy and that involvement of legislators led to more meaningful policy results [8,9]. However, we found no curricular descriptions of learners partaking in a dialogic process with legislators to understand values and issues and then using knowledge brokering to arrive at policy solutions and recommendations.

    To address this void in health professional education, a multidisciplinary faculty committee at the University of Illinois at Chicago, USA, created the Legislative Education and Advocacy Development (LEAD) course to train pediatrics residents, public health students, and fourth-year medical students to think critically about health care, analyze policy, and communicate effectively about policy through the method of legislative briefing. The LEAD course sought to help learners to discern the actors and institutions involved in the policy-making process; critically examine the context of policy developments; appreciate how issues are placed on the policy-making agenda; understand the process of policy development, implementation, and modification; and apply dominant conceptual theories of the policy-making process to a critical health issue.

    The LEAD curriculum therefore drew from previously established advocacy training programs to provide learners with the tools to understand and engage in health policy making [8,9]. The LEAD course incorporated project-based learning to enhance the learner experience and cultivate competencies outside of the traditional classroom setting [10,11]. Advanced organizers, which reduce cognitive load by providing methodological scaffolding, were an important addition to the course [12,13]. However, the LEAD course’s key innovation was the incorporation of knowledge brokering: bringing health science professionals, state legislators, and other stakeholders together to facilitate knowledge interaction and intermediation in the service-based learning process [14,15]. This approach went beyond the traditional linear knowledge-delivery model because it was iterative and required active participation from all involved parties to develop new ideas and foster meaningful legislative action [16]. Our first aim with the LEAD curriculum was to measure learners’ demographics and changes in knowledge. We hypothesized that there would be significant improvement in our learners’ knowledge. Since there are very high correlations between symbolic political attitudes and political behaviors [17-19], our second aim was to measure learners’ attitudes before and after the course. We hypothesized that attitudes, which are symbolic in nature and thus resistant to change, would not shift significantly, though they might change slightly [18-20]. Our final aim was to gather feedback from all invested parties: learners, pediatric residency program directors, and state legislators. We hypothesized that our program would be well received and considered a valuable addition.


    Methods

    We used a single-arm pre-post study design to study the feasibility of the LEAD course and its impact on attitudes and knowledge among learners.

    Setting

    We purposively invited pediatrics residents, fourth-year medical students, and public health students by email to participate in the 2-week LEAD course. A pediatrics faculty member with 3 years of policy experience and a public health faculty member with 20 years of policy experience were the instructors for the course. The course and study were conducted in February 2016. We expected learners to spend about 40 hours per week on their work, divided as follows: 30 hours per week on modules and preparation with the group, and 10 hours per week on lectures and mock panels. Learners’ pre- and postcourse surveys were printed, self-administered, and anonymous to ensure privacy; completion of the surveys therefore could not affect learners’ grades in the course. Additionally, only approved members of the research team had access to the surveys to ensure confidentiality.

    We received ethical approval from the Institutional Review Board of the University of Illinois at Chicago (December 21, 2015, Research Protocol # 2015-1084). The study was consistent with the ethical standards of the Declaration of Helsinki. All learners provided verbal informed consent to participate in the study; this was obtained by the lead author and not recorded.

    Curriculum

    The curriculum had two parts: didactics and experiential learning.

    Learners participated in didactics, largely grounded in the works of Bardach, concerning background and landscape discovery, iterative formulation of problem statements, and decision-making criteria [12]. Learners were instructed in the use of an advanced organizer that contained the core elements of a policy brief: issue statement, background, landscape, options and analysis, and final recommendation. Multimedia Appendix 1 shows this advanced organizer. The course objectives (Multimedia Appendix 2) were based on the advanced organizer. The curriculum focused on training learners to support recommendations with evidence and to use the advanced organizer for structure. Emphasis was also placed on developing legislator-derived, value-based criteria to evaluate each option and produce a final recommendation. An interactive overview of the state-level policy-formulation process was also provided. Learners participated in the policy-formulation process with extensive faculty mentorship and discussion. Beyond guidance on creating policy brief documents, participants also honed their oral presentation skills.

    Concurrently, learners worked in 4 independent interdisciplinary teams to create briefs based on specific child health queries submitted directly from the state legislators. Learners collaborated both in live group sessions and virtually, cocreating briefs through Google Docs (Google Inc). Some examples of queries are lead abatement, gun control, access to home care services for disabled children, and licensure of professional midwives. In creating these briefs, learners used legislator values to create decision-making criteria, which guided research and policy analysis. Multimedia Appendix 3 shows an example of a decision-making chart. Learners presented their briefs during guided role play involving a panel of LEAD faculty and guest experts from the Department of Pediatrics and the School of Public Health, University of Illinois at Chicago. Additionally, participants identified and resolved common pitfalls encountered during the policy brief creation process [20]. The final product was a polished presentation with accompanying full-length and summarized policy briefs. Finally, learners formally presented their policy analysis and recommendations to state legislators and received feedback.

    Measures

    Knowledge

    We assessed knowledge with 15 questions on the pre- and postcourse surveys. These questions tested learners on factual data such as major US health care policies, components of a policy brief, and identification of state legislators and their governmental roles. Of these 15 questions, 13 were multiple choice questions with 4 to 12 answer choices, and 2 were free-response questions: “Who is your district’s state Senator?” and “Who is your district’s state Representative?” The highest possible correct total score was 15.
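    Scoring such a mixed multiple-choice and free-response instrument is simple tallying; the sketch below illustrates one way to do it in Python. The question keys, answer key, and legislator names are invented placeholders, not items from the actual survey.

```python
# Hypothetical scoring sketch for a 15-item knowledge survey; question
# identifiers and correct answers are invented for illustration only.
def score_knowledge(responses, answer_key):
    """Count correct answers; free-text items are matched case-insensitively."""
    correct = 0
    for q, key in answer_key.items():
        given = responses.get(q, "")
        if str(given).strip().lower() == str(key).strip().lower():
            correct += 1
    return correct  # out of len(answer_key), here 15

answer_key = {f"q{i}": "a" for i in range(1, 14)}  # 13 multiple-choice items
answer_key["q14"] = "Jane Doe"  # district's state senator (invented name)
answer_key["q15"] = "John Roe"  # district's state representative (invented name)

responses = dict(answer_key)   # a respondent who answered everything correctly
responses["q14"] = "jane doe"  # case differences still count as correct
print(score_knowledge(responses, answer_key))  # 15, the maximum score
```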

    Attitudes

    The pre- and postcourse surveys included 13 attitude questions with possible answers ranging from 1=strongly agree to 5=strongly disagree. We analyzed each question separately. Content was adapted from 2 previously reported questionnaires on medical students’ and residents’ attitudes [21,22]. The 13 questions are tabulated below. Questions 1 and 2 were adapted from Stafford et al [22], questions 3 through 5, 7, 8, and 10 through 13 were from Emil et al [21], and questions 6 and 9 are original to this study.

    Program Feedback

    Learners were asked questions concerning quantity, quality, and engagement in past and present health policy instruction via the pre- and postcourse surveys. Questions 1 and 2 were rated from 1=excellent to 4=poor, questions 3 and 4 were rated from 1=excellent to 4=N/A (ie, not applicable), and questions 5 and 6 were rated from 1=strongly agree to 5=strongly disagree. To further measure feasibility and gauge interest among pediatric residency program directors, we presented the curriculum as a workshop at the Association of Pediatric Program Directors 2016 annual meeting and collected feedback. In addition to open-ended feedback, the 9 pediatric residency program directors who viewed the presentation were asked “Would you want this type of experience at your institution?” State legislators were queried in an open-ended format regarding their experience.

    Analysis

    Due to an insufficient number of paired responses, we did not perform inferential statistical tests for the knowledge and attitudes questions. As applicable, we assessed program feedback data with a Wilcoxon signed rank test and otherwise assessed the feedback qualitatively for themes. We conducted statistical analyses using SAS version 9.4 (SAS Institute). All P values were 2-tailed. Thematic analysis of legislator and pediatric residency program director feedback was performed by 2 independent raters who evaluated all themes. Discrepancies were resolved by consensus.
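    For illustration, a Wilcoxon signed rank test on paired ordinal feedback scores can be run as follows. The authors used SAS version 9.4; this sketch uses Python's scipy.stats instead, with invented data rather than the study's actual responses.

```python
# Invented paired program feedback scores (1=excellent ... 4=poor) for 10
# learners; placeholders only, not the study's data.
from scipy.stats import wilcoxon

pre  = [3, 3, 4, 2, 3, 3, 4, 3, 2, 3]
post = [2, 2, 3, 2, 2, 3, 3, 2, 2, 2]

# Paired, 2-tailed Wilcoxon signed rank test (zero differences are dropped
# under the default zero_method).
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, P = {p:.3f}")
```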


    Results

    A total of 10 learners provided pre- and postcourse surveys. We received 9 responses for demographic data (90% response rate), 5 precourse knowledge surveys (50% response rate), 7 postcourse knowledge surveys (70% response rate), 8 precourse attitude surveys (80% response rate), 7 postcourse attitude surveys (70% response rate), and 10 sets of program feedback data (100% response rate). However, many of the pre- and postcourse attitude surveys were incompletely filled out by learners; on further inspection, this appeared to be partly due to the survey layout, with some questions printed on the back of the page. We received qualitative feedback from 4 state legislators and 9 pediatric residency directors. Table 1 shows the demographics and characteristics of responders.

    Table 1. Demographics and characteristics of participants in the Legislative Education and Advocacy Development course (n=10).

    Knowledge

    Learners’ scores improved from a mean of 6.8 out of 15 to 12.0 out of 15 by the end of the course (Figure 1). The pre- and postcourse 95% confidence intervals did not overlap: the lower limit of the postcourse knowledge score (10.49) exceeded the upper limit of the precourse knowledge score (9.89), a pattern consistent with improved knowledge. As there were only 3 sets of paired responses, we could not conduct an analysis with P values.
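    The confidence-interval comparison described above can be reproduced in outline as follows. The individual scores here are invented so that the group means match the reported 6.8 and 12.0; the resulting interval limits therefore differ from the published 9.89 and 10.49.

```python
# Invented knowledge scores (out of 15): 5 precourse and 7 postcourse
# responders, with means matching the reported values. Placeholder data only.
import statistics
from math import sqrt
from scipy.stats import t

def mean_ci95(scores):
    """Return (mean, lower, upper) of a 95% CI based on the t distribution."""
    n = len(scores)
    m = statistics.mean(scores)
    half = t.ppf(0.975, n - 1) * statistics.stdev(scores) / sqrt(n)
    return m, m - half, m + half

pre  = [5, 6, 7, 8, 8]
post = [11, 11, 12, 12, 13, 12, 13]

pre_m, pre_lo, pre_hi = mean_ci95(pre)
post_m, post_lo, post_hi = mean_ci95(post)

# Non-overlapping intervals: the postcourse lower limit sits above the
# precourse upper limit, suggesting improvement without a formal paired test.
print(post_lo > pre_hi)
```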

    Attitudes

    Table 2 highlights the pre- and postcourse mean attitude scores. Attitudes were generally consistent from the pre- to postcourse surveys. Of the 13 items, 2 showed changes of 0.50 or more, toward greater recognition of the importance of health policy (question 6) and that the health care system should be government controlled rather than free market (question 9).
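    Flagging items whose mean shifted by 0.50 points or more is simple arithmetic; a sketch with invented per-item means (keyed by question number) follows. Only the flagging logic, not the data, reflects the study.

```python
# Invented per-item mean attitude scores (1=strongly agree ... 5=strongly
# disagree) for a subset of the 13 items; placeholder values only.
pre  = {1: 1.8, 6: 2.1, 9: 3.0, 12: 2.4}
post = {1: 1.9, 6: 1.5, 9: 2.4, 12: 2.1}

# Per-item change from pre- to postcourse, rounded to avoid float noise.
deltas = {q: round(post[q] - pre[q], 2) for q in pre}

# Items that moved 0.50 points or more in either direction.
shifted = {q: d for q, d in deltas.items() if abs(d) >= 0.50}
print(shifted)  # here, questions 6 and 9 are flagged
```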

    Program Feedback

    Table 3 shows the pre- and postcourse means for feedback measures of the LEAD course. Self-reported understanding of the health care system significantly improved, with mean Likert scores improving from 3.0 (fair) to 2.3 (good) (P=.01). Additionally, learners reported that health care policy instruction prior to the LEAD course was “little” in quantity and only “fair” in quality. Learners agreed they would be more likely to engage in health policy, and to recommend that a colleague engage in health policy learning, than they would have been 1 month prior to the end of the course.

    Figure 1. Mean and 95% CIs of pre- and postcourse knowledge scores.
    Table 2. Mean pre- and postcourse attitude scores, with number of responders.
    Table 3. Mean pre- and postcourse feedback measures of the LEAD course.

    We queried 4 state legislators about their experience with the LEAD course, and their responses were positive. Specifically, 2 state legislators expressed themes of efficacy. For example, 1 legislator wrote that this was “a thoughtful and well-researched brief that greatly improved my understanding in the area.” All state legislators expressed a desire to continue participating. For example, 1 legislator wrote, “I look forward to working with the learners again next year.” Of 9 pediatric residency program directors queried at the national conference, 8 (89%) said yes and 1 (11%) said maybe, regarding their desire for this type of course at their institution. Qualitatively, they found the experience insightful, were interested in viewing the didactics, and would like to incorporate LEAD into their training program.


    Discussion

    Principal Findings

    The hypotheses for our LEAD course pilot study were all supported. We found that knowledge improved from pre- to postprogram. We found that attitudes were generally consistent from pre- to postprogram. We found that the pilot was well received by learners who took the course, pediatric residency program directors who may choose to implement the course, and the state legislators who participated in the course.

    Knowledge improved meaningfully, with learners’ mean scores rising from 6.8/15 to 12.0/15 on the postcourse questionnaire. Knowledge outcomes in health policy training programs for medical students and residents have to date been self-assessed [5,9]. One such study, in which learners self-assessed their knowledge before and after course completion across several topics, including quality of and access to care, Medicaid and Medicare, and the role of government in health policy, found a statistically significant increase in 5 areas of knowledge [9]. Both self-assessment and our objective testing have shown improvement in knowledge, and our findings within the LEAD program were consistent with these other approaches.

    As hypothesized, scores on attitude questions for LEAD learners did not generally change between pre- and postcourse. This was likely a product of the already extreme Likert responses at baseline and the small sample, which self-selected into a policy course. Categorically stable attitudes that are held over time tend to better predict behavior than attitudes that change [23]. Two questions demonstrated variation: question 6 (greater recognition of the importance of health policy) and question 9 (health care system should be government controlled rather than free market), which both moved 0.50 points along the scale. While the cohort of learners who partook in the LEAD course generally displayed categorically stable attitudes, one potential caveat to this stability and general trend toward “progressive” attitudes was an apparent internal inconsistency between learners’ signaled general support for universal health care and their support for financial means testing. Scores for question 12 showed that learners were more likely to consider “financial means testing” after the course. Although this may reflect a dichotomy between principles and the means of achieving principled goals, policy “targeting” (eg, financial means testing) is not necessarily opposed to universalism [24]. It is possible that these learners were signaling greater nuance in their understanding of redistributive policy, a product of engaging with contradictory forces in a highly complex system [25].

    Measuring attitudes is important, since attitudes may correlate closely with long-term behavioral outcomes in general [17-19], and specifically are thought to be indicators of health professional behaviors [26]. A previous study gauged the attitudes of learners in California, USA and Ontario, Canada [21]. The LEAD cohort of learners more closely reflected the participants of the Ontario than of the California cohort, but the LEAD learners were more agreeable than both the Ontario and California cohorts. For instance, both Ontario and California learners “agreed” while LEAD learners “strongly agreed” that they planned to become involved and take leadership in health care policy issues as a physician. We suggest that demography and self-selection into a policy course are possible reasons for attitudes discrepant with those previously reported [21].

    Another focus of our study was to measure feedback from the 3 key types of players in the LEAD course: learners who took the course, pediatric residency program directors who may choose to implement the course, and the state legislators who participated in the course. Feedback was broadly positive from all 3 groups. More specifically, evaluative data from learners suggested that their understanding of the health care system improved and that their health care policy training prior to our course was quite limited. In addition, the learners indicated a “good” likelihood of both engaging in health policy activities and recommending that a colleague engage in health policy learning. This is important because learners’ subjective opinions about a course are associated with both long-term behavioral change and satisfaction with their education [27]. The feedback data from pediatric residency program directors are a measure of external validation, as these program directors were from different academic centers and therefore may have provided greater objectivity concerning programmatic strengths [28]. As demonstrated by responses from the pediatric residency program directors, LEAD would be a desirable addition to other pediatric residency programs. The LEAD curriculum can be easily exported to diverse residency programs, as it has no specific geographical or institutional requirements.

    Although legislators have previously signaled a desire to work with undergraduates, none have been surveyed in the context of graduate medical education [29]. As demonstrated by the positive response of the state legislators, it is reasonable to assume that we brokered a meaningful interaction between the learners and legislators. Therefore, it is likely that state legislators who are interested in improving the health of their communities would be willing to participate in future iterations of LEAD. To further institutionalize this approach, educational leaders can work with legislative leaders and their staff to strengthen the didactic and formative learning approach. For instance, in Illinois, the LEAD leadership team worked closely with the leaders of the legislative caucuses—House and Senate Democrats and Republicans—to identify issues and active bills that might serve as centerpieces for engaged interprofessional service-based learning.

    Limitations

    This study has several limitations. First, the sample size was small. Second, we did not use a control group. This limited the ability to assess attitudes and knowledge of those medical students, residents, or public health students not participating in an intensive advocacy experience. Third, the lack of responses limited the ability to perform certain inferential statistical tests. Fourth, we did not collect data on how learners used their time; this would have been valuable to examine and perhaps compare with other advocacy training programs. Future research should study this topic with a larger sample and a control group.

    Conclusion

    The LEAD course showed promising results as an acceptable and useful means of incorporating advocacy training into graduate medical education in the United States. The LEAD curriculum should be considered by institutions and programs seeking to help generate a new cadre of policy leaders from within the health professions.

    Acknowledgments

    The authors wish to acknowledge and thank Leana Wen, MD MSc, Health Commissioner of Baltimore City, and Fitzhugh Mullan, MD, Murdock Head Professor of Medicine and Health Policy, George Washington University, for their guidance in creating this course, and for their years of dedication to the health policy training of physicians and allied health professionals.

    Funding for course development was provided by a grant from the Institute for Policy and Civic Engagement.

    Financial support for the open access publishing fee for this article was provided by the Research Open Access Publishing (ROAAP) Fund of the University of Illinois at Chicago.

    This topic was previously presented at the Association of Pediatric Program Directors Annual Meeting, New Orleans, LA, USA, April 2016.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Advanced organizer.

    PDF File (Adobe PDF File), 38KB

    Multimedia Appendix 2

    Course objectives.

    PDF File (Adobe PDF File), 11KB

    Multimedia Appendix 3

    Decision-making chart.

    PDF File (Adobe PDF File), 17KB

    References

    1. Blendon RJ, Benson JM, Hero JO. Public trust in physicians--U.S. medicine in international perspective. N Engl J Med 2014 Oct 23;371(17):1570-1572. [CrossRef] [Medline]
    2. Schlesinger M. A loss of faith: the sources of reduced political legitimacy for the American medical profession. Milbank Q 2002;80(2):185-235 [FREE Full text] [Medline]
    3. Pellegrino ED, Relman AS. Professional medical associations: ethical and practical guidelines. JAMA 1999 Sep 08;282(10):984-986. [Medline]
    4. Earnest MA, Wong SL, Federico SG. Perspective: Physician advocacy: what is it and how do we do it? Acad Med 2010 Jan;85(1):63-67. [CrossRef] [Medline]
    5. Heiman HJ, Smith LL, McKool M, Mitchell DN, Roth BC. Health policy training: a review of the literature. Int J Environ Res Public Health 2015 Dec 23;13(1):ijerph13010020 [FREE Full text] [CrossRef] [Medline]
    6. Klein MD, Schumacher DJ, Sandel M. Assessing and managing the social determinants of health: defining an entrustable professional activity to assess residents' ability to meet societal needs. Acad Pediatr 2014;14(1):10-13. [CrossRef] [Medline]
    7. Guralnick S, Ludwig S, Englander R. Domain of competence: systems-based practice. Acad Pediatr 2014;14(2 Suppl):S70-S79. [CrossRef] [Medline]
    8. Meltzer R. Practice makes perfect: teaching policy analysis through integrated client-based projects. J Public Affairs Educ 2013;19:405-431 [FREE Full text]
    9. Greysen SR, Wassermann T, Payne P, Mullan F. Teaching health policy to residents--three-year experience with a multi-specialty curriculum. J Gen Intern Med 2009 Dec;24(12):1322-1326 [FREE Full text] [CrossRef] [Medline]
    10. Maida CA. Project-based learning: a critical pedagogy for the twenty-first century. Policy Futures Educ 2011 Jan;9(6):759-768. [CrossRef]
    11. Heinrich WF, Habron GB, Johnson HL, Goralnik L. Critical thinking assessment across four sustainability-related experiential learning settings. J Experiential Educ 2015 Jul 09;38(4):373-393. [CrossRef]
    12. Bardach E. A Practical Guide for Policy Analysis: The Eightfold Path to More Effective Problem Solving. 2nd edition. New York, NY: Chatham House Publishers, Seven Bridges Press; 2000.
    13. Gurlitt J, Dummel S, Schuster S, Nückles M. Differently structured advance organizers lead to different initial schemata and learning outcomes. Instructional Sci 2011 Jul 22;40(2):351-369. [CrossRef]
    14. Haynes AS, Derrick GE, Redman S, Hall WD, Gillespie JA, Chapman S, et al. Identifying trustworthy experts: how do policymakers find and assess public health researchers worth consulting or collaborating with? PLoS One 2012 Mar;7(3):e32665 [FREE Full text] [CrossRef] [Medline]
    15. Ward V, House A, Hamer S. Knowledge brokering: the missing link in the evidence to action chain? Evid Policy 2009 Aug;5(3):267-279 [FREE Full text] [CrossRef] [Medline]
    16. Prewitt K, Schwandt T, Straf M, National Research Council Committee on the Use of Social Science Knowledge in Public Policy. Using Science as Evidence in Public Policy. Washington, DC: National Academies Press; 2012.
    17. Ajzen I, Fishbein M. The influence of attitudes on behavior. In: Albarracin D, Johnson BT, Zanna MP, editors. The Handbook of Attitudes. Mahwah, NJ: Lawrence Erlbaum Associates; 2005:173-221.
    18. Lau R, Heldman C. Self-interest, symbolic attitudes, and support for public policy: a multilevel analysis. Polit Psychol 2009;30(4):513-537 [FREE Full text] [CrossRef]
    19. Noe RA. Trainees’ attributes and attitudes: neglected influences on training effectiveness. Acad Manage Rev 1986 Oct 01;11(4):736-749. [CrossRef]
    20. Meltsner AJ. The seven deadly sins of policy analysis. Knowledge 2016 Aug 18;7(4):367-381. [CrossRef]
    21. Emil S, Nagurney JM, Mok E, Prislin MD. Attitudes and knowledge regarding health care policy and systems: a survey of medical students in Ontario and California. CMAJ Open 2014 Oct;2(4):E288-E294 [FREE Full text] [CrossRef] [Medline]
    22. Stafford S, Sedlak T, Fok MC, Wong RY. Evaluation of resident attitudes and self-reported competencies in health advocacy. BMC Med Educ 2010 Nov 18;10:82 [FREE Full text] [CrossRef] [Medline]
    23. Kraus SJ. Attitudes and the prediction of behavior: a meta-analysis of the empirical literature. Pers Soc Psychol Bull 2016 Jul 02;21(1):58-75. [CrossRef]
    24. Carey G, Crammond B. A glossary of policy frameworks: the many forms of 'universalism' and policy 'targeting'. J Epidemiol Community Health 2017 Mar;71(3):303-307. [CrossRef] [Medline]
    25. Moses H, Matheson DHM, Dorsey ER, George BP, Sadoff D, Yoshimura S. The anatomy of health care in the United States. JAMA 2013 Nov 13;310(18):1947-1963. [CrossRef] [Medline]
    26. Cruess RL, Cruess SR, Steinert Y. Amending Miller's pyramid to include professional identity formation. Acad Med 2016 Feb;91(2):180-185. [CrossRef] [Medline]
    27. Agrawal JR, Huebner J, Hedgecock J, Sehgal AR, Jung P, Simon SR. Medical students' knowledge of the U.S. health care system and their preferences for curricular change: a national survey. Acad Med 2005 May;80(5):484-488. [Medline]
    28. Durning SJ, Hemmer P, Pangaro LN. The structure of program evaluation: an approach for evaluating a course, clerkship, or components of a residency or fellowship training program. Teach Learn Med 2007;19(3):308-318. [CrossRef] [Medline]
    29. Magnussen L, Itano J, McGuckin N. Legislative advocacy skills for baccalaureate nursing students. Nurse Educ 2005;30(3):109-112. [Medline]


    Abbreviations

    LEAD: Legislative Education and Advocacy Development


    Edited by G Eysenbach; submitted 21.03.17; peer-reviewed by R Meltzer, S Murthy; comments to author 12.04.17; revised version received 04.08.17; accepted 20.09.17; published 17.10.17

    ©Shikhar H Shah, Maureen D Clark, Kimberly Hu, Jalene A Shoener, Joshua Fogel, William C Kling, James Ronayne. Originally published in JMIR Medical Education (http://mededu.jmir.org), 17.10.2017.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on http://mededu.jmir.org/, as well as this copyright and license information must be included.