A Formative Evaluation of the Methodological Tools Taught in OPWL’s WIDe Program


Background

The Organizational Performance and Workplace Learning (OPWL) department at Boise State University offers the Workplace Instructional Design (WIDe) Graduate Certificate to prepare students for instructional design and learning and development (L&D) roles. This evaluation assesses the real-world utility of the methodological tools taught in the WIDe program and identifies areas for potential program improvement, with the goal of aligning training with evolving industry standards and enhancing student career outcomes.

Evaluation Methodology

A formative, goal-based evaluation was conducted using multiple data collection methods:

  • A web-based survey of recent WIDe graduates
  • Semi-structured interviews with faculty and leadership (upstream stakeholders), as well as graduates and current students (downstream direct impactees)
  • Extant data review of syllabi for WIDe courses

Data were triangulated and analyzed using a dimensional rubric to evaluate tool utility across three key sub-dimensions: satisfying industry expectations, solving workplace performance problems, and advancing professional accomplishments.

Evaluation Results

Findings indicate that the methodological tools taught in the WIDe program largely meet industry needs and equip graduates with relevant skills. Key tools such as Gilbert’s Behavior Engineering Model (BEM), Gagne’s Nine Events of Instruction, and Mayer’s Multimedia Learning Principles were frequently cited as valuable in workplace contexts. Graduates credited the program with supporting professional achievements such as promotions, job transitions, and expanded leadership roles. Areas for improvement included incorporating project management tools, integrating generative AI practices, offering access to industry-standard software, strengthening cross-course integration of concepts, and supporting graduates’ ongoing professional development.

The evaluation team determined that the WIDe Certificate Program’s methodological tools overall received a rating of “Meet Needs” per the criteria outlined in the rubric (see Appendix F), as two of the three independent data sources validated the utility of the tools across the key dimensions.

Findings revealed that the OPWL department and the WIDe program play a pivotal role in shaping students’ understanding of what constitutes industry-standard practice in instructional design and human performance improvement. Graduates reported that the methodological tools taught were highly transferable across diverse workplace contexts. Rather than rigidly applying tools in their entirety, graduates often adapted components from multiple frameworks to respond flexibly to workplace demands, balancing evidence-based approaches with practical constraints such as time, resources, and audience needs.

The program has demonstrably supported professional advancement for graduates, with several alumni securing promotions, new roles, and expanded leadership responsibilities. Gilbert’s Behavior Engineering Model (BEM) emerged as a standout, achieving high marks across all three evaluation sub-dimensions (satisfying industry expectations, solving workplace performance problems, and advancing professional accomplishments).

While upstream stakeholders must prioritize which tools can realistically be covered within the constraints of a semester, there is an opportunity to strengthen curriculum coherence. Stakeholders’ coaching and feedback are often filtered through an academic lens, which occasionally leads to instructional approaches that differ from corporate expectations.

In response to these findings, the evaluation team recommends enhancing the WIDe program by incorporating additional real-world tools and frameworks, expanding instruction around project management and generative AI, improving integration across courses, offering access to industry-standard software, and fostering an OPWL community of practice to support lifelong learning. These enhancements aim to ensure that graduates not only continue to meet but exceed evolving workplace demands.

Several limitations influenced the evaluation. These included a small survey sample size, a lower-than-expected survey response rate, and potential volunteer bias among interview participants. We were also unable to secure an interview with the Student Success Advisor, which limited insights into student motivations for enrolling in the program. Only one individual responded to our focus group invitation, requiring us to pivot to a third individual interview. Additionally, it was challenging to isolate the utility of specific tools from broader workplace culture factors, which likely influenced how graduates were able to apply the tools in practice.

Methodology: A formative, goal-based evaluation utilizing Chyung’s Ten-Step Evaluation for Training and Performance Improvement model

Type: Program Evaluation
Date: May 2025
Client: Boise State University

Tools: Qualtrics, Kellogg Foundation’s Program Logic Model, Risk Assessment Matrix

Deliverable: Team_WIDe_Evaluation_Report.docx
