The Quality Matters Program (Distance Learning)

INTRODUCTION

The Quality Matters™ Program (www.qualitymatters.org) is a set of standards (or rubric) for the design of online college-level courses and the online components of hybrid/blended courses, and a peer review process for applying these standards. The Quality Matters Rubric is based on recognized best practices, built on the expertise of instructional designers and experienced online teachers, and supported by distance education literature and research. The goal of the program is to increase student retention, learning and satisfaction in online courses by implementing better course design.

HISTORY

The Quality Matters project was initiated by the Maryland Online (MOL) consortium, a voluntary, not-for-profit educational association of two- and four-year institutions in Maryland. MOL was established in 1999 to leverage the efforts of individual campuses that were committed to the expansion of online educational opportunities in Maryland through collaborative activities. MOL and its members cooperate to support and maintain a portal for online programs and courses in Maryland, engage in joint faculty training initiatives, develop joint online programs, share online courses through a seat bank arrangement, and pursue federal, state and foundation support for a variety of distance learning initiatives. One of these initiatives is the Quality Matters project.

In the spring of 2003, MOL submitted a proposal to the Fund for the Improvement of Postsecondary Education (FIPSE) for the creation of a rubric for the design of online courses and a peer review process for evaluating and improving existing online courses.

The title of the proposal was: “Quality Matters: Inter-Institutional Quality Assurance in Online Learning.” FIPSE awarded MOL $509,177 over three years (September 2003 – August 2006) to carry out the project. Among the many proposals it received involving quality assurance in online education, the agency was interested in this one because of the prospect of developing standards that would be inter-institutional and inter-segmental, and because of the peer-to-peer structure of the proposed course review process. This proposal held the promise of a quality assurance tool that was both scalable and replicable, criteria that are fundamental to the FIPSE grant program.

The collaborative nature of the project operated at several different levels. The co-principal investigators, Mary Wells, Director of Distance Learning at Prince George’s Community College, and Christine Sax, Assistant Dean of Social, Behavioral, Natural, and Mathematical Sciences at the University of Maryland University College, personified the inter-segmental character of the initiative. Experienced faculty and support staff from throughout the MOL institutions served on the various committees scanning the research and best practices literature, developing the rubric standards and a training program for peer reviewers, testing and refining preliminary versions of the rubric, etc. External institutional and organizational partners across the U.S., including the Kentucky Virtual University (now the Kentucky Virtual Campus), the Michigan Virtual Community College Consortium, the Sloan Consortium, the Southern Regional Education Board (SREB), and the Western Cooperative for Educational Telecommunications (WCET), advised the co-directors as the project moved through its various phases.

During the second year of the grant, the co-principal investigators began making presentations at state, regional and national conferences. These presentations generated widespread interest in the Quality Matters Rubric and evaluation process. In 2005, MOL received several awards, including the WCET Outstanding Work (WOW) Award and the USDLA 21st Century Best Practice Award. In the second and third years of the grant, peer reviewer training to develop a cadre of reviewers attracted participants from 158 different institutions spanning 28 states. More than 700 faculty and instructional development staff were trained during this period. Trained peer reviewers served on the first rounds of course reviews, but also brought their experience back to their home campuses. Several MOL institutions made formal commitments to review and enhance their online courses using the Quality Matters Rubric, and a variety of institutions across the country began to adapt the QM Rubric and review process to serve their own agendas for online course development and quality assurance.

During the final year of the grant, September 2005 – August 2006, MOL conducted a study of options for continuing the Quality Matters initiative beyond August 2006. The outcome of this investigation was a decision to continue the Quality Matters Program as a self-supporting activity of MOL, funded through institutional subscriptions and fee-for-service revenues. Early in its second year of operation on this basis, the Quality Matters Program has more than 250 institutional subscribers from 37 states and has conducted training for more than 4,500 faculty and staff. Through a statewide subscription option, QM is poised for continued rapid growth. As the program and the QM Rubric continue to evolve, the program may also attract support from some of the many institutions that have adapted the freely available QM materials from the grant period.

UNDERLYING PRINCIPLES OF QUALITY MATTERS

The goal of the Quality Matters project was to improve student learning, retention and satisfaction in online courses through better course design. The project was designed as a collaborative, faculty-driven initiative, with faculty developing the standards, carrying out the reviews, providing advice to instructors, and working with them to make existing online courses more effective. The project leaders did not envision the creation of “perfect” courses, or a perfect set of standards. Rather, they foresaw that application of the rubric would improve courses, and that repeated reference to this evolving set of standards would make courses progressively better.

Another founding principle, which continues to drive the Quality Matters Program, is the need for the standards to reflect the results of academic research on effective learning. While the initial standards and subsequent modifications have been based on the insights of teams of experienced online teachers and instructional designers, and on the best-practice standards promulgated by accrediting bodies and national organizations, the QM standards have also been tested for consistency with the conclusions of the educational research literature regarding factors that improve student retention rates and focus student effort on activities that increase learning and engagement.

THE QUALITY MATTERS RUBRIC

The Quality Matters Rubric consists of eight general standards that define quality expectations for the basic elements that go into the design of an online or hybrid course. These general standards cover:

1. Course Overview and Introduction

2. Learning Objectives

3. Assessment and Measurement

4. Resources and Materials

5. Learner Interaction

6. Course Technology

7. Learner Support

8. Accessibility

Under each general standard are a number of specific standards that deal with discrete course elements. The specific standards are stated in succinct language, but they are annotated with more detailed explanations of their intent and examples of good practice. The annotations provide guidance to instructors, course development specialists, and course reviewers who are attempting to interpret and implement the standards. The inter-relationships among the standards are highlighted in the annotations through the identification of standards that should align with each other in a well-designed course.

A point system is associated with the specific standards, with some standards given more value than others. Those assigned the greatest number of points are considered essential standards and must be satisfied for a course to meet overall Quality Matters standards.
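As a rough illustration of this logic, the sketch below (in Python) models a rubric of weighted specific standards and the two conditions a course must satisfy. The standard labels, point values, and the score threshold (`threshold_fraction`) are hypothetical placeholders; the article does not give QM's actual figures.

```python
# A minimal sketch of the rubric's point logic. Labels, point values,
# and the 85% threshold are assumed for illustration only.
from dataclasses import dataclass

@dataclass
class SpecificStandard:
    label: str       # e.g., "II.1" (hypothetical numbering)
    points: int      # weight assigned to the standard
    essential: bool  # the highest-point standards are essential

def meets_qm_standards(standards, met, threshold_fraction=0.85):
    """Return True if a course meets overall QM standards.

    `met` maps each standard's label to whether it was judged met.
    Two conditions must hold: every essential standard is satisfied,
    and the earned points reach the (assumed) score threshold.
    """
    # A single unmet essential standard fails the course outright.
    if not all(met[s.label] for s in standards if s.essential):
        return False
    total = sum(s.points for s in standards)
    earned = sum(s.points for s in standards if met[s.label])
    return earned >= threshold_fraction * total

# Hypothetical example: one essential and two non-essential standards.
rubric = [
    SpecificStandard("II.1", 3, True),
    SpecificStandard("VI.2", 2, False),
    SpecificStandard("VIII.4", 1, False),
]
# Prints False: the essential standard is met, but 5 of 6 points
# (about 83%) falls just below the assumed 85% threshold.
print(meets_qm_standards(rubric, {"II.1": True, "VI.2": True, "VIII.4": False}))
```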

The Quality Matters Program collects feedback regarding the scope and language of the rubric and its annotations from users and trainers on a continuous basis. Periodically, the rubric is subjected to a thorough review and updated as needed. The current 2006–2007 version is scheduled for replacement in 2008.

THE QUALITY MATTERS COURSE REVIEW PROCESS

The Quality Matters Rubric was designed to enable formal peer reviews of existing online courses. A course is evaluated by a team of three reviewers who have been trained in the application of the QM Rubric. The team members are all experienced online instructors. At least one member of the team is external to the institution whose course is being reviewed. One team member is a subject matter expert. The chair is an experienced QM reviewer.

The instructor or design team whose course is being evaluated provides baseline information and access to a version of the online course, or the online component of a hybrid course, to the review team members. The review is launched with a conference call between the team members and the instructor(s), and the team chair stays in touch with the instructor(s) throughout the process, relaying questions from the team, going over the evaluation once it is complete, and monitoring any subsequent changes made to the course to satisfy the QM standards.

The reviewers are asked to rate the course on each of the specific standards, to identify strengths of the course and to provide recommendations on how the course might be improved to meet those standards they regard as not sufficiently met. The reviewers confer with each other before submitting their individual reports. The numerical score assigned to the course is a composite of the three reviewers’ judgments, but all advice is transmitted to the instructor(s) in the final report.

Courses that achieve a high enough numerical score and satisfy all the essential standards are regarded as having met QM standards and are immediately issued a version of the QM seal that may be displayed in association with the course. Instructors are still free to incorporate the review team’s advice to improve their courses, and data shows that most of them do make changes based on the review. Courses that achieve lower numerical scores must be modified if they are eventually to meet the QM standards. Data shows that almost all courses submitted for review are eventually revised as recommended, although, in some cases, this may occur only after a considerable lapse of time.
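Continuing the earlier sketch, one plausible way to form the composite is a per-standard majority vote across the three reviewers. This rule is an assumption for illustration; the article says only that the score is a composite of the reviewers' judgments, without specifying how it is computed.

```python
def composite_met(reviewer_ratings):
    """Collapse three reviewers' ratings into one met/not-met map.

    `reviewer_ratings` is a list of three dicts, each mapping a
    standard's label to True (met) or False (not met). A standard
    counts as met when at least two of the three reviewers say so --
    a majority rule assumed here purely for illustration.
    """
    labels = reviewer_ratings[0].keys()
    return {label: sum(r[label] for r in reviewer_ratings) >= 2
            for label in labels}
```

The resulting map could then be passed to `meets_qm_standards` from the earlier sketch to decide whether the course earns the QM seal.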

Figure 1. The Quality Matters Course Review Process

This process of formal review is still in use, although, as explained below, some institutions have developed alternative methods of applying the rubric that meet their internal needs, but do not qualify for the QM seal.

EXPANDING USES OF THE QUALITY MATTERS RUBRIC AND PROCESS

While the Quality Matters Rubric and process were developed to meet a specific need, users have been quick to adapt them to a wider range of needs related to the rapid growth of online and hybrid courses. Formal course reviews are both costly and time-consuming, and, while they may provide the most detailed and balanced assessment of existing online courses, many institutions have chosen to implement informal review processes internally, using the QM Rubric and some elements of the formal process.

Institutions are also providing instructors with access to the rubric to assess and improve their own courses, either independently or in collaboration with instructional design staff. Some have adapted the standards to provide guidance to instructors and course developers during the initial course design stage. Others have incorporated the QM standards into the development and training programs that prepare their faculty to teach online. Some colleges have gone so far as to incorporate some of the QM standards into institutional policies. Finally, institutions are beginning to cite their implementation of the Quality Matters standards and processes as evidence of quality assurance in their accreditation reviews and other forms of external accountability.

The Quality Matters approach is also having a broader impact on the culture of online education at many institutions. The Quality Matters process is providing a basis for campus dialog on best practices in online instruction among faculty and instructional development staff, and commitment to Quality Matters principles is helping to build a campus culture dedicated to the continuous improvement of online learning.

FUTURE TRENDS

With a rapidly growing number of QM-reviewed courses, research projects that are attempting to measure the impact of QM reviews and other methods of implementing the QM Rubric are proliferating. As the results accumulate, the Quality Matters Program will be able to base the further refinement of the course design rubric on directly relevant research, in addition to inferences drawn from broader-based research on student learning and distance education.

The trend toward widespread and, in some cases, statewide adoption of the QM standards is creating momentum toward the de facto recognition of the QM Rubric as the prevailing national standard for online/hybrid course design. To maintain and further this momentum, the Quality Matters Program will need to remain relevant by continuing to evolve the standards in keeping with the latest research findings, changing technology, and the needs of its users, and by facilitating the ever-wider range of uses being made of its tools and processes. It may prove necessary to develop additional rubrics for alternative forms and levels of distance education, and for programs as well as courses and modules.

KEY TERMS

Alignment: Critical course elements working together to ensure that students achieve the desired learning outcomes.

Course Delivery: The actual teaching of a course, i.e., the implementation of the course design.

Course Design: The forethought and planning that an instructor or course development team puts into a course, i.e., the elements of a course that are built online and the planning for how a course should unfold over time.

Hybrid Course: A course with both online and face-to-face components. Generally speaking, at least 25% of the course must be online for a course to be treated as a hybrid in the Quality Matters course evaluation process.

Learning Objectives: Course learning objectives describe what students are to gain from instruction in a course. The Quality Matters Rubric expects learning objectives to be measurable.

Learning Outcomes: The accomplishments of students in a course, as measured through various forms of assessment.

Quality Assurance: A systematic program for determining whether a product or process is performing according to established standards.

Rubric: A set of criteria to benchmark or evaluate a product, activity, or process.

Scalability: The potential of a process or function to handle a larger volume of activity without degrading.

Student Retention Rate: In the context of a course, the student retention rate is the percentage of initially enrolled students who complete the course.
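Stated as a formula (a restatement of the definition above, not official QM notation):

\[
\text{Student Retention Rate} = \frac{\text{students who complete the course}}{\text{students initially enrolled}} \times 100\%
\]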
