The article, “The Impact of Previous Online Course Experience on Students’ Perceptions of Quality,” comes from the Online Learning Consortium’s 2016 OLC Conference Special Issue. The title and abstract caught my attention as we continue our search for what makes a quality online learning experience. We’ve talked about novice and expert students in previous modules, so this article seemed like it might fit in and provide some insight into how a quality online course is perceived by students who have had either little or plenty of previous experience with online courses.
The research question for this article is “whether students’ perceptions of online course quality differ based on the extent of their previous online learning experience.” Some of the first things that came to my mind that might be perceived differently between these two groups are navigation and structure, technical details and technical support resources, aspects of building online community vs. feelings of isolation, and familiarity with asking for instructor clarification or support outside of normal discussion forums. I thought that these areas might be evaluated differently by each of the groups, which might feed into their idea of quality.
Much to my surprise, the authors based their survey questions on the Quality Matters (QM) model for evaluating course quality. For those who aren’t familiar with QM, the process rests on four underlying principles: Continuous, Centered, Collegial, and Collaborative.
- Continuous – there is always room for improvement even if a standard is fully met and eventually all courses that go through the review process can fully meet the standards.
- Centered – the process is based on research and best practices, and the review is conducted from a learner-centered perspective focused on promoting student learning. The review centers on meeting expectations at an 85% level, which is deemed better than good.
- Collegial – the review is a faculty-driven peer process. It is meant to be collegial, not evaluative.
- Collaborative – the review team consists of experienced online instructors who are asked to identify evidence in the online course that speaks to the Standards outlined in the rubric, including communicating directly with the Course Representative.
I wasn’t expecting that the study would be based on QM, for a couple of reasons. First, as a peer review process, only the course design is evaluated; the actual teaching and delivery of the course is not considered in a formal review. You evaluate only the quality of the course design, not how the teacher interacts with the students. I’m suspicious that an online learner, especially one with little online experience, would be able to discern the design vs. delivery difference. This can be difficult enough for faculty and administrators to differentiate.
Secondly, as a course peer reviewer, you must look at the course from the learner’s perspective. You don’t look at the course from the instructor’s or an Instructional Designer’s side. You evaluate the course based on how you, the student, perceive its quality. For this reason, using a rubric like QM has some merit. The authors pose an interesting question about the standards chosen for QM, by asking whether the actual standards chosen for evaluation are ones that students would use to evaluate quality: “Although based on extensive research, the question remained—Would students agree that the standards in the QM rubric were important?”
The specific standards included in the rubric were originally written in faculty-centered language. In order for online students to rank these rubric items, each standard was first converted to student-centered language. For example, the specific standard, “The self-introduction by the instructor is appropriate” (MarylandOnline, 2006), was modified to read, “The instructor introduces her- or himself” (p. 31). I find the rephrasing of the questions an interesting approach, and I wonder if this would actually benefit peer reviewers when they are evaluating courses. [I’m tentatively scheduled to be on a QM review team, and I will definitely keep this in mind during the evaluation period.] Students answered the questions using a Likert scale, rating each of the standards based on how important the standard was to their success (p. 31).
The sample size for the study was pretty large: 3,160 students from 31 universities in 22 states (p. 29). It seemed like there was a good mix of students, although there were twice as many females as males, fewer than half of the students were between 26 and 44, and half of the students worked full-time (p. 30). What surprises me is that across many of the demographic items, between 15–19% of the participants didn’t respond to questions about sex, age, employment status, educational level, or number of online courses taken. Why would that number be so high? Are students afraid of divulging personal information about themselves in these types of surveys?
Since we all know that students bring different prior experience into their learning, it could very well be that a question received a lower score because the student had already gained sufficient experience in another class and therefore didn’t attribute much importance to that standard in the class being evaluated. The authors did acknowledge, “Students in the same course are not homogeneous as some faculty believe. There is great variance in the needs and expectations of students taking online courses” (p. 34).
There were several design features that students felt were important for a quality course. The last bullet is one that was of particular importance to novices.
- Demonstrate strong alignment of course objectives, assessments, and learning activities.
- Instructors need to also help learners see the connection between various course elements so they can better understand their path to success in the course.
- Exhibit clear organization, easy navigation, and optimal readability. Making sure that students can easily access required technologies and materials is also recognized by experienced learners as a key to their success.
- Clearly state expectations for student performance, especially as it relates to interaction with the course content, instructor, and their peers.
- Create opportunities for students to introduce themselves to the class and reinforce to students the importance of this activity in creating a supportive and effective learning community.
- List and explain netiquette guidelines. Even though it may seem that learners are more comfortable in the online environment with each passing year, the results of this study suggest that students in online courses, especially those with limited online course experience, still need and seek out guidance on acceptable behaviors in online courses (p. 36).
Several things about this study bother me. First, the idea of quality seems to be based on student preferences, not on learning outcomes. Important data about the students are missing, such as success rates, how well each student did in their class, and their overall GPA. These factors would help confirm or negate perceptions of quality.
The authors also pointed out several reasons why students with less online experience might have concentrated on certain design features that their more experienced counterparts did not, several of which are in line with my assumptions:
…novice online learners may be inexperienced in using a learning management system and other course technologies, and may be unfamiliar with typical instructional approaches and conventions used in online courses. They may not be aware of what they don’t know and may still be getting acclimated to the learning system. Intermediate online learners have likely become more proficient in using a learning management system and other technologies, and they may have at least been exposed to commonly used instructional approaches. These intermediate students, however, may not yet have the confidence or proficiencies of more experienced online learners. Finally, experienced online learners, who have taken at least seven online courses previously, have completed many credit hours in the online environment. Students with this level of experience can reasonably be expected to have more comfort with course technology, structure, and participation (p. 34).
In the end, the article points out several design features that an instructor should take under careful consideration when designing and planning a course. It should not matter whether your audience has extensive online experience or has only taken one or two courses. That said, a student who has taken many courses will always have a better grasp on quality because of that experience.
Hixon, E., Barczyk, C., Ralston-Berg, P., & Buckenmeyer, J. (2016). The impact of previous online course experience on students’ perceptions of quality. Online Learning, 20(3), 25–40.