
Student Satisfaction: Online vs. On-ground

November 25, 2010

These slides show the comparisons between our 2008 PSOL results and the 2008 SSI (Student Satisfaction Inventory) results. There are eleven questions that match up between the two surveys, including some that we added for that very purpose.

To clarify, here are the questions that were compared between the two surveys.

PSOL # SSI # Item
01 45 This institution has a good reputation.
04 46 Faculty provide timely feedback about student progress.
07 66 Program requirements are clear and reasonable.
09 07 Adequate financial aid is available.
20 18 The quality of (online) instruction is excellent.
21 14 (Online) Library resources and services are adequate.
23 52 Billing and payment procedures are convenient for me.
24 50 Tutoring services are readily available (for online courses).
32 23 Faculty are understanding of students’ unique life circumstances.
34 32 My academic advisor is knowledgeable about my program requirements.
36 75 The LSC Help Desk responds with useful information and solutions.
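
For anyone who wants to script this comparison, here is a minimal sketch in Python of how the item crosswalk could be encoded. The compare() helper and the mean-score dictionaries it expects are hypothetical stand-ins, not anything from the Noel-Levitz reports:

    # Crosswalk between matching PSOL and SSI items (from the table above).
    CROSSWALK = {
        # PSOL item: (SSI item, item text)
        1: (45, "This institution has a good reputation."),
        4: (46, "Faculty provide timely feedback about student progress."),
        7: (66, "Program requirements are clear and reasonable."),
        9: (7, "Adequate financial aid is available."),
        20: (18, "The quality of (online) instruction is excellent."),
        21: (14, "(Online) Library resources and services are adequate."),
        23: (52, "Billing and payment procedures are convenient for me."),
        24: (50, "Tutoring services are readily available (for online courses)."),
        32: (23, "Faculty are understanding of students' unique life circumstances."),
        34: (32, "My academic advisor is knowledgeable about my program requirements."),
        36: (75, "The LSC Help Desk responds with useful information and solutions."),
    }

    def compare(psol_means, ssi_means):
        # Print the online-minus-on-ground satisfaction difference per item,
        # given dicts mapping item numbers to mean satisfaction scores.
        for psol_item, (ssi_item, text) in CROSSWALK.items():
            diff = psol_means[psol_item] - ssi_means[ssi_item]
            print(f"PSOL {psol_item:02d} / SSI {ssi_item:02d}: {diff:+.2f}  {text}")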

Average Satisfaction Scores

November 25, 2010

The basic PSOL has used the same 26 questions during all four of the survey administrations that we have conducted here at LSC. I decided to make a few simple calculations that are not normally made by Noel-Levitz.

One thing that Noel-Levitz does with the data is aggregate the items into five separate categories, such as Enrollment Services, Academic Services, etc. It seemed to me that you could also get an overall feel for student satisfaction by calculating the average satisfaction score across all 26 items. That simple average strikes me as a reasonable measure of overall student satisfaction, and probably a more relevant one than a couple of the PSOL's own summary questions, such as whether students' expectations have been met, exceeded, or left unmet.
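
Concretely, the calculation is nothing more than the unweighted mean of the 26 item means. A quick sketch in Python, with made-up scores standing in for our actual results:

    from statistics import mean

    # Hypothetical mean satisfaction scores for the 26 standard PSOL items,
    # one per item on the 7-point scale. These are placeholders, NOT our data.
    item_means = [5.8, 6.1, 5.6, 5.9, 6.0, 5.7, 5.5, 5.4, 5.2, 6.2,
                  5.9, 5.8, 6.0, 5.7, 5.6, 6.1, 5.9, 6.2, 5.5, 6.0,
                  5.8, 5.7, 6.1, 5.9, 5.6, 5.8]

    overall = mean(item_means)  # simple, unweighted average
    print(f"Average satisfaction across all 26 items: {overall:.2f}")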

PSOL chart of average student ratings for four years
The chart above indicates the average scores over the four survey periods for students of LSC Online. I think it's instructive to see that the average satisfaction score has increased with each administration. I realize there is still a question of whether the right things (the most important ones) are the ones rising in satisfaction, but I do think this is worth paying attention to.

Desire2Learn Scores High in Reliability

November 25, 2010

This is cross-posted from my e-learning blog: Desire2Blog

The chart below shows student responses over the past three years to the following statement:

The online course delivery platform (Desire2Learn or D2L) is reliable.

D2L reliability chart from PSOL

The PSOL is the main instrument that we use to gather information from students about the online programs and services that we provide. In two of the last three years, the reliability of the VLE platform (we all use Desire2Learn) has been rated the most important factor out of the 30 (31 this year) questions asked of all students. The satisfaction rating (6.01 in 2008) is also among the highest. This year it has the second-highest satisfaction rating of the 31 statements, with first place going to “Registration for online courses is convenient” (6.21).

I realize that the reliability factor does not capture all of the pertinent information about a VLE, but it is clearly an important one. Credit for the high student ratings goes both to Desire2Learn for the product development and to the MnSCU Office of the Chancellor staff who actually host and troubleshoot the service for our several hundred thousand user account holders.

Congratulations are in order for these high marks related to student satisfaction.

NOTE: the survey uses a 7-point scale where 4.0 is neutral, 5.0 is somewhat satisfied, 6.0 is satisfied, and 7.0 is very satisfied. The other 29 items, all rated below the D2L item, ranged in satisfaction from 5.12 to 5.96.

2008 PSOL Results

November 25, 2010

The results from the PSOL were waiting for me in my inbox when I returned from vacation on Sunday. I haven’t been able to do a full analysis just yet, but the preliminary look is very encouraging. I will be making several posts to this blog over the next couple of months detailing what we’ve learned from our online students from this fourth administration of the Noel-Levitz survey.

In broad strokes, of the 26 items on the standard PSOL, our students were more satisfied than the national average on 21 items and less satisfied on only 5. Even better, none of those five differences was statistically significant. Better still, ten of the positive differences were statistically significant when compared to the national average. Here are the ten items:

PSOL results 2008

The asterisk system works as follows: three stars (***) indicate that the difference is significant at the .001 level, two stars (**) at the .01 level, and one star (*) at the .05 level.
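
Noel-Levitz computes these significance levels for us, but the same flagging can be reproduced from summary statistics with a two-sample t-test. A hedged sketch using SciPy; the means, standard deviations, and sample sizes below are placeholders, not values from the actual report:

    from scipy.stats import ttest_ind_from_stats

    def stars(p):
        # Map a p-value to the report's asterisk convention.
        if p < 0.001:
            return "***"
        if p < 0.01:
            return "**"
        if p < 0.05:
            return "*"
        return ""

    # Placeholder summary statistics: (mean, std dev, n), LSC vs. national group.
    lsc = (6.01, 1.10, 458)
    national = (5.80, 1.25, 50000)

    # Welch's t-test, which does not assume equal variances between groups.
    t, p = ttest_ind_from_stats(*lsc, *national, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.4f} {stars(p)}")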

We also have comparison data between the 2006 and 2008 surveys, and I will be asking for a peer group report as soon as I get a chance. Once again the aggregate Minnesota Online data is not very pretty, but I'll be posting some info soon about how our results compare with our consortium's.

Even though this is my first post on the 2008 results, I should probably repeat that I don’t actually put much stock in comparisons to the national group data since the demographics of that big group are not very comparable to our students. However, it is one of many comparisons that I make in order to see the whole picture.

Survey Incentives

November 25, 2010

We expect to receive our 2008 PSOL results any day now. This will be our fourth year of using the same instrument to gather data about importance and satisfaction for our online offerings and services. This year we had a 23% response rate, with 458 students submitting the survey out of the pool of 2,012 students who were invited to do so. All students taking at least one online course at LSC are invited to submit the survey.

The last time we used the survey was in Spring 2006, when we had only a 17% submission rate (325 out of 1,889). In an effort to significantly improve the submission rate, this year we offered forty 2 GB flash drives, awarded by random drawing to those who submitted the survey. There were no incentives in 2006.

For about $440 ($11 per USB drive) we were able to gather data from a significantly larger group of our online students. I'm thinking that was well worth it. Now we need to find out what they had to say to us.
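
The arithmetic behind those figures is easy to check. A back-of-the-envelope sketch in Python, assuming (purely for illustration) that the 2006 rate would have held in 2008 without the incentive:

    def response_rate(submitted, invited):
        return submitted / invited

    print(f"2008 (incentive): {response_rate(458, 2012):.0%}")     # ~23%
    print(f"2006 (no incentive): {response_rate(325, 1889):.0%}")  # ~17%

    # Rough cost per extra response: 40 drives at $11 each, compared with
    # what the 2006 rate would have produced from 2008's invitation pool.
    extra = 458 - round(response_rate(325, 1889) * 2012)
    print(f"Cost per extra response: ${40 * 11 / extra:.2f}")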

Short Term Controllability

November 25, 2010

It's one thing to get data about what is important to students and their related levels of satisfaction; it's quite another to be able to do anything about those things. On the one hand, you might have an opportunity to manage student expectations, which could affect importance or satisfaction, or both. On the other hand, you might be able to improve services and move satisfaction scores in a positive direction.

On the third hand, there might not be much you can do at all, at least not without a long time horizon, lots of patience, and maybe lots of money. As I look at our PSOL results, I believe there are four items where we don't have much opportunity to affect the outcome. They are as follows:

01. This institution has a good reputation.
06. Tuition paid is a worthwhile investment.
08. Student-to-student collaborations are valuable to me.
09. Adequate financial aid is available.

College reputations are not built or destroyed overnight. It may be nice to know what your students think about your reputation, but there might not be too much that you can do about it. I suppose you could start an internal promotional campaign designed to convince your current students that you really are much better than they think, but that seems silly, self-serving, and somewhat pathetic.

Convincing students that their tuition dollars are being well spent is also a rather futile exercise in my opinion. Even though that may be true, many people think that there is always a better deal right around the corner. The grass is always greener somewhere else which means that tuition is always cheaper elsewhere or the quality is better or both. Good luck trying to convince students (or any consumers) that they have misjudged the value in what they’re paying for.

Student-to-student collaboration? I’ve already posted about that a couple of times. It’s a bit of a strange question for students to assess, especially on the satisfaction scale. My satisfaction with collaborations is more a function of who the collaborators are rather than something that the institution can control.

Finally, adequate financial aid? Are you kidding me? The only adequate financial aid for many people is something that covers all their expenses (beer money included) and doesn't have to be paid back. Short of that, they will rate financial aid adequacy as extremely important and their satisfaction as quite low. Again, this raises the question of what the institution can possibly do about the huge gap between the importance and satisfaction scores here. Start giving away money in the hallways? I doubt it.

All told, having very little control over only four of the 26 survey items isn't too bad. For the other items, I believe you have either a great deal or a moderate amount of control, or at least the ability to influence them over a short term of 1-3 years. Those are the items where you may be able to see some increases in your survey results if you pick your strategies wisely.

(What does the Washington Monument have to do with this post? Nuthin’. I just like it and it’s mine, so why not?)

Meeting Expectations

November 24, 2010

One of the important things to do with your PSOL data is to look at the gaps: the differences between the importance ratings and the satisfaction ratings expressed by the students. You're always concerned about the larger gaps, which indicate where you have significant room for improvement, provided the item is one where you have real control over (or impact on) the level of student satisfaction.

At the same time, I like to look at the smaller gaps. You also need some positive reinforcement for the things that are going very well, the areas where you are essentially meeting student expectations. The seven items listed below come from the 26 standard items on the PSOL and show the lowest performance gaps for Lake Superior College in the FY06 satisfaction survey. The item number is shown first, followed by the full text of the item and the size of the gap (importance score minus satisfaction score).

08. Student-to-student collaborations are valuable to me = −.42
This one is always interesting (see previous post about this item) because the students are basically telling you something like this: “I really don’t care much about this, but you’re doing a pretty good job with it!” This item has a decent (not great) satisfaction rating, but it has an extremely low importance score.

01. This institution has a good reputation = .02
This item is interesting to me because it tells me that we are serving the proper audience. Our students know who we are and what we can do for them. For the national numbers, the importance score is much higher than ours, which is why the national gap was .39 while ours was only .02. The satisfaction scores were almost identical, but reputation is more important to more of the students in the national survey. Keep in mind that many of those students are graduate students and have a very different demographic than our students.

18. Registration for online courses is convenient = .14
We have a low gap here because our registration system is completely online and has been for several years. It is managed by the state system (MnSCU) and seems to meet students’ needs quite effectively. This is always one of the most important survey items for students, but one with consistently high satisfaction scores.

21. Adequate online library resources are provided = .33
For several years now we have had significant online library resources available to all students. Our biggest issue is in getting them to use the resources, not whether the resources are online or not.

24. Tutoring services are readily available for online courses = .33
We are currently in our fifth year of offering online tutoring services through SMARTHINKING. We have always scored highly in this category, but you also have to keep in mind that only 10-15% of the students use tutoring services (either online or on-ground) and so the importance score is one of the lowest on the survey.

23. Billing and payment procedures are convenient for me. = .35
Much like our online registration, the billing and payment function is managed centrally for all 32 institutions. This is an item with a high importance score, but also a high satisfaction score.

19. Online career services are available = .35
This last one is also not very important to many of our online students, since so many of them are transfer students working on an A.A. degree through the Minnesota Transfer Curriculum. Our gap is low because this is not a hot topic for most of our students.

For me, the takeaways here are as follows:

  1. Small performance gaps often occur on items where students give a low importance score. That is the case for items 1, 8, 19, and 24 above. All four of those items have an importance score below 6.0, which generally indicates that most students rate them somewhere between “slightly important” and “important.”
  2. Small performance gaps on the other three items (18, 21, and 23) indicate areas where you are really meeting the needs of the students. These items have high importance scores (above 6.0) and high satisfaction scores. These are things to be proud of.
  3. Gap analysis is not an exact science. It is only a starting point in looking for places where you may be able to improve the services that are being offered to students.
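
For completeness, the gap computation itself is trivial to script. A minimal sketch in Python; the (importance, satisfaction) pairs below are illustrative placeholders, not our FY06 numbers:

    # Hypothetical (importance, satisfaction) pairs on the 7-point scale.
    scores = {
        "08. Student-to-student collaborations are valuable to me": (5.10, 5.52),
        "01. This institution has a good reputation": (6.05, 6.03),
        "18. Registration for online courses is convenient": (6.45, 6.31),
    }

    # Gap = importance minus satisfaction; a negative gap means satisfaction
    # actually exceeds importance, as with item 08 above.
    gaps = {item: imp - sat for item, (imp, sat) in scores.items()}
    for item, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
        print(f"{gap:+.2f}  {item}")  # smallest (or negative) gaps first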

(CC Flickr photo by Annie Mole)