
Question - Should e-learning courses have progress bars?

If you answered Yes, you are not rebelling against the norm: online courses are expected to have a progress bar; it is pretty much a requirement.  Everyone has one, and we look for it whenever we are presented with a multi-page survey or an e-learning course.  We want to know where we are.

If you answered No, we would love to hear your rationale, because this was our position as well from our launch two years ago.  Progress bars can be distracting.  If I am at 0%, and after 10 minutes of learning and clicking I am at 3%, the bar tells me how embarrassingly minuscule my progress has been and that a long and arduous road lies ahead.  So is it better to show nothing at all rather than show a participant that they still have 97% left?

If you answered It Depends, your answer is a bit closer to the mark than a plain Yes or No… read on and find out why.

>> You have read 20% of this article

Why obsess over progress bars?

Since our livelihood at Get Inclusive depends on delivering engaging educational content through online courses, we have to take every aspect of the learner’s experience very seriously.  A progress bar (or the lack of one) is an important component of that experience.  In this post I want to share what we found, how we thought about it, and how we ended up putting this research into practice.

>>>> You have read 30% of this article

So what is the consensus in progress bar research?

It turned out (to our surprise) that very smart people (with PhDs) have spent many hours researching and publishing on the effectiveness of progress bars.  During our search for the ultimate truth on this seemingly simple topic, we came across a 2013 meta-analysis that nicely combines all the other research into a neatly packaged consensus. The researchers (from City University London, Google UK and Gallup) cite that:

… effect of progress indicators appear to be mixed, where some studies found that progress indicators reduced drop-offs, some concluded that they increased drop-offs, and yet others found that they had no effect on drop-off rates.

Drop-offs are bad.  A drop-off is when the participant either stops working on the survey or stops paying attention.  For all intents and purposes, a drop-off means you have lost the participant.  They will do whatever is needed to get to the end if it is a requirement, but given the option, they would rather not participate.

>>>> You have read 45% of this article

Relevance of this study to Get Inclusive

While this research focused on experiments observing participant behavior in multi-page surveys, the relevance to online e-learning courses should be readily apparent.  In multi-page surveys, as in online e-learning modules, participants are presented with content (statements, questions, videos, cheesy photos, etc.) and are asked to respond to questions.  e-Learning modules may place more emphasis on content, but the mechanics of content delivery and participant interaction remain very much the same.  Please note that at Get Inclusive we have a 0% cheese policy (we are highly allergic to it), so we obsess over the content as much as over the dilemma of the progress bar.

What really matters in these long interactive content delivery formats (multi-page surveys or e-learning courses) is the drop-off rate – a drop-off happens when a participant closes the browser window, or simply tunes out and starts clicking the “next” button as if being chased by killer whales, or physically drops off their chair out of sheer boredom.  Drop-offs are bad.  They have to be measured and managed.  We measure drop-off in terms of the level of engagement on any given page; counting the participants who didn’t finish is not enough.  Drop-off is not a cliff, it’s a slow slide, and as an e-learning training provider or a multi-page surveyor, you should be able to pinpoint where the slow slide into drop-off land began.
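To make that concrete, here is a minimal sketch of how per-page engagement might be computed.  The event shape, field names, and the five-second threshold are illustrative assumptions on our part, not actual analytics code:

```typescript
// Minimal sketch: find the page where the "slow slide" begins.
// The PageView shape and threshold are illustrative, not production values.

interface PageView {
  participantId: string;
  pageIndex: number;     // 0-based position in the course
  secondsOnPage: number; // dwell time recorded by the client
}

// Below this dwell time we treat a page view as a silent drop-off
// (rapid-fire "next" clicking rather than genuine engagement).
const MIN_ENGAGED_SECONDS = 5;

function engagementByPage(views: PageView[], pageCount: number): number[] {
  const seen = new Array<number>(pageCount).fill(0);
  const engaged = new Array<number>(pageCount).fill(0);
  for (const v of views) {
    seen[v.pageIndex] += 1;
    if (v.secondsOnPage >= MIN_ENGAGED_SECONDS) engaged[v.pageIndex] += 1;
  }
  // Engagement rate per page; a sustained decline pinpoints where
  // the slide into drop-off land began.
  return engaged.map((e, i) => (seen[i] === 0 ? 0 : e / seen[i]));
}
```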

>>>> You have read 50% of this article

Not all Progress Bars are Created Equal…

What we really loved about this research is that it took the (ahem, cheese warning) progress on progress bar research to a whole new level.  It did this by combining 32 experiments in which participants were shown three types of progress bars (yes, we were in heaven when we found out that there are different types of progress bars); each type is sketched in code after the list below:

  1. Linear progress bar – constant speed, e.g. on a 10-page survey, the progress bar jumps 10% each time you go from one page to the next.
  2. Slow-to-fast progress bar – starts out slower than your actual progress but speeds up as you near the end, e.g. it may show 5% increments in the beginning but larger gains towards the end.
  3. Fast-to-slow progress bar – as you can guess, this one starts out with bigger leaps as you turn the first pages but slows down towards the end.
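One way to picture the difference is as three curves, each mapping the true fraction completed (0 to 1) to the fraction displayed.  The specific exponents below are our own illustrative choices, not formulas from the cited study:

```typescript
// Three progress-bar profiles as displayed-vs-actual curves.
// The specific curves are illustrative, not from the cited study.

// 1. Linear: displayed progress equals actual progress.
const linear = (x: number): number => x;

// 2. Slow-to-fast: lags behind early, catches up near the end (convex).
const slowToFast = (x: number): number => x ** 2;

// 3. Fast-to-slow: leaps ahead early, tapers near the end (concave).
const fastToSlow = (x: number): number => Math.sqrt(x);

// On page 1 of a 10-page survey (x = 0.1):
// linear shows 10%, slow-to-fast shows 1%, fast-to-slow shows ~32%.
```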

And the winner is…

According to this research, the fast-to-slow progress bar is the clear winner.  Constant speed (the linear progress bar) increases drop-off.  Slow-to-fast progress bars also increase drop-off rates, as it is plain discouraging to be rewarded “less” than your actual mathematical progress.  So why are fast-to-slow progress bars so effective?  Before we go there, let us consider the ethical implications of using a progress bar that does not reflect the true mathematical reality of where a person is.

And what about the ethics of this magic trick?

Having a “fast-to-slow” progress bar does seem like a manipulative mind-trick designed to dupe learners into believing that they are further ahead than they actually are.  So before we considered whether and how to implement one, we had to be comfortable with the ethics behind it.  In this complex domain at the intersection of mathematics, learning and philosophy, we turned to our favorite Chinese philosopher (no, not Confucius; actually his predecessor Laozi), who said: “A journey of a thousand miles begins with a single step”.  Let us look at the steps a learner takes when they participate in one of our courses:

  1. Participant opens an invitation email sent by their employer/college administrator
  2. Participant clicks on an enrollment link in this email to arrive at the course registration page
  3. Participant enters email/password to register
  4. Participant starts the course and is presented with the content (content pages that take a total of 20-50 minutes to complete)
  5. Participant clicks “done” at the end of the course and receives a PDF certificate

The participant who takes the first step (i.e., clicking the enrollment link) is infinitely more likely to complete the course than one who ignored the email.  Should that not count for anything?  We believe it should.  But then there is another drop-off point: what if they come to the registration page, see this as a less important or less urgent task than everything else on their plate, and never return?  That, too, is a very important step that is mathematically not linked to the progress bar at all in the traditional sense.  Why would we present the participant with a big 0% after they have taken these important steps?

No credit for the biggest obstacle to learning?  e-Learning participants navigate around the biggest obstacle at the beginning of the course, one that doesn’t exist towards the end – namely, assessing the cognitive load of the task: how much mental effort is this thing going to take?  Two activities requiring identical time commitments can have very different cognitive loads, e.g. multiplying pairs of 4-digit numbers for 10 minutes vs. reading your favorite newspaper.  Participants are trying to assess whether the cognitive load is worth the positive (or negative) incentives that come with participating in the course.  They make this tradeoff assessment as quickly as they can, in the first few interactions, then confirm it over the next few, and either continue or put the course in a “return some other day” bucket.

The very low (less than 10%) completion rates of online courses at sites such as Coursera raise questions about how to measure progress.  The completion rate increases to 25% for students who answered at least one question correctly on the very first quiz (Inside Higher Ed).  Beginnings matter: progress in the first few pages contributes much more towards achieving the goal than progress in the last few pages.

In our courses at Get Inclusive, we ask participants to engage in real thinking and self-reflection and to share their thoughts.  Participants aren’t putting on a video and taking a nap; they have to put mental energy into it.  We are asking them to get over the cognitive-miser part of their brains.  Getting through the first 10 pages is far more challenging than getting through the last few.  Progress in these first interactions has more value because this is where the participant commits to the process.  Why wouldn’t we reward them with faster progress during these first few pages?

Beyond lazy estimates

I hope you are seeing our thought process here.  Mathematical truth is critically important, yet it is only a partial truth.  A progress bar needs to be more than a lazy “current page divided by total pages” estimate.  It needs to consider and acknowledge all the obstacles the participant has overcome to get to where they are, and how this progress relates to reaching the goal.  Now we are in the land of heuristics-based progress bars, and we are comfortable talking you through how we implemented one in our courses.

Progress Bars at Get Inclusive

Everyone Starts at 8%

Our customers (employers, college administrators) send out an email introducing us, letting their participants know that we are providing an online course, and communicating any other requirements (deadline, incentives, etc.).  All participants who put in the effort of opening this email and clicking the link in it to register for the course are acknowledged with 8% progress.  Several factors were taken into consideration to arrive at this number, including drop-off/engagement rates and, obviously, the high degree of importance of these initial steps.
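As a sketch, the starting credit could be assembled from the pre-course steps like this; the per-step weights are illustrative, and only the 8% total is our actual number:

```typescript
// Hypothetical breakdown of the 8% starting credit across the
// pre-course steps; the weights are illustrative, the total is real.

type EnrollmentStep = "emailOpened" | "linkClicked" | "registered";

const STEP_CREDIT: Record<EnrollmentStep, number> = {
  emailOpened: 0.02, // opened the invitation email
  linkClicked: 0.03, // clicked through to the registration page
  registered: 0.03,  // created an account: commitment made
};

// Sum the credit for the steps a participant has completed.
function startingProgress(steps: EnrollmentStep[]): number {
  return steps.reduce((total, s) => total + STEP_CREDIT[s], 0);
}

// All three steps together yield the 8% starting progress.
```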

Acknowledging Cognitive Load

Participants are then presented with the course content.  We know that they will be assessing cognitive load and deciding whether to participate in the course now or come back “tomorrow”.  For this main part of the learning process, we show a distraction-free progress bar at the top, which visually (not numerically) communicates where the participant is.  In the image below you are looking at the first page of the course; the 8% starting progress carries over visually.

Visual vs. Numeric Progress Bar

We opted for a visual progress bar (as opposed to a numeric one) on the actual content pages.  We believe it matters a lot less whether the progress is 45% or 55%; either way, it is “about half-way”, and that rounding is perfectly communicated via a green bar that fills from left to right as the participant progresses through the course pages.  On the dashboard, however, both visual and numerical versions are displayed.
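In code, the same underlying progress value can drive both presentations; the markup below is an assumed sketch, not our actual front-end:

```typescript
// Sketch: one progress value, two presentations (assumed markup).
function renderProgress(progress: number, view: "course" | "dashboard"): string {
  const pct = Math.round(progress * 100);
  // Content pages: visual-only green fill, no number to fixate on.
  const bar = `<div class="bar"><div class="fill" style="width:${pct}%"></div></div>`;
  if (view === "course") return bar;
  // Dashboard: visual bar plus the numeric percentage.
  return `${bar}<span>${pct}%</span>`;
}
```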

Fast-to-slow Progress Bar

After having progress-bar-free courses since our founding, we had sufficient reason to evolve our thinking and dip our toes into the murky waters of progress bars.  We experimented with various profiles and ended up with the one shown below.  A curve-fit formula helped us implement it in code.

For a participant going through the course, we first calculate the course page progress (the x-axis), which is based on the lazy math (current page divided by total pages).  As noted earlier, this does not acknowledge the cognitive load assessment the participant is making, or the important “commitment” steps they are taking during the first few pages of the course.  So we acknowledge those by adjusting the displayed progress upwards over the first quarter of the course.  We also consider the content of the course pages in this adjustment: content presented earlier is foundational, and progress there has a relatively higher impact on subsequent modules.  We expect this to be the case for most instructional courses.
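As a minimal sketch (our actual curve-fit coefficients are not reproduced here, so the exponent below is illustrative), the adjustment looks roughly like this:

```typescript
// Fast-to-slow displayed progress: an illustrative concave curve on
// top of the 8% starting credit (the real curve-fit differs).

const START_CREDIT = 0.08; // earned before the first content page

function displayedProgress(currentPage: number, totalPages: number): number {
  // The "lazy math": raw fraction of pages completed.
  const x = currentPage / totalPages;
  // Concave adjustment: boosts the first quarter (the commitment
  // phase) and tapers towards the end, where goal proximity already
  // supplies motivation.
  const adjusted = Math.pow(x, 0.6);
  return START_CREDIT + (1 - START_CREDIT) * adjusted;
}

// e.g. a quarter of the way through the pages (x = 0.25), the bar
// shows roughly 48% instead of the raw 25%.
```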

>>>>> You have read 95% of this article

What about the “slow” part of the fast-to-slow progress bar?

What goes up must come down, and a progress bar that moves in larger increments at the beginning must slow down towards the end. This raises an important question: isn’t that demotivating towards the end?  Perhaps, but there is strong evidence that we humans naturally experience higher motivation to complete a task when we can visualize (literally see) the goal or the finish line.  For e-learning courses, being near the goal is represented by the progress bar being nearly full (or visually more full than empty).  Evidence of this extra motivation closer to the goal has been the subject of several research efforts (e.g., Cheema and Bagchi 2011).  So, all in all, the fast-to-slow progress bar borrows progress increments from the second half, where extra motivational juice is naturally present, and spends them on the first half, acknowledging the challenges participants have to overcome there.

What’s next

We launched the progress bar about a month ago.  We will be evaluating the engagement data we collect to see how the impact compares with our days of having no progress bar at all.  We will not be evaluating the efficacy of our chosen fast-to-slow progress bar vs. a linear one, as we are satisfied with what the research has concluded.  What we are curious about is improving the accuracy with which we attribute progress in the initial steps, because…

A journey of a thousand miles begins with a single step

Laozi (c. 604 BC – c. 531 BC)

You have reached 100% of this post – Did you observe that you were presented with a “You have read x%” message during the beginning sections of this post?  Did you notice that it went missing?  Was having it there helpful?  From a purely word-count perspective, you were at 34% when we showed 50% completion.  We would love to hear your thoughts on this.