
Campus Climate Surveys for Community Colleges

When conducting a campus climate survey, there are a number of factors to take into account. Community colleges have special considerations in the administration of their campus climate surveys: issues surrounding student demographics and resources can affect both the success and the results of your survey. These considerations and differences do not make conducting a campus climate survey on a community college campus impossible or useless. Understanding these differences enables you to make the changes necessary to tailor your survey to your particular institution's needs.

Student Demographics

Community colleges have a different student body makeup than that of a traditional 4-year college or university. One of the biggest mistakes you could make while conducting a campus climate survey on a community college campus is to assume that all of your students are the same.

To make sure that you are able to understand your survey results within the context of your different student groups, ask distinguishing questions, as well as questions that apply only to those particular students. This can be accomplished by placing skip questions strategically within your survey; that way, students only answer questions that directly apply to them. Here are a few student types that require special consideration.

Part-time students

Community colleges generally have more part-time students than full-time students. This differs from more traditional colleges, where a sizable portion of the student population resides on campus and attends full time. When surveying, make sure to take enrollment status into account. Part-time and full-time students may have different experiences on campus. For example, part-time students may be more likely to take night classes to make room for their work schedules and therefore get a different perspective of campus than day students.

2-year degrees

Campus climate surveys are meant to be administered every two years or so. This timeline is useful for seeing trends over time; conduct the surveys too frequently and it becomes harder to see the effect of policy changes or shifts in opinion. Because many community college students pursue 2-year degrees, many students will not be able to participate in a second survey. If most of your community college's students are pursuing 2-year degrees, it may not make sense to plan on following up with students in future surveys.

Collecting information from 2-year degree students can still yield useful information even if you won't be able to follow up with them in subsequent years, as you could with 4-year students. In a way, you can use this to your advantage: every two years, the majority of your students will be giving you fresh and potentially less biased accounts of their experiences on your campus.

Older Students

Community college students tend to have a broader range of ages and educational backgrounds than students at traditional 4-year colleges and universities. Make sure that the survey instrument allows for skip/branch logic, so that questions that do not apply to the older student demographic can be skipped.


Commuter students

Community colleges see a fair number of students who commute to school. Commuters do not spend the same amount of time on campus, especially at night, and they most likely do not live in campus dorms, where many instances of sexual assault occur. While sexual assault can happen anywhere, at any time, commuters may not be the most helpful group for understanding residential campus climate. Make sure to ask distinguishing questions to sort answers from those who live on campus and those who don't.

Community class members

Those involved in community classes are not considered enrolled students. However, they are important to survey because they interact with the campus as well. The problem with community class members is that they may not be reached by the campus awareness campaigns related to the campus climate survey and may end up not participating in it. Your campus climate survey questionnaire should include demographic questions that allow for segmenting the results by this group. Your college may wish to forgo surveying community class members and focus only on enrolled students; demographic questions can help you manage that inclusion or exclusion.


Resources

The second biggest difference between traditional 4-year universities and community colleges is resources. Community colleges may not have the funding or research teams that are often considered necessary to conduct a successful campus climate survey. However, do not let your funding or campus resources discourage you. Campus climate surveys can be conducted on tight budgets and with non-social-science research teams.

Lack of funds

Community colleges often work with tight funds, and conducting a full campus climate survey can be an expensive project. For campuses with tight funding, hiring an outside consulting team may make economic sense, especially if you do not want to pay for survey software and statistical packages. Fortunately, most colleges already have access to at least one of these types of programs, which removes one of the biggest costs of conducting and analyzing a survey.

Another way to save funds is to assemble a small but competent team to create and conduct the survey. Besides keeping the team small, the survey may need to be shortened at first to save money on surveying costs. A short, concisely written survey is better than no survey at all.

No research teams

Not all community colleges have research departments. This is especially relevant to trade and vocational schools, which likely do not have the social science research skills or departments often used to conduct campus climate surveys. The good news is that a research department or school of social science is not necessary to conduct a successful campus climate survey. The lack of these resources just means your college must get more creative about who creates the survey. Research skills transfer across many disciplines, and there are bound to be students or faculty at your community college qualified for the job.


Conducting and administering a campus climate survey at a community college poses its own unique challenges, but it is not an impossible project. With the proper preparations and considerations taken into account, your survey can be successful and yield valuable information about your campus.




Five Resources for Conducting Campus Climate Surveys at a Community College

Campus climate surveys are not only useful to 4-year private institutions and public universities; they can be very useful to community colleges as well. Although there are special circumstances that need to be taken into account when conducting climate surveys at community colleges, the surveys can be adapted and administered successfully.

Here is a list of 5 helpful resources to assist you in creating and administering your campus climate survey.

The Official Government Website on Campus Climate Surveys

This is the official government website surrounding the issue of sexual harassment and assault. It gives general information about “how to respond to and prevent sexual assault.”

On their homepage, there is a link to their Resource Guide, which in turn gives schools several links to sources that are intended to help schools create their own campus climate surveys. The government’s own toolkit is included, which gives schools information on why the surveys are important, what they aim to accomplish, and how to successfully create and distribute a survey. This guide also includes a sample campus climate survey, which can be used as inspiration and adapted to your campus’ needs.

Learn from Other Colleges' Campus Climate Survey Experiences

Possibly the most helpful resource for community colleges is learning how other schools carried out their own campus climate surveys. Rutgers University-New Brunswick was the first university to run a pilot of the new campus climate survey. They documented their process and created a document to share what they learned with other universities.

In their executive summary of their findings, they briefly mentioned the key lessons that they learned through the surveying process.

  • “Campus climate surveys provide more meaning when they are part of a larger assessment process.”
  • “The administration of campus climate surveys has the most impact when it is linked with the development of an action plan.”
  • “One size does not fit all.”
  • “It is important to find ways to represent all student voices.”
  • “A campus climate survey can be an educational tool in and of itself.”

We highly recommend reading Rutgers' Campus Climate Surveys: Lessons Learned from the Rutgers-New Brunswick Pilot Assessment in full, as it provides very valuable insight into the survey process and addresses problems they ran into. Detailed sections include methodology, preparation of assessment measures, implementation of measures, data analysis, and feedback on the survey experience.

Community Colleges and Campus Climate Surveys

Other community colleges' experiences with campus climate surveys serve as good resources. There is an ever-expanding number of community colleges that have completed campus climate surveys and published their findings online; Grand Rapids Community College and Feather River College are just two that have publicly shared their experiences and findings. Since campus climate surveys should be tailored to your institution's specific needs, make sure to make the necessary changes to any sample survey to reflect your campus.

Online Survey Tools to Administer the Campus Climate Survey

There are numerous ways to survey a group of people, including interviews, mailed surveys, telephone calls, and handing out questionnaires. All methods have their limitations, but the most adaptable and cost-effective surveying method for community colleges is the online questionnaire.

In choosing the right surveying tool, it is important to understand what capabilities you want it to have. Do you need a high level of control over how the survey looks and feels? Do you need your statistical analysis to be done within that program, or do you only need a simple collection method whose results you then analyze in a bona fide statistical package?

Idealware has a great article about different surveying tools and the needs they fill, breaking down the tools by ability and price. On one end of the spectrum are simple and affordable options such as Survey Monkey; on the other end is Qualtrics, a much pricier option. The core requirement for campus climate surveys is that the survey tool ensure anonymity (e.g., IP addresses or any other identifiable data are not captured as part of the response).

Some colleges are engaging third-party companies to run the campus climate survey as a stand-alone research project. The budgetary requirements for this approach tend to be significantly higher. For community colleges, there is a range of options that can fit their budget and augment their capabilities.

Statistical Packages to Analyze Campus Climate Survey Response Data

For those looking to analyze survey results in a statistical program, on one end of the spectrum are simple tools such as Excel, and on the other end are highly sophisticated analysis packages such as STATA and SPSS, both heavily used in social science research. Many community colleges have access to some sort of statistical program. An equally capable analytics package gaining a lot of traction in academic research and data analytics is R. It is free and open source, and its vast array of analytics and graphing libraries makes it a formidable competitor to STATA and SPSS.


The benefit of using statistical packages in the analysis of surveys is that you can test whether your results are statistically significant. These packages are powerful and offer a multitude of ways to analyze your data, many of which cannot be performed in online survey tools. While they require skill to master, they yield powerful and trusted results.

Resources on your Campus

The last, but most important resource that is available to you in creating and administering your campus climate survey is your campus itself! Community colleges are filled with faculty and students qualified to assist you with survey research, especially those studying social science. In the planning of your survey, don’t forget to utilize the talent on your campus!

In the end, a campus climate survey conducted at a traditional 4-year university and one conducted at a community college may need to be adapted differently to meet each campus' needs, but the resources used to create and distribute the surveys have a lot in common. By recognizing the tools available to you and your academic institution, you will be able to ensure the success of your campus climate survey.


Campus Climate Surveys: What to Expect in 2016

2016 is a big year for campus climate surveys and legislation. The surveys aim to give institutions the opportunity to better understand their campuses and to make informed decisions about how to create and improve the safety of their educational environments. Recently, a US Senate hearing on July 29, 2015 addressed the official bill, the Campus Accountability and Safety Act, and the White House Task Force is adamant about cracking down on sexual assault and violence on campus. With that, this upcoming year campus climate surveys will potentially be federally mandated for all colleges and universities.

"The first step in solving a problem is to name it and know the extent of it -- and a campus climate survey is the best way to do that." (The White House Task Force to Protect Students from Sexual Assault)

Several states, like California and New York, have already adopted these surveys and other policies like "Yes Means Yes", a standard that requires affirmative consent — affirmative, conscious and voluntary agreement to engage in sexual activity — throughout the encounter, removing ambiguity for both parties. With these two proactive states leading the way, campus climate surveys will soon be required as part of Title IX/Clery Act compliance programs. So what do these surveys entail?

They assess student and employee knowledge about:

    • The Title IX Coordinator’s role;
    • Campus policies and procedures addressing sexual assault;
    • How and where to report sexual violence as a victim/survivor or witness;
    • The availability of resources on and off campus, such as counseling, health, academic assistance;
    • The prevalence of victimization and perpetration of sexual assault, domestic violence, dating violence, and stalking on and off campus during a set time period (for example, the last two years);
    • Bystander attitudes and behavior;
    • Whether victims/survivors reported to the College/University and/or police, and reasons why they did or did not report;
    • The general awareness of the difference, if any, between the institution’s policies and the penal law; and
    • The general awareness of the definition of affirmative consent.

With these topics being brought to light in the college community, many people will now be educated about the basics of sexual assault and what to do if involved. Many people remain unaware of the severity and frequency of sexual assaults, so the surveys are an excellent tool for providing "education" to people who would otherwise be blind to an offensive and serious situation.

Surveys Still Have Areas of Ambiguity

The surveys seem invaluable, but there are also challenges, like areas of ambiguity in their utilization and implementation. It is unclear for what purpose a climate survey would be used: "Is it intended as a consumer information tool, an institutional improvement tool, an enforcement mechanism or some combination of all three?" The answer to this question could have a substantial impact on how a survey is designed and on how schools and others react to its results.

Still, the potential for these survey results to impact schools across the nation erases many doubts, even as their usage and implementation continue to be questioned. As legislation improves and becomes widespread, we will begin to see change in colleges and universities. California and New York provide proof that campus climate surveys and "Yes Means Yes" legislation can work and be properly enforced.

Forerunners New York and California Enact 2015-16 Bills

New York and California are the first two of the fifty states to enact "Yes Means Yes" legislation, requiring campus climate surveys and addressing sexual assault statewide. As these forerunners continue to implement their policies, they set an example for the rest of the states to follow.

State by state Campus Climate Survey requirements as of Dec 2015

The leader of the pack, California, created a standard in 2014 that requires affirmative consent throughout the encounter, removing ambiguity for both parties. The law protects both individuals by ensuring that there is a mutual understanding. The legislation deems that a person who is incapacitated by drugs or alcohol cannot give consent. With this legislation, California colleges are being held more accountable for prevention, evaluation, and a consistent protocol surrounding sexual assault.

New York’s Campus Climate Assessment Policy gives institutions the opportunity to increasingly understand their campus and to make informed decisions when it comes to providing a safe educational environment. Beginning in the 2015-2016 academic year, each State University of New York State-operated and community college will conduct a uniform climate survey that ascertains student experience with and knowledge of reporting and college adjudicatory processes for sexual harassment, including sexual violence, and other related crimes.

With these states setting the standard, affirmative consent laws and policies are making their way through the states. To keep updated with continuing legislation, here is an updated list of Title IX schools under investigation for sexual assault by the US Department of Education.

"Affirmative consent legislation isn't just about the more than 20 percent of young women and girls who will have to live as assault survivors. It's about the 100 percent of women who have to live every day, never quite certain of their physical safety. Research shows that with affirmative consent education, we can create a culture of respect."


Using Custom Fonts with AWS OpsWorks, Cloudfront and Rails 4.1

We attempted to update the fonts of our course apps to custom fonts, and it was a very painful experience. Here is a summary of the challenge, the various solutions we banged our heads against, and finally the solution that worked.

The setup

The following problem and solution will only apply to you if you are using a setup similar to ours: an AWS OpsWorks-hosted Rails app using CloudFront as the CDN. If that is your setup, you will face some interesting challenges.

The Problem

The browser throws an error and refuses to load the font because the font is coming from a different domain (e.g., your CloudFront distribution's domain) than what is displayed in the address bar. When the browser issues a GET request for the font file from CloudFront, the response does not include an Access-Control-Allow-Origin header permitting the site in the address bar to use the font. In a way, the browser is preventing you from hot-linking a font file, even though it is a licensed font and you have it in your repo assets.

Why does this problem exist?

Fonts are special assets: unlike images, JS, and CSS, they come with higher intellectual property protection standards, and browsers are starting to recognize that. If your assets are being loaded via CloudFront's SSL URL, everything except fonts will load fine; for the fonts, the browser console will show a CORS error.

Rule out "quick fixes" for opsworks / cloudfront setup

  1. You can whitelist CORS settings on CloudFront; while necessary, this is insufficient, because CloudFront will just cache the headers as provided by your load balancer (meaning your nginx/app instance)
  2. Changing the nginx configuration would have been an easy fix, but it will not work: if you are using the OpsWorks standard setup/deploy recipes, nginx header customization is not an option for you
  3. The font-asset gem will not work because your Rails app never gets to serve the font assets at all; assets are served by nginx by default, as it sees them in the public/assets directory and saves your Rails stack from having to serve them
  4. Setting "config.serve_static_assets = true" will not work; again, nginx in the standard OpsWorks setup acts as a reverse proxy and will serve a precompiled asset (font) just as it would serve an image, without hitting your Rails app
  5. The rails-cors gem will not work, for the same reasons as before: precompiled assets get served by nginx, and there is no easy way to reach back into your app where gems like rails-cors and font-asset could help
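For reference, here is a sketch of roughly what the nginx change in fix #2 would look like, if the standard OpsWorks recipes allowed vhost customization (they do not); the domain is a placeholder for your own:

```nginx
# Hypothetical fix: attach a CORS header to any statically served font
# file, so browsers loading the site from your domain may use the fonts.
location ~* \.(ttf|eot|woff|woff2|svg)$ {
    add_header Access-Control-Allow-Origin "https://www.example.com";
}
```

Since this option is off the table with OpsWorks, the workaround below moves the fonts behind a separate CloudFront origin instead.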

The Hack

We were able to get custom fonts to work only by setting up another origin in CloudFront which hits an S3 bucket for all paths with "webfont.*" in their names. For this to work, the fonts had to be renamed to "xxx-webfont.ttf"-style filenames. Here is the list of changes we made to fix the CORS pain:

  • Setup an S3 bucket which will have all your fonts
  • Rename the fonts to have the word "webfont" in the name
  • Setup a CORS policy in this bucket (sample here)
  • For your cloudfront distribution, which already has a Default origin setup, add a new origin which will point to this s3 bucket
  • S3 bucket permissions
    • You can have your S3 bucket set up as a "public website" if you are OK with all the fonts in your bucket being publicly accessible; we weren't. We created the bucket with default permissions and then let the CloudFront Origin setting create a special policy to access this S3 bucket
  • Now create a new Origin in CloudFront; in Origin Settings make sure to:
    • Set Restrict Bucket Access = Yes
    • In Your Identities, let it Create a new Identity (this will be used by cloudfront to access your restricted s3 bucket which has the fonts)
    • Set Grant Read Permissions on Bucket = "Yes, Update Bucket Policy".  This will create the policy for the bucket and let cloudfront access it
    • In Origin Behavior, make sure to set the path pattern to "-webfont.*", which will ensure that any request path containing webfont is directed to the S3 bucket origin you just created; all other paths will hit your standard Rails stack and be cached as before
    • Also make sure to set Forward Headers as whitelist, which will allow you to add Origin as a whitelisted header
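As a sketch, the bucket CORS policy (step three above) looks roughly like this; the AllowedOrigin is a placeholder for your site's domain:

```xml
<CORSConfiguration>
  <CORSRule>
    <!-- The domain shown in the address bar; replace with your own -->
    <AllowedOrigin>https://www.example.com</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
  </CORSRule>
</CORSConfiguration>
```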

[Screenshot: CloudFront origin settings]

  • Now, in your Rails app, your CSS needs to use plain "url" references instead of the Rails font URL helpers. The reason is simple: these font files are static, we do not expect to generate digests for them, and plain "url" lets us access the files by their original names instead of names with the hash digest appended. We wanted to keep things simple, since we will not be changing the fonts frequently, and if we do, the filenames will be different enough not to need the hash digest (unlike assets such as application.js, which change frequently while keeping the same filename). You can always use a query string to override the caching behavior if you are really concerned that stale font files are hanging around the interwebs for too many hours infecting people's vision. If you really want to use the precompiled hash-digest files, then you will need to copy them to the S3 bucket; you have three options here. We didn't bother with this and are OK with copying the font files to the bucket manually as part of our process:
    1. In your chef recipes, you can have a step after asset pre-compilation to copy the assets to the bucket
    2. If you use a continuous integration setup (we use Codeship), then you can use that service to precompile and copy assets (font files, e.g. *.ttf, *.eot, *.wof*) over to the s3 bucket with every deploy
    3. You can use the asset-sync gem to copy over the assets after the pre-compilation step
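Putting it together, the CSS ends up looking roughly like this (the font family and file names are placeholders):

```css
/* Plain url() instead of the Rails font-url() helper, so the request
   path keeps the un-digested "-webfont" filename that the CloudFront
   path pattern routes to the S3 origin. Names are placeholders. */
@font-face {
  font-family: "MyCustomFont";
  src: url("/assets/mycustomfont-webfont.woff") format("woff"),
       url("/assets/mycustomfont-webfont.ttf") format("truetype");
}
```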

With these changes in place, we were able to get the fonts to work in Chrome, Firefox, and Hindernet Exploder. Too many hours were wasted to learn all this. We wish AWS had made it simple to customize the nginx settings...

We wish you luck!

Your Progress Bar May be Broken


Question - Should e-learning courses have progress bars?

If you answered Yes, then you are not rebelling against the norm: online courses are expected to have a progress bar; it is pretty much a requirement. Everyone has one, and we look for it when we are presented with a multi-page survey or an e-learning course. We want to know where we are.

If you answered No, then we would love to hear your rationale, because this was our position as well since our launch two years ago. Progress bars can be distracting. If I am at 0%, and after 10 minutes of learning and clicking I am at 3%, then it tells me the embarrassingly minuscule progress I have made and that a long and arduous road lies ahead. So is it better not to show anything at all rather than to show a participant that they still have 97% left?

If you answered It Depends, then your answer is a bit more correct than the Yes/No answers… read on and find out why.

>> You have read 20% of this article

Why obsess over progress bars?

Since our livelihood at Get Inclusive depends on delivering engaging educational content through online courses, we have to take all aspects of the learner’s experience very seriously.  A progress bar (or lack thereof) is a very important component of the learner’s experience.  In this post I want to share with you what we found, how we thought about it and how we ended up putting this research to practice.

>>>> you have read 30% of this article

So what is the consensus on progress bars research?

It turned out (to our surprise) that very smart people (with PhDs) have spent a lot of hours researching and publishing on the effectiveness of progress bars. During our search for the ultimate truth on this seemingly simple topic of progress bars, we came across a 2013 meta-analysis that nicely combines all the other research into a neatly packaged consensus. These researchers (from City College London, Google UK and Gallup Poll) cite that:

… effect of progress indicators appear to be mixed, where some studies found that progress indicators reduced drop-offs, some concluded that they increased drop-offs, and yet others found that they had no effect on drop-off rates.

Drop-offs are bad. This is when the participant either stops working on the survey or stops paying attention. For all intents and purposes, a drop-off means you have lost the participant. They will do whatever is needed to get to the end if it is a requirement, but if they had the option, they would rather not participate.

>>>> you have read 45% of this article

Relevance of this study to Get Inclusive

While this research focused on experiments observing participant behavior with multi-page surveys, the relevance to online e-learning courses should be readily apparent. In multi-page surveys, as in online e-learning modules, participants are presented with content (statements, questions, videos, cheesy photos, etc.) and are asked to respond to questions. E-learning modules may place more emphasis on content, but the mechanics of content delivery and participant interaction remain very much the same. Please note that at Get Inclusive we have a 0% cheese policy (we are highly allergic to it), so we obsess over the content as much as over the dilemma of the progress bar.

What really matters in these long interactive content delivery methods (multi-page surveys or e-learning courses) is the drop-off rate. A drop-off happens when a participant closes the browser window, or just tunes out and starts clicking the "next" button as if being chased by killer whales, or physically drops off their chair out of sheer boredom. Drop-offs are bad. They have to be measured and managed. We measure drop-off in terms of the level of engagement on any given page; measuring the number of participants who didn't finish is not enough. Drop-off is not a cliff, it's a slow slide, and as an e-learning training provider or a multi-page surveyor, you should be able to pinpoint where the slow slide into drop-off land began.

>>>> you have read 50% of this article

Not all Progress Bars are Created Equal…

What we really loved about this research is that it took the (ahem, cheese warning) progress on progress bar research to a whole new level, setting up 32 experiments and showing participants three types of progress bars (yes, we were in heaven when we found out that there are different types of progress bars):

  1. Linear progress bar – constant speed, e.g., if you are on page 1 of a 10-page survey, the progress bar jumps 10% as you go from one page to the next.
  2. Slow-to-fast progress bar – starts out slower than your actual progress but speeds up as you get closer to the end, e.g., it may show 5% increments in the beginning but larger gains towards the end.
  3. Fast-to-slow progress bar – as you can guess, this one starts out with bigger leaps in the beginning as you turn the pages but moves slower towards the end.
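As a rough sketch, the three styles can be modeled with simple shaping functions. The study does not prescribe exact formulas; squaring and square-rooting the linear fraction are our illustrative assumptions here:

```ruby
# Hypothetical sketch of the three progress-bar styles.
# For a fraction f = current_page / total_pages:
#   linear:       f          (constant speed)
#   slow-to-fast: f ** 2     (under-reports early, catches up at the end)
#   fast-to-slow: sqrt(f)    (over-reports early, slows near the end)
def progress(current_page, total_pages, style: :linear)
  f = current_page.to_f / total_pages
  shaped =
    case style
    when :linear       then f
    when :slow_to_fast then f**2
    when :fast_to_slow then Math.sqrt(f)
    end
  (shaped * 100).round
end

# Page 2 of a 10-page survey:
progress(2, 10)                        # => 20
progress(2, 10, style: :slow_to_fast)  # => 4
progress(2, 10, style: :fast_to_slow)  # => 45
```

Note how the same page yields wildly different displayed percentages: the fast-to-slow bar shows 45% where the linear bar shows 20%.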

And the winner is…

According to this research, the fast-to-slow progress bar is a clear winner. A constant-speed (linear) progress bar increases drop-off. Slow-to-fast progress bars also increase drop-off rates, as it is plain discouraging to be rewarded "less" than your actual mathematical progress. So why are fast-to-slow progress bars so effective? Before we go there, let us consider the ethical implications of using a progress bar that does not reflect the true mathematical reality of where a person is.

And what about the ethics of this magic trick?

Having a "fast-to-slow" progress bar does seem to be a manipulative mind trick designed to dupe learners into believing that they are further ahead than they actually are. So before we considered whether and how to implement it, we had to be comfortable with the ethics behind it. In this complex domain at the intersection of mathematics, learning, and philosophy, we turned to our favorite Chinese philosopher (no, not Confucius, actually his predecessor Laozi), who said: "A journey of a thousand miles begins with a single step". Let us look at what the steps are for a learner participating in one of our courses:

  1. Participant opens an invitation email sent by their employer/college administrator
  2. Participant clicks on an enrollment link in this email to arrive at the course registration page
  3. Participant enters email/password to register
  4. Participant starts the course and is presented with the content (content pages that take a total of 20-50 minutes to complete)
  5. Participant clicks “done” at the end of the course and receives a PDF certificate

The participant who takes the first step (i.e., clicking the enrollment link) is infinitely more likely to complete the course than one who ignored the email. Should that not count for anything? We believe it should. But then there is another drop-off point: what if they come to the registration page, see it as a less important or less urgent task than other things, and never return? Again, this is another very important step that is mathematically not linked to the progress bar at all in the traditional sense. Why would we present the participant with a big 0% when they have taken these important steps?

No credit for the biggest obstacle to learning?

e-Learning participants face the biggest obstacle at the beginning of the course, one that doesn’t exist towards the end – namely, assessing the cognitive load of the task: how much mental effort is this thing going to take?  Two activities requiring identical time commitments can have very different cognitive loads, e.g. multiplying pairs of 4-digit numbers for 10 minutes vs. reading your favorite newspaper.  Participants are trying to assess whether the cognitive load is worth the positive (or negative) incentives that come with participating in the course.  They make this tradeoff assessment as quickly as they can, in the first few interactions, then confirm it over the next few interactions and either continue or put the course in a “return some other day” bucket.

The very low (less than 10%) completion rate of online courses at sites such as Coursera raises questions about how to measure progress.  The completion rate rises to 25% for students who answered at least one question correctly on the very first quiz (Inside Higher Ed).  Beginnings matter: progress in the first few pages contributes much more towards achieving the goal than progress in the last few pages.

In the case of courses at Get Inclusive, we ask participants to engage in real thinking and self-reflection and to share their thoughts.  Participants aren’t putting on a video and taking a nap; they have to put mental energy into it.  We are asking them to get past the cognitive-miser part of their brains.  Getting through the first 10 pages is far more challenging than getting through the last few.  Progress in these first interactions has more value because this is where the participant commits to the process.  Why wouldn’t we reward them with faster progress during these first few pages?

Beyond lazy estimates

I hope you are seeing our thought process here.  Mathematical truth is critically important, yet it is only a partial truth.  A progress bar needs to be more than a lazy estimate based on a simple division of current page by total pages.  It needs to consider and acknowledge all the obstacles the participant has overcome to get to where they are, and how this progress relates to reaching the “goal”.  Now we are in the land of heuristics-based progress bars, and we are comfortable walking you through how we implemented one in our courses.

Progress Bars at Get Inclusive

Everyone Starts at 8%

Our customers (employers, college administrators) send out an email introducing us, letting their participants know that we are providing an online course, and communicating any other requirements (deadline, incentives, etc.).  All participants who put in the effort of opening this email and clicking the link in it to register for the course are acknowledged with 8% progress.  Several factors went into this number, including drop-off/engagement rates and, of course, the high importance of these initial steps.

Acknowledging Cognitive Load

Participants are then presented with course content.  We know that they will be assessing the cognitive load and deciding whether to participate in the course now or come back “tomorrow”.  For this main part of the learning process, we show a distraction-free progress bar at the top which visually (not numerically) communicates where the participant is.  You are looking at the first page of the course in the image below; the 8% starting progress carries over visually.

Visual vs. Numeric Progress Bar

We opted for a visual progress bar (as opposed to a numeric one) on the actual content pages.  We believe it matters a lot less whether the progress is 45% or 55%; either way, it is “about half-way”.  And this rounding is perfectly communicated via a green bar that fills from left to right as the participant progresses through the course pages.  On the dashboard, however, both visual and numerical versions are displayed.
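As a toy illustration of “visual, not numeric”, a coarse segmented bar can be rendered without ever printing a percentage.  The segment count and bar characters below are our own illustrative assumptions, not the actual Get Inclusive UI:

```python
def visual_bar(progress: float, segments: int = 10) -> str:
    """Render progress as a filled bar with no number attached.

    A text stand-in for a green fill bar; the coarse segments
    de-emphasize the exact percentage, so nearby values such as
    45% and 55% both read as roughly "half-way".
    """
    filled = round(progress * segments)  # snap to the nearest segment
    return "[" + "#" * filled + "-" * (segments - filled) + "]"
```

With 10 segments, 45% and 55% render as four and six filled segments respectively, close enough that a reader registers “about half” rather than a precise figure.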

Fast-to-slow Progress Bar

After having progress-bar-free courses since our founding, we had sufficient reasons to evolve our thinking and dip our toes into the murky lands of progress bars.  We experimented with various profiles and ended up with the one shown below.  A curve-fit formula helped us implement this in code.

For a participant going through the course, we first calculate the course page progress (x-axis), which is based on the lazy math (current page divided by total pages).  As noted earlier, this does not acknowledge the cognitive load assessment the participant is making or the important “commitment” steps they are taking during the first few pages of the course.  So we acknowledge those by adjusting the displayed progress upwards over the first quarter of the course.  We also consider the content of the course pages in this adjustment: content presented earlier is foundational, and progress there has a relatively higher impact on subsequent modules.  We expect this to be the case for most instructional courses.
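Since the actual fitted formula is not published, here is a minimal sketch of one fast-to-slow mapping, assuming the 8% starting credit and a concave power curve; the exponent 0.6 is our illustrative guess, not the real curve-fit:

```python
def displayed_progress(current_page: int, total_pages: int,
                       start: float = 0.08, exponent: float = 0.6) -> float:
    """Map linear page progress onto a fast-to-slow displayed progress.

    `start` mirrors the 8% credit for registering; the concave power
    curve (exponent < 1) rises quickly over the early pages and slows
    towards the end, while still finishing at exactly 100%.
    """
    x = current_page / total_pages  # the "lazy math": linear progress
    return start + (1.0 - start) * (x ** exponent)
```

Under these assumed parameters, a participant a quarter of the way through a 50-page course would see roughly 48% displayed, while the final pages each add only a few points.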

>>>>> You have read 95% of this article

What about the "slow" part of the fast-to-slow progress bar?

What goes up must come down, and a progress bar that moves in larger increments at the beginning must slow down towards the end.  This raises an important question: isn’t this demotivating towards the end?  While it may seem so, there is strong evidence that, as humans, we naturally experience higher motivation towards completing a task when we can visualize (literally see) the goal or the finish line.  For e-learning courses, being near the goal is represented by the progress bar being nearly full (or visually more full than empty).  This extra motivation closer to the goal has been the subject of several research efforts (e.g., Cheema and Bagchi 2011).  So, all in all, the fast-to-slow progress bar reduces progress increments in the second half, where extra motivational juice is naturally present, and acknowledges the challenges participants have to overcome in the first half with higher progress increments.

What’s next

We launched the progress bar about a month ago.  We will be evaluating the engagement data we collect to see what the impact is compared to our days of having no progress bar at all.  We will not be evaluating the efficacy of our chosen fast-to-slow progress bar vs. a linear one, as we are satisfied with what the research has concluded.  What we are curious about is improving the accuracy with which we attribute progress in the initial steps, because…

A journey of a thousand miles begins with a single step

Laozi (c 604 bc – c 531 bc)

You have reached 100% of this post.  Did you observe that you were presented with a “You have reached x%” message during the beginning sections of this post?  Did you notice that it went missing?  Was having it there helpful?  From a purely word-count perspective, you were at 34% when we showed 50% completion.  We would love to hear your thoughts on this.