
Campus Climate Surveys for Community Colleges

When conducting a campus climate survey, there are a number of factors to take into account, and community colleges have special considerations in administering theirs. Issues surrounding student demographics and resources can affect both the success and the results of your survey. These considerations and differences do not make conducting a campus climate survey on a community college campus impossible or useless. Understanding these differences enables you to make the changes necessary to tailor your survey to your particular institution's needs.

Student Demographics

Community colleges have a different student body makeup than that of a traditional 4-year college or university. One of the biggest mistakes you could make while conducting a campus climate survey on a community college campus is to assume that all of your students are the same.

To make sure that you are able to understand your survey results within the context of your different student groups, ask distinguishing questions, as well as questions that apply only to those particular students. This can be accomplished by placing skip questions strategically within your survey; that way, students only answer questions that directly apply to them. Here are a few student types that require special consideration.

Part-time students

Community colleges generally have more part-time students than full-time students. This differs from more traditional colleges, where a sizable portion of the student population resides on campus and more students attend full time. When surveying, make sure to take enrollment status into account. Part-time and full-time students may have different experiences on campus. For example, part-time students may be more likely to take night classes to make room for their work schedules and therefore get a different perspective of campus than day students.

2-year degrees

Campus climate surveys are meant to be administered every two years or so. This timeline is useful for seeing trends over time; conduct the surveys too frequently and it becomes harder to see the effect of policy changes on opinions. The fact that a lot of community college students pursue 2-year degrees means that many students will not be able to participate in a second survey. If many of your community college's students are pursuing 2-year degrees, it may not make sense to follow up with students for future surveys.

Collecting information from 2-year degree students can still yield useful information even if you won't be able to follow up with them in subsequent years, as you could with 4-year students. In a way, you can use this to your advantage: every two years, the majority of your students will be giving you fresh and potentially less biased data about their experiences on your campus.

Older Students

Community college students tend to have a broader range of ages and educational backgrounds than students at traditional 4-year colleges and universities. Make sure that the survey instrument allows for skip/branch logic so that questions that do not apply to the older student demographic can be skipped.
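
As a rough illustration, here is a minimal sketch of how skip/branch logic can route respondents past questions that do not apply to them (the question IDs and wording below are hypothetical, not from any official instrument):

```python
# Minimal sketch of survey skip/branch logic (hypothetical question IDs).
# A screening question routes respondents past blocks that don't apply.

QUESTIONS = {
    "q_residence": {
        "text": "Do you live in on-campus housing?",
        "options": ["Yes", "No"],
        # Respondents who answer "No" skip the residence-hall block.
        "branch": {"No": "q_commute"},
        "next": "q_dorm_safety",
    },
    "q_dorm_safety": {
        "text": "How safe do you feel in your residence hall at night?",
        "options": ["Very safe", "Somewhat safe", "Unsafe"],
        "next": "q_commute",
    },
    "q_commute": {
        "text": "How do you usually travel to campus?",
        "options": ["Drive", "Public transit", "Walk/bike", "N/A"],
        "next": None,
    },
}

def next_question(current_id: str, answer: str) -> str | None:
    """Return the next question ID, honoring any branch for this answer."""
    q = QUESTIONS[current_id]
    return q.get("branch", {}).get(answer, q["next"])

# Example: an off-campus student never sees the dorm-safety item.
assert next_question("q_residence", "No") == "q_commute"
assert next_question("q_residence", "Yes") == "q_dorm_safety"
```

Most online survey tools expose this kind of branching through their point-and-click interfaces, so in practice you would configure it rather than code it; the sketch just makes the logic explicit.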

Commuters

Community colleges see a fair number of students who commute to school. Commuters do not spend the same amount of time on campus, especially at night, and they most likely do not live in campus dorms, where many instances of sexual assault occur. While sexual assault can happen anywhere, anytime, commuters may not be the most helpful group for understanding residential campus climate. Make sure to ask distinguishing questions so you can sort answers from those who live on campus and those who don't.

Community class members

Those involved in community classes are not considered enrolled students. However, they are important to survey because they interact with the campus as well. The problem with community class members is that they may not be exposed to the campus awareness campaigns related to the campus climate survey and may end up not participating in it. Your community college's campus climate survey questionnaire should include demographic questions that allow for segmenting the results by this group. Your college may wish to forego surveying community class members and focus only on enrolled students; demographic questions can help you manage that inclusion or exclusion.

Resources

The second biggest distinguisher between traditional 4-year universities and community colleges is resources. Community colleges may not have the funding or research teams that are often considered necessary to conduct a successful campus climate survey. However, do not let your funding or campus resources discourage you. Campus climate surveys can be conducted on tight budgets and without dedicated social science research teams.

Lack of funds

Community colleges often work with tight funds, and conducting a full campus climate survey can be an expensive project. For campuses with tight funding, hiring an outside consulting team may make economic sense, especially if you do not want to pay for survey packages and statistical programs. Fortunately, most colleges already have access to at least one of these types of programs, which are otherwise among the biggest costs of conducting and analyzing a survey.

Another way to save funds is to assemble a small but competent team to create and conduct the survey. Besides keeping the team small, the survey may need to be shortened at first to save money on surveying costs. A short, concisely written survey is better than not administering a survey at all.

No research teams

Not all community colleges have research departments. This is especially relevant to trade and vocational schools, which likely do not have the social science research skills or departments often used to conduct campus climate surveys. The good news is that a research department or school of social science is not necessary to conduct a successful campus climate survey. The lack of these resources just means your college must get more creative about whom it picks to create the survey. Research skills transfer across many disciplines, and there are bound to be students or faculty at your community college qualified for the job.

Conclusions

Conducting and administering a campus climate survey at a community college poses its own unique challenges, but it is not an impossible project. With the proper preparations and considerations taken into account, your survey can be successful and yield valuable information about your campus.

 

----

Photo by COD Newsroom  "New Student Orientation"

Five Resources for Conducting Campus Climate Surveys at a Community College

Campus climate surveys are not only useful to 4-year private institutions and public universities; they can be very useful to community colleges as well. Although there are special circumstances that need to be taken into account when conducting climate surveys at community colleges, they can be adapted and administered successfully.

Here is a list of 5 helpful resources to assist you in creating and administering your campus climate survey.

NotAlone.gov on Campus Climate Surveys

This is the official government website addressing the issue of sexual harassment and assault. It gives general information about “how to respond to and prevent sexual assault.”

On their homepage, there is a link to their Resource Guide, which gives schools several links to resources intended to help them create their own campus climate surveys. The government's own toolkit is included, which gives schools information on why the surveys are important, what they aim to accomplish, and how to successfully create and distribute a survey. This guide also includes a sample campus climate survey, which can be used as inspiration and adapted to your campus's needs.

Learn from Other Colleges' Campus Climate Survey Experiences

Possibly the most helpful resource for community colleges is learning how other schools carried out their own campus climate surveys. Rutgers University--New Brunswick was the first university to run a pilot of the new campus climate survey. They documented their process and published a report to share what they learned with other universities.

In the executive summary of their findings, they briefly outline the key lessons they learned through the surveying process.

  • “Campus climate surveys provide more meaning when they are part of a larger assessment process.”
  • “The administration of campus climate surveys has the most impact when it is linked with the development of an action plan.”
  • “One size does not fit all.”
  • “It is important to find ways to represent all student voices.”
  • “A campus climate survey can be an educational tool in and of itself.”

We highly recommend reading Rutgers' Campus Climate Surveys: Lessons Learned from the Rutgers-New Brunswick Pilot Assessment in full, as it provides very valuable insight into the survey process and addresses problems they ran into. Detailed sections include methodology, preparation of assessment measures, implementation of measures, data analysis, and feedback on the survey experience.

Community Colleges and Campus Climate Surveys

Community colleges' experiences with campus climate surveys serve as good resources. There is an ever-expanding number of community colleges that have completed campus climate surveys and published their findings online. Grand Rapids Community College and Feather River College are just two that have publicly shared their experiences and findings. Since campus climate surveys should be tailored to your institution's specific needs, make sure to make the necessary changes to any sample survey so it reflects your campus.

Online Survey Tools to Administer the Campus Climate Survey

There are numerous ways to survey a group of people, including in-person interviews, mailed surveys, telephone calls, and handed-out questionnaires. All survey methods have their limitations, but the most adaptable and cost-effective method for community colleges is the online questionnaire.

In choosing the right surveying tool, it is important to understand what capabilities you want it to have. Do you need a high level of control over what the survey looks and feels like? Do you need your statistical analysis to be done within the program, or do you only need a simple collection method whose results you then analyze in a bona fide statistical package?

Idealware has a great article about different surveying tools and the needs they fill. They break down the different tools by ability and price. On one end of the spectrum are simple and affordable options such as Survey Monkey, and on the other end is Qualtrics, a much pricier option. The core requirement for campus climate surveys is that the survey tool ensure anonymity (e.g., IP addresses or any other identifiable data are not captured as part of the response).

Some colleges are engaging third-party companies to run the campus climate survey as a stand-alone research project. The budgetary requirements for this approach tend to be significantly higher. For community colleges, there is a range of options that can fit their budget and augment their capabilities.

Statistical Packages to Analyze Campus Climate Survey Response Data

For those looking to analyze survey results in a statistical program, on one end of the spectrum are simple tools such as Excel, and on the other end are highly sophisticated analysis packages such as STATA and SPSS, both heavily used in social science research. Many community colleges have access to some sort of statistical program. An equally capable analytics package gaining a lot of traction in academic research and data analytics is R. It is free and open source, and its vast array of analytics and graphing libraries makes it a formidable competitor to STATA and SPSS.

 

The benefit of using statistical packages in the analysis of surveys is that you can test whether your results are statistically significant. These packages are powerful and offer a multitude of ways to analyze your data, many of which cannot be performed in online survey tools. While they require skill to master, they yield powerful and trusted results.
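
As an illustration of the kind of significance test these packages make routine, here is a minimal sketch in Python using the free scipy library (the response counts are made up for the example):

```python
# Minimal sketch: testing whether safety perceptions differ between
# residential and commuter students (hypothetical counts).
from scipy.stats import chi2_contingency

# Rows: student group; columns: "feel safe at night" Yes / No counts.
observed = [
    [220, 80],   # residential students
    [310, 60],   # commuter students
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 would suggest the two groups answer differently.
```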

Resources on your Campus

The last but most important resource available to you in creating and administering your campus climate survey is your campus itself! Community colleges are filled with faculty and students qualified to assist with survey research, especially those studying social science. In planning your survey, don't forget to utilize the talent on your campus!

In the end, a campus climate survey conducted at a traditional 4-year university and one conducted at a community college may need to be adapted differently to meet each campus's needs, but the resources used to create and distribute the surveys have a lot in common. By recognizing the tools available to you and your academic institution, you will be able to ensure the success of your campus climate survey.

 

Campus Climate Survey and New Initiatives at University of Chicago

The University of Chicago released its Spring 2015 Climate Survey NORC Report on September 1, 2015. NORC, an independent research organization, conducted the Sexual Misconduct Survey: “Attitudes, Knowledge and Experience” (also referred to as the Spring 2015 Climate Survey) on behalf of the University of Chicago in April 2015.

Design of the University of Chicago’s Campus Climate Survey

It was designed by a University faculty committee, based partially on a similar survey conducted by the Massachusetts Institute of Technology and on the “Not Alone” survey toolkit created by the White House Task Force to Protect Students from Sexual Assault.

Campus Climate Survey Achieved a Response Rate of 31.7%

All enrolled students who were at least 18 years of age at the time of the survey were invited to participate during a two-week field period that ended on April 28, 2015. A total of 12,485 undergraduate, graduate, and professional students at the University's Hyde Park campus were invited to participate in the survey. Completed surveys were received from 3,955 students, for an overall response rate of 31.7%.

New Programs and Priorities

After the survey was completed, the University of Chicago announced new initiatives for the coming year to address issues raised by the Campus Climate Survey. They intend to substantially redesign the content and approach of their sexual misconduct prevention training, provide training for all new graduate and professional students, and launch a student-run website (umatter.uchicago.edu) that makes university policy, procedures, and resources easier to access. You can view all of their initiatives here.

The University's proposed efforts are its first steps toward addressing the results of this year's Campus Climate Survey. During the fall quarter, Campus and Student Life (CSL) staff will bring together interested students to discuss key findings from the Spring 2015 Climate Survey, with a particular focus on how the results can best inform the University's prevention and education strategies. This will provide an inclusive forum for both students and staff to help improve policy and community education about sexual assault on campus.

You can find the Spring 2015 Climate Survey Executive Summary here.

Campus Climate Surveys: What to Expect in 2016

2016 is a big year for campus climate surveys and legislation. The surveys aim to give institutions the opportunity to better understand their campus and to make informed decisions about how to create and improve the safety of their educational environment. Recently, a July 29, 2015 US Senate hearing addressed the Campus Accountability and Safety Act. The White House Task Force, meanwhile, is adamant about cracking down on sexual assault and violence on campus. As a result, campus climate surveys may be federally mandated for all colleges and universities this upcoming year.

"The first step in solving a problem is to name it and know the extent of it -- and a campus climate survey is the best way to do that." (The White House Task Force to Protect Students from Sexual Assault)

Several states like California and New York have already adopted these surveys and other policies like “Yes Means Yes”. This is a standard that requires affirmative consent — affirmative, conscious and voluntary agreement to engage in sexual activity — throughout the encounter, removing ambiguity for both parties. With these two proactive states leading the way, campus climate surveys will soon be required as part of a Title IX/Clery Act compliance program. So what do these surveys entail?

They assess student and employee knowledge about:

    • The Title IX Coordinator’s role;
    • Campus policies and procedures addressing sexual assault;
    • How and where to report sexual violence as a victim/survivor or witness;
    • The availability of resources on and off campus, such as counseling, health, academic assistance;
    • The prevalence of victimization and perpetration of sexual assault, domestic violence, dating violence, and stalking on and off campus during a set time period (for example, the last two years);
    • Bystander attitudes and behavior;
    • Whether victims/survivors reported to the College/University and/or police, and reasons why they did or did not report;
    • The general awareness of the difference, if any, between the institution’s policies and the penal law; and
    • The general awareness of the definition of affirmative consent.

With these topics brought to light in the college community, many people will now be educated about the basics of sexual assault and what to do if involved. Many people remain unaware of the severity and frequency of sexual assaults, so the surveys are an excellent tool for providing education to people who would otherwise be blind to a serious and offensive situation.

Surveys Still Face Areas of Ambiguity

The surveys seem invaluable, but there are also challenges, such as areas of ambiguity around their utilization and implementation. It is unclear for what purpose a climate survey would be used: “Is it intended as a consumer information tool, an institutional improvement tool, an enforcement mechanism or some combination of all three?” The answer to this question could have a substantial impact on how a survey is designed and on how schools and others react to its results.

However, the potential for these survey results to impact schools across the nation is growing, even as their usage and implementation continue to be questioned. As legislation improves and becomes widespread, we will begin to see change in colleges and universities. California and New York provide proof that campus climate surveys and “Yes Means Yes” legislation can work and be properly enforced.

Forerunners New York and California Enact 2015-16 Bills

New York and California are the first two of the fifty states to enact “Yes Means Yes” legislation requiring campus climate surveys and legislating against sexual assault statewide. As these forerunners continue to implement their policies, they set an example for the rest of the states to follow.

State by state Campus Climate Survey requirements as of Dec 2015

The leader of the pack, California, created a standard in 2014 that requires affirmative consent throughout the encounter, removing ambiguity for both parties. The law protects both individuals by ensuring that there is a mutual understanding. The legislation deems that a person who is incapacitated by drugs or alcohol cannot give consent. With this legislation, California colleges are being held more accountable for prevention, evaluation, and a consistent protocol surrounding sexual assault.

New York's Campus Climate Assessment Policy gives institutions the opportunity to better understand their campus and to make informed decisions when it comes to providing a safe educational environment. Beginning in the 2015-2016 academic year, each State University of New York State-operated and community college will conduct a uniform climate survey that ascertains student experience with and knowledge of reporting and college adjudicatory processes for sexual harassment, including sexual violence, and other related crimes.

With these states creating the standard, affirmative consent laws and policies are making their way through the states. To keep updated with continuing legislation, here is an updated list of Title IX schools under investigation for sexual assault by the US Department of Education.

“Affirmative consent legislation isn't just about the more than 20 percent of young women and girls who will have to live as assault survivors. It's about the 100 percent of women who have to live every day, never quite certain of their physical safety. Research shows that with affirmative consent education, we can create a culture of respect.”

 

Campus Climate Surveys - How The Legislation Has Evolved


The Early Aspects Of Legislation

Over the past two years, campus sexual violence has grabbed the attention of filmmakers, lawmakers, and the White House. The issue is garnering lots of attention — which is good for students' safety at colleges and universities.

From the start, United States legislation has struggled to concisely define “affirmative consent”. The policy colloquially known as “No Means No” proved problematic early in its adoption at universities because of its loose definition of sexual assault. For many years, student reports of assault were frequently mishandled, which discouraged survivors from coming forward and identifying themselves as victims of sexual assault and domestic violence. The majority of state laws define campus sexual assault using the old consent standard (“No Means No”). However, there are sex offenses that can be charged in the criminal justice system using an affirmative consent standard. While “no means no” has become a well-known slogan, it places the burden on victims, making it their responsibility to adamantly show resistance.

Unfortunately, early legislation faltered in identifying assaults and in effectively executing its laws to keep college students safe. Since “No Means No” and similar policies have been lackluster in their effect, the federal government and several states have taken strides toward improvement.

This has led to the establishment of campus climate surveys. The surveys are created to afford institutions the opportunity to better understand their campus and make informed decisions when it comes to providing a safe educational environment. States that adopt them will conduct a uniform climate survey that ascertains student experience with and knowledge of reporting, as well as college adjudicatory processes for sexual harassment, sexual violence, and other related crimes.

Recently, the state of California enacted SB 967 to make “Yes Means Yes” the consent standard on college campuses, a major step toward preventing sexual violence. This legislation requires preventative education during student orientation, increased access to counseling resources, and training for adjudication panels. Thus, we begin to see a shift toward policy and legislation adopted by universities that increases security and helps prevent continued sexual violence on campuses.

Confining The Confusion of Legislation

Presently, the implementation of campus climate surveys and new legislation has made huge strides in confining the confusion around sexual assault on campus. Legislation is beginning to narrow down the definitions of consent and sexual assault. Each state has its own interpretation; California's, for example, reads: “‘Affirmative consent’ means affirmative, conscious, and voluntary agreement to engage in sexual activity. Lack of protest or resistance does not mean consent, nor does silence mean consent.” Making these terms transparent is very important to effectively tackling the problem.

Until now, these sanctions had been voluntarily adopted by colleges; SB 967 gives them the backing of a government mandate. In addition to creating a vaguely and subjectively defined offense of nonconsensual sex, the bill also explicitly places the burden of proof on the accused, stating that they must demonstrate that they took “reasonable steps … to ascertain whether the complainant affirmatively consented.” Policies like this are controversial and continue to be debated.

One of the task force's recommendations for revising sexual misconduct policies included defining consent as a "voluntary agreement to engage in sexual activity." Past consent should not imply future consent, nor should silence or the absence of resistance, the guidelines recommend.

Schools nationwide are in the process of rewriting, or have already rewritten, their sexual assault policies, procedures, and prevention education programs to meet the standards of the Campus Sexual Violence Elimination Act, known as the Campus SaVE Act, which took effect in 2013 as part of the reauthorization of the Violence Against Women Act. With a fairer process, more students are coming forward to report crimes, and in time campuses will be safer.

Improvement of the Federal Government's Enforcement Efforts

The Obama administration is working to improve the federal government's enforcement efforts and to make them more transparent in practice. President Obama states:

“We need to build on these efforts. To better address sexual assault at our nation’s schools, we need to both strengthen our enforcement efforts and increase coordination among responsible federal agencies. Also, and importantly, we need to improve our communication with students, parents, school administrators, faculty, and the public, by making our efforts more transparent.”

In 2014, President Obama issued this call to action to identify the problem and to solve it. The White House Task Force was created to protect students from sexual assault. It has proposed a new standard for dealing with sexual assault on campus and strives to show survivors that they are not alone. The Task Force helps schools live up to their obligation to protect students from sexual violence, with four goals:

  1. Identify the scope of the problem on college campuses;
  2. Help prevent campus sexual assault;
  3. Help schools respond effectively when a student is assaulted; and
  4. Improve, and make more transparent, the federal government’s enforcement efforts.

Obama has stated that campus climate surveys are necessary and will be mandated in 2016. He believes that requiring schools to periodically conduct a climate survey will change the national dynamic: the federal government will have a better picture of what's really happening on campus, and schools will be able to more effectively tackle the problem and measure the success of their efforts (notalone.gov).

Developing A Comprehensive Sexual Misconduct Policy

The Task Force has created a way for colleges and universities to have an easily accessible, user-friendly sexual misconduct policy. They realize that many schools do not have adequate policies and that no one approach suits every school, so they have created a policy aid to help with the grey areas each school will face given its diversity. The White House Task Force states:

We are providing schools with a checklist for a sexual misconduct policy. This checklist provides both a suggested process for developing a policy, as well as the key elements a school should consider in drafting one. Importantly, schools should bring all the key stakeholders to the table – including students, survivors, campus security, law enforcement, resident advisors, student groups (including LGBTQ groups), on-campus advocates, and local victim service providers. Effective policies will vary in scope and detail, but an inclusive process is common to all.

In June of 2014, they provided schools with a sample Memorandum of Understanding (MOU) with local law enforcement. The MOU can help open lines of communication and increase coordination among campus security, local law enforcement, and other community groups that provide victim services. This can also improve security on and around campus, make investigations and prosecutions more efficient, and increase officers' understanding of the unique needs of sexual assault victims. They will explore legislative or administrative options to require colleges and universities to conduct an evidence-based survey, also known as a campus climate survey, in 2016.

In essence, the White House Task Force has played a big role in improving college and university legislation around sexual assault. It seeks to clearly define the problem, identify it, and solve it. The implementation of campus climate surveys is a very important aspect of this process, and their mandatory implementation in 2016 would only lead to vast improvement in sexual assault policy and legislation on campuses nationwide. Policy and legislation have only improved as progressive policies have been implemented, helping awareness of sexual assault spread all over the nation.

 

Photo by Mark Fischer

The 7 Steps to Conducting a Successful Campus Climate Survey

If your educational institution has decided to conduct a campus climate survey, you may be struggling with where to begin. You are not alone; designing a successful campus climate survey is a complex task, one that requires thorough planning, collaboration, and hard work by everyone involved to yield useful information.

The United States government has issued a guide to help colleges in their efforts to reduce sexual assault on their campuses. This guide goes in depth on creating campus climate surveys. It is full of valuable information and deserves a thorough reading. However, for those looking for a quick overview, here are 7 basic steps to creating a campus climate survey to get you moving in the right direction.

Step 1 - Set Goals and Milestones

Sit down with your administrators, deans, and Title IX coordinators to understand what kind of information you want to gain from conducting a campus climate survey. The goals and requirements you set for the final survey will become your guideline for developing it. It is also very important to set deadlines for the major milestones, which include:

  1. Survey design
  2. IRB approval
  3. Finalizing technology/administration and analytical setup
  4. Administering the survey
  5. Conducting analysis
  6. Publishing results
  7. Determining action items, priorities, budget, roles/responsibilities for upcoming year

Step 2 - Engage with IRB

The order of this step may change depending on your educational institution's approval process. Most colleges and universities will have a formal Institutional Review Board (IRB) process to make sure the survey follows human subject research guidelines; this may require proposing the project before any work is done. Other IRBs or campus review boards will only need to approve the final survey draft before it is sent out for responses. Check your university's or college's human subject research guidelines before administering surveys.

Step 3 - Assemble a Team

After you have reached a consensus on the goal of your campus climate survey and know how you will engage with the IRB, it is time to assign roles and responsibilities for survey creation, review, and distribution. You will most likely need a multi-disciplinary team to help you. Preferably, you will have representation and cooperation from the following participants:

  • Research faculty (social science research)
  • Academics
  • Administration
  • Student representatives
  • Grad research assistants
  • Title IX administrators and coordinators
  • Counseling services employees

If you are reading this blog post, you will most likely be the main coordinator who comes up with the initial list of survey questions and sets review meetings and deadlines to keep the project progressing.

Step 4 - Create and review survey

With your team assembled, many workshops and planning sessions will be needed to complete the survey creation process. These workshops will be essential to ironing out what questions are needed to obtain the information you are looking for. During these meetings, it is very important to take notes, particularly on the rationale for each question that is included. These notes will prove a valuable resource, especially when you run the survey again.

When the survey starts to come together, it is wise to test your campus climate survey with a focus group. These participants will be able to give you feedback, making you aware of things you need to change, questions you need to clarify, etc.

Finally, always review your final survey before it is sent out to your respondents. Some things to check for other than grammar and spelling are length, biased/leading questions and statements, confusing or misleading questions, etc.

Step 5 - Administer the Campus Climate survey

When your campus climate survey is completed and ready to be distributed, make sure to do it in a way that protects your respondents' anonymity. There are a number of ways to distribute surveys, with the most popular being internet-based distribution. By using online survey programs such as Qualtrics or Survey Monkey, you have control over when the surveys open and close. They also offer tools to analyze the data as it comes in.

Make sure to distribute your survey to as many people as possible to ensure you get a good number of responses. Too few responses can lead to results that are not statistically sound.
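
To get a feel for why response counts matter, here is a minimal sketch of the standard margin-of-error calculation for a survey proportion, assuming simple random sampling (the response counts are hypothetical):

```python
# Minimal sketch: margin of error for a survey proportion.
# z = 1.96 corresponds to a 95% confidence level.
import math

def margin_of_error(n_responses: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst case is p = 0.5; more responses shrink the margin."""
    return z * math.sqrt(p * (1 - p) / n_responses)

for n in (100, 400, 1600):
    print(f"{n:>5} responses -> +/- {margin_of_error(n):.1%}")
# 100 responses -> +/- 9.8%; 400 -> +/- 4.9%; 1600 -> +/- 2.5%
```

Notice that quadrupling the number of responses only halves the margin of error, which is why wide distribution matters so much.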

Step 6 - Remind your respondents to take the survey

To increase your respondent numbers, make sure to periodically remind people to take the survey. Do not overburden them with reminders, however, or this could lead to people becoming irritated and less likely to respond.

Another way to gain more respondents is to incentivize the survey. This could be accomplished by offering a tangible reward, either to all respondents or to one or a few lucky respondents. Not all colleges and institutions offer incentives for their surveys, so be sure to discuss options that comply with your institution's values and budget.

Step 7 - Start analyzing the Campus Climate Survey data & results

After the survey response collecting time period has passed, or you have reached your sampling goal, close the survey. You may now start analyzing the results!

Many survey programs will allow you to analyze your results with their tools, but most will not be able to analyze open-ended questions. Make sure to analyze these answers carefully, as they often provide unique and interesting insights that cannot be collected through traditional multiple choice questions. You may need a grad assistant to "code" the responses so they can be analyzed as quantitative items.
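
As a rough illustration of what "coding" means in practice, here is a minimal sketch that tags open-ended responses with themes so they can be tallied; the keyword lists and responses are hypothetical, and a human coder would review every assignment:

```python
# Minimal sketch of "coding" open-ended responses into themes so they
# can be counted quantitatively (hypothetical keywords and responses).
from collections import Counter

THEME_KEYWORDS = {
    "lighting": ["lighting", "dark", "lamp"],
    "reporting_process": ["report", "title ix", "complaint"],
    "counseling": ["counselor", "counseling", "therapy"],
}

def code_response(text: str) -> list[str]:
    """Assign every matching theme; a human coder reviews the output."""
    text = text.lower()
    return [theme for theme, words in THEME_KEYWORDS.items()
            if any(w in text for w in words)]

responses = [
    "The parking lot is too dark at night",
    "I didn't know how to file a Title IX report",
]
tally = Counter(theme for r in responses for theme in code_response(r))
print(tally)  # Counter({'lighting': 1, 'reporting_process': 1})
```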

Publish your findings!

Those are the 7 basic steps to creating a successful campus climate survey! While these steps have been stripped down to their most basic forms, there is still a lot of complexity behind them. Remember, this guide is not a substitute for reading the full guide issued by the government, but a supplement and a springboard. Use these basic steps to start your planning, and refer to the full guide for more detailed instruction.

 

Your Progress Bar May be Broken


Question - Should e-learning courses have progress bars?

If you answered Yes, then you are not rebelling against the norm: online courses are expected to have a progress bar; it is pretty much a requirement. Everyone has one, and we look for it when we are presented with a multi-page survey or an e-learning course. We want to know where we are.

If you answered No, then we would love to hear your rationale, because this was our position as well since our launch two years ago. Progress bars can be distracting. If I am at 0%, and after 10 minutes of learning and clicking I am at 3%, the bar announces the embarrassingly minuscule progress I have made and the long, arduous road that lies ahead. So is it better to show nothing at all rather than show a participant that they still have 97% left?

If you answered It Depends, then your answer is a bit more correct than Yes or No… read on and find out why.

>> You have read 20% of this article

Why obsess over progress bars?

Since our livelihood at Get Inclusive depends on delivering engaging educational content through online courses, we have to take every aspect of the learner's experience very seriously. A progress bar (or the lack thereof) is an important component of that experience. In this post, I want to share what we found, how we thought about it, and how we ended up putting this research into practice.

>>>> You have read 30% of this article

So what is the consensus on progress bars research?

It turns out (to our surprise) that very smart people (with PhDs) have spent a lot of hours researching and publishing on the effectiveness of progress bars. During our search for the ultimate truth on this seemingly simple topic, we came across a 2013 meta-analysis that nicely combines all the other research into a neatly packaged consensus. These researchers (from City University London, Google UK, and Gallup) find that:

… effect of progress indicators appear to be mixed, where some studies found that progress indicators reduced drop-offs, some concluded that they increased drop-offs, and yet others found that they had no effect on drop-off rates.

Drop-offs are bad. A drop-off is when the participant either stops working on the survey or stops paying attention. For all intents and purposes, a drop-off means you have lost the participant. They will do whatever is needed to get to the end if it is a requirement, but given the option, they would rather not participate.

>>>> You have read 45% of this article

Relevance of this study to Get Inclusive

While this research focused on experiments observing participant behavior in multi-page surveys, the relevance to online e-learning courses should be readily apparent. In multi-page surveys, as in online e-learning modules, participants are presented with content (statements, questions, videos, cheesy photos, etc.) and are asked to respond to questions. e-Learning modules may put more emphasis on content, but the mechanics of content delivery and participant interaction remain very much the same. Please note that at Get Inclusive we have a 0% cheese policy (we are highly allergic to it), so we obsess over the content as much as over the dilemma of the progress bar.

What really matters in these long interactive content delivery methods (multi-page surveys or e-learning courses) is the drop-off rate. A drop-off happens when a participant closes the browser window, or just tunes out and starts clicking the "next" button as if being chased by killer whales, or physically drops off their chair out of sheer boredom. Drop-offs are bad. They have to be measured and managed. We measure drop-off in terms of the level of engagement on any given page; counting the number of participants who didn't finish is not enough. Drop-off is not a cliff, it's a slow slide, and as an e-learning training provider or a multi-page surveyor, you should be able to pinpoint where the slow slide into drop-off land began.
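
As a sketch of what pinpointing that slide can look like, assuming you log the last page each participant reached (the event data below is made up):

```python
# Minimal sketch: locating where the "slow slide" into drop-off begins,
# given hypothetical (participant_id, last_page_reached) records.
from collections import Counter

last_pages = [("u1", 12), ("u2", 3), ("u3", 12), ("u4", 4), ("u5", 12)]
TOTAL_PAGES = 12

exits = Counter(page for _, page in last_pages)
reached = len(last_pages)
for page in range(1, TOTAL_PAGES + 1):
    print(f"page {page:>2}: {reached / len(last_pages):.0%} still engaged")
    reached -= exits.get(page, 0)  # these participants exited on this page
# A sharp dip in the retention column flags the page where people tune out.
```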

>>>> you have read 50% of this article

Not all Progress Bars are Created Equal…

What we really loved about this research is that they took the (ahem, cheese warning) progress on progress bar research to a whole new level. They did this by setting up 32 experiments and showing the participants three types of progress bars (yes, we were in heaven when we found out that there are different types of progress bars), sketched in code after the list:

  1. Linear progress bar – constant speed, e.g., on a 10-page survey the progress bar jumps 10% each time you go from one page to the next.
  2. Slow-to-fast progress bar – starts out slower than your actual progress but speeds up as you near the end, e.g., it may show 5% increments in the beginning but larger gains towards the end.
  3. Fast-to-slow progress bar – as you can guess, this one starts out with bigger leaps in the beginning as you turn the pages but slows down towards the end.
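
Here is a minimal sketch of the three profiles, mapping true progress to displayed progress; the specific curve shapes are our illustrative assumptions, not the ones used in the study:

```python
# Minimal sketch of the three progress-bar profiles, mapping true
# progress p in [0, 1] to displayed progress (shapes are illustrative).
import math

def linear(p: float) -> float:
    return p                      # constant speed

def slow_to_fast(p: float) -> float:
    return p ** 2                 # lags early, catches up near the end

def fast_to_slow(p: float) -> float:
    return math.sqrt(p)           # leaps early, crawls near the end

for page in range(1, 11):         # a 10-page survey
    p = page / 10
    print(f"page {page:>2}: linear {linear(p):.0%}  "
          f"slow-to-fast {slow_to_fast(p):.0%}  "
          f"fast-to-slow {fast_to_slow(p):.0%}")
```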

And the winner is…

According to this research, the fast-to-slow progress bar is the clear winner. A constant-speed (linear) progress bar increases drop-off. Slow-to-fast progress bars also increase drop-off rates, as it is plainly discouraging to be rewarded "less" than your actual mathematical progress. So why are fast-to-slow progress bars so effective? Before we go there, let us consider the ethical implications of using a progress bar that does not reflect the true mathematical reality of where a person is.

And what about the ethics of this magic trick?

Having a "fast-to-slow" progress bar does seem like a manipulative mind-trick designed to dupe learners into believing that they are further ahead than they actually are. So before we considered whether and how to implement one, we had to be comfortable with the ethics behind it. In this complex intersection of mathematics, learning, and philosophy, we turned to our favorite Chinese philosopher (no, not Confucius, actually his predecessor Laozi), who said: "A journey of a thousand miles begins with a single step". Let us look at the steps a learner takes when they participate in one of our courses:

  1. Participant opens an invitation email sent by their employer/college administrator
  2. Participant clicks on an enrollment link in this email to arrive at the course registration page
  3. Participant enters email/password to register
  4. Participant starts the course and is presented with the content (content pages that take a total of 20-50 minutes to complete)
  5. Participant clicks “done” at the end of the course and receives a PDF certificate

The participant who takes the first step (i.e., clicking the enrollment link) is infinitely more likely to complete the course than one who ignored the email. Should that not count for anything? We believe it should. But then there is another drop-off point: what if they come to the registration page, see this as a less important or less urgent task than other things, and never return? Again, this is another very important step that, in a traditional sense, is not mathematically linked to the progress bar at all. Why would we present the participant with a big 0% when they have taken these important steps?

No credit for the biggest obstacle to learning? e-Learning participants navigate the biggest obstacle at the beginning of the course, one that doesn't exist towards the end: assessing the cognitive load of the task, i.e., how much mental effort this thing is going to take. Two activities requiring identical time commitments can have very different cognitive loads, e.g., multiplying pairs of 4-digit numbers for 10 minutes vs. reading your favorite newspaper. Participants are trying to assess whether the cognitive load is worth the positive (or negative) incentives that come with participating in the course. They make this tradeoff as quickly as they can, in the first few interactions, then confirm the assessment in the next few and either continue or put the course in a "return some other day" bucket.

The very low (less than 10%) completion rate of online courses at sites such as Coursera raises questions about how to measure progress. The completion rate increases to 25% for students who answered at least one question correctly on the very first quiz (Inside Higher Ed). Beginnings matter: progress in the first few pages contributes far more towards achieving the goal than progress in the last few.

In the case of courses at Get Inclusive, we ask participants to engage in real thinking and self-reflection and to share their thoughts. Participants aren't putting on a video and taking a nap; they have to put mental energy into it. We are asking them to get past the cognitive miser part of their brains. Getting through the first 10 pages is far more challenging than getting through the last few. Progress in these first interactions has more value because this is where the participant is committing to the process. Why wouldn't we reward them with faster progress during these first few pages?

Beyond lazy estimates

I hope you are seeing our thought process here. Mathematical truth is critically important, yet it is only a partial truth. A progress bar needs to be more than a lazy estimate of "current page divided by total pages". It needs to consider and acknowledge all the obstacles the participant has overcome to get to where they are, and how this progress relates to reaching the "goal". Now we are in the land of heuristics-based progress bars and are comfortable walking you through how we implemented one in our courses.

Progress Bars at Get Inclusive

Everyone Starts at 8%

Our customers (employers, college administrators) send out an email introducing us, letting their participants know that we are providing an online course, and communicating any other requirements (deadline, incentives, etc.). All participants who put in the effort of opening this email and clicking the link in it to register for the course are acknowledged with 8% progress. Several factors were taken into consideration to come up with this number, including drop-off/engagement rates and, obviously, the high degree of importance of these initial steps.

Acknowledging Cognitive Load

Participants are then presented with course content. We know that they will be assessing cognitive load and deciding whether they should participate in the course now or come back "tomorrow". For this main part of the learning process, we show a distraction-free progress bar at the top which visually (not numerically) communicates where the participant is. In the image below, you are looking at the first page of the course; the 8% starting progress carries over visually.

Visual vs. Numeric Progress Bar

We opted for a visual progress bar (as opposed to a numeric one) on the actual content pages. We believe it matters a lot less whether the progress is 45% or 55%; either way, it is "half-way". And this rounding is perfectly communicated via a green bar that fills from left to right as the participant progresses through the course pages. On the dashboard, however, both visual and numerical versions are displayed.

Fast-to-slow Progress Bar

After having progress-bar-free courses since our founding, we had sufficient reason to evolve our thinking and dip our toes into the murky lands of progress bars. We experimented with various profiles and ended up with the one shown below. A curve-fit formula helped us implement it in code.

For a participant going through the course, we first calculate the course page progress (the x-axis), which is based on the lazy math (current page divided by total pages). As noted earlier, this does not acknowledge the cognitive load assessment the participant is making and the important "commitment" steps they are taking during the first few pages of the course. So we acknowledge those by adjusting the first quarter of the course upwards. We also consider the content of the course pages in this adjustment: content presented earlier is foundational, and progress there has a relatively higher impact on subsequent modules. We expect this to be the case for most instructional courses.
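
Putting the pieces together, here is a minimal sketch of such a heuristic progress bar. Our actual curve-fit formula is not published here, so the square-root shape below (and the way it folds in the 8% registration credit) is an illustrative assumption:

```python
# Minimal sketch of a fast-to-slow progress heuristic. The 8% registration
# floor is real; the square-root curve is an illustrative stand-in for the
# curve-fit formula we actually use.
import math

REGISTRATION_CREDIT = 0.08   # clicking the enrollment link earns 8%

def displayed_progress(current_page: int, total_pages: int) -> float:
    """Map 'lazy math' page progress onto a fast-to-slow display curve."""
    p = current_page / total_pages          # the lazy, linear estimate
    curved = math.sqrt(p)                   # bigger increments early on
    return REGISTRATION_CREDIT + (1 - REGISTRATION_CREDIT) * curved

for page in (0, 3, 6, 12, 24):
    print(f"page {page:>2}/24 -> shown {displayed_progress(page, 24):.0%}")
# page 0 -> 8%, page 6 (a quarter in) -> ~54%, page 24 -> 100%
```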

>>>>> You have read 95% of this article

What about the "slow" part of the fast-to-slow progress bar?

What goes up must come down, and a progress bar that moves in larger increments at the beginning must slow down towards the end. This raises an important question: isn't this demotivating towards the end? While it may be, there is strong evidence that we humans naturally experience higher motivation towards completing a task when we can visualize (literally see) the goal or the finish line. For e-learning courses, being near the goal is represented by a progress bar that is nearly full (or visually more full than empty). Evidence of this extra motivation closer to the goal has been the subject of several research efforts (e.g., Cheema and Bagchi 2011). So, all in all, the fast-to-slow progress bar takes progress increments away from the second half, where extra motivational juice is naturally present, and acknowledges the challenges participants have to overcome during the first half with higher progress increments.

What’s next

We launched the progress bar about a month ago. We will be evaluating the engagement data we collect to see how it compares with our days of not having a progress bar at all. We will not be evaluating the efficacy of our chosen fast-to-slow profile vs. a linear one, as we are satisfied with what the research has concluded. What we are curious about is improving the accuracy with which we attribute progress in the initial steps because…

A journey of a thousand miles begins with a single step

Laozi (c. 604 BC – c. 531 BC)

You have reached 100% of this post. Did you observe that you were presented with a "You have read x%" message during the beginning sections of this post? Did you notice that it went missing? Was having it helpful? From a purely word-count perspective, you were at 34% when we showed 50% completion. We would love to hear your thoughts on this.

Not My Athletic Teams! Six Rape Myths and Reality

Athletes tend to show higher acceptance of rape-supportive statements than a control group. This summary looks at key findings from the research paper "Understanding Community-Specific Rape Myths: Exploring Student Athlete Culture" by Sarah McMahon.