Faculty Resources

Welcome to the faculty resources page for USF's new Online Teaching Effectiveness Survey (Blue). Here you will find useful information on the new student survey, which is scheduled to replace SUMMA in Spring 2015.

We have assembled links to resources (below) taken from the slides and other materials presented at the two faculty information sessions on October 16 and 22, 2014, including a recording of the October 22nd information session. We hope you find these resources helpful.

Please contact Robert Bromfield (rlbromfield@usfca.edu) or Ed Munnich (emunnich@usfca.edu) should you have any questions about the new teaching effectiveness survey.

Links to Resources

The Online Student Evaluation Implementation Task Force

  • John Bansavich, Director, Center for Instruction and Technology & Adjunct Professor, School of Education
  • Robert Bromfield, Co-Chair; Associate Dean & University Registrar
  • Susanne Hoelscher, Adjunct Professor, College of Arts & Sciences
  • Ed Munnich, Co-Chair; Associate Professor, College of Arts & Sciences
  • Bill Murry, Director of Student Learning Assurance, Office of Academic Affairs
  • Janessa Rozal, Student and Faculty Services and Desktop Publications Coordinator


The new online Student Survey of Teaching Effectiveness (Blue) is scheduled to replace SUMMA in Spring 2015.

The transition to an online survey of teaching effectiveness came about due to widespread dissatisfaction with the SUMMA. In a 2010 survey, 87% of faculty felt SUMMA should be eliminated, and 78% believed an online survey would be a viable solution for getting results back to faculty more quickly.

The new survey was developed to capture USF values. Based on a USF faculty survey and the research literature, it was built around four constructs of teaching effectiveness: Instructional Design, Instructional Practices, Student Engagement, and Student Learning. The survey provides a measure of each of these constructs.

Note: Other evidence of teaching effectiveness includes instructional materials, invited presentations in colleagues' classes, curriculum development, publications on teaching methods, etc. (Article 17.9.6, FT CBA).

Faculty are encouraged to consider the four constructs in planning their classes, beginning in Spring 2015.

Moving to an online teaching effectiveness survey offers several benefits:

  1. Reports will be available 48 hours after the survey closes. (Note: Faculty must submit their grades before gaining access to their reports.)
  2. Reports arrive in time to inform improvements in faculty teaching effectiveness in subsequent courses.
  3. Online response is more natural for today’s students.
  4. Students can provide anonymous comments to faculty on each item.
  5. Students may save their responses mid-completion and return to finish later.
  6. The survey is accessible on smartphones and tablets, and dynamically resizes to fit each device’s screen.
  7. Students may also access the online survey from within Banner Student Self-Service and Canvas.

Students will receive several email communications from the Office of the University Registrar. These emails stress the importance USF places on the teaching effectiveness survey and explain the simple completion process. Students who do not complete the survey after the first prompt will receive up to three additional reminders through the system.

Faculty can help, too:

  • Faculty are encouraged to stress the various ways of accessing the surveys (Banner Self-Service and Canvas).
  • Faculty can encourage their students to complete the online surveys.
  • Faculty may continue to set aside time during class when students can complete the survey. In selecting this method, faculty are asked to remind students to bring their laptops (or device of choice) to class on the scheduled survey day.

Faculty will receive an email indicating the reports are ready to be viewed. Grades for a given course must be submitted before the course’s report can be accessed.

Faculty can also view reports from within Banner Self-Service and Canvas.

Reports can be saved either as an HTML file or as a PDF. Reports may also be printed directly from the web interface.

Although many schools have experienced drops in response rates upon moving to online surveys, there are several steps you can take, in addition to the automatic email reminders students will receive, to maintain or increase current response rates. These include:

  1. Class time: Faculty should set aside class time for survey completion (as with SUMMAs), asking students to bring a smartphone, tablet, or laptop to class on survey day.
  2. Talk about it: Faculty should discuss changes they made with subsequent classes. This addresses a major concern of USF student leaders: “Does anyone read our responses?”
  3. Focus on the four constructs of teaching effectiveness: the quick feedback turnaround should facilitate rapid improvement in teaching effectiveness.

Yes. The faculty of record for each course can log in to Banner Self-Service or Canvas to view response rates while the survey is open. Final response rates will be available through summary reports after the survey period closes and grades are submitted. IMPORTANT NOTE: Faculty will NOT be able to identify the status of any student or whether a particular student has completed the survey.

No. However, the Provost and Faculty Associations have agreed to form an ad hoc committee to review and provide guidance to Peer Review Committees on Blue results.

We spent a lot of time on this question, got faculty input that was generally opposed to a grade hold, and looked at what works at peer schools. There are several problems with withholding grades that led to the decision not to do so:

  1. Given our long grading period, students might be answering a survey about a class they finished several weeks earlier (e.g., Fall classes end early in December and grades are not due until the first week of January).
  2. It is not clear how thoughtful responses are when students are answering just to get their grades.
  3. Withholding grades makes the survey feel punitive, which is at odds with the communication we are trying to promote by having comment boxes and working with the Center for Teaching Excellence to use feedback to promote better teaching.

The experience at peer institutions
Boston College, Loyola Marymount, and Fordham do not hold grades, but have nevertheless reported response rates in the high 80s (percent). Two strategies appear to be driving these response rates:

  1. Faculty discuss how much they value students' responses, and give examples throughout the semester of how they have used past feedback from students.
  2. Faculty set aside class time for students to complete surveys.

Regarding #2, one of the key reasons for adopting Blue as the platform for the Teaching Effectiveness Survey is that the survey can be taken on any web-enabled device, including laptops, smartphones, and tablets. Given this range of options, instructors who have given students class time for the survey have reported that close to 100% of students brought one of these devices on the day the instructor announced.

The average is calculated based on the survey’s 6-point Likert scale, where 1 is Strongly disagree, 2 is Disagree, 3 is Somewhat disagree, 4 is Somewhat agree, 5 is Agree, and 6 is Strongly agree.  

Interpreting the Averages

  • The Response Average is the average of students’ responses, for the semester, for the course listed on the report.
  • The Department (DPT) Average is the average across all the semester courses in the department associated with the course listed on the report.
  • The School (SCH) Average is the average across all the semester courses and departments in the school or college.
  • The Teacher Average is the average across all the semester courses taught by the named instructor.
  • The USF Average is the average across all courses surveyed in the semester.
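As a minimal sketch of how these figures are computed (the function name and sample responses below are hypothetical, for illustration only), each average is a simple mean over the 1–6 Likert responses:

```python
def response_average(responses):
    """Return the mean of 6-point Likert responses (1 = Strongly disagree,
    6 = Strongly agree), rounded to two decimal places."""
    if not responses:
        raise ValueError("no responses to average")
    return round(sum(responses) / len(responses), 2)

# Hypothetical example: ten students' responses to one survey item.
responses = [6, 5, 5, 4, 6, 3, 5, 4, 6, 5]
print(response_average(responses))  # prints 4.9
```

The Department, School, Teacher, and USF averages are computed the same way, just over progressively larger pools of responses.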

If you have trouble logging in to the system, contact the ITS Helpdesk at itshelp@usfca.edu or 415-422-6668. All other questions should be directed to Robert Bromfield (rlbromfield@usfca.edu) or Ed Munnich (emunnich@usfca.edu).