Elevating Support: A Journey of Continuous Improvement

In the realm of educational technology, exceptional support is not just an added benefit—it’s a cornerstone of user satisfaction and product success. Several years ago, inspired by a “Report Card” presented by Ray Henderson, then-President at Blackboard, I began to reflect deeply on how we could elevate the support services at Respondus. Henderson’s report was a detailed analysis, grading the company’s performance on product and support, particularly emphasizing metrics like defect resolution, response times, and customer satisfaction.

The Catalyst for Change

At Respondus, we realized that while our products were robust, our support system needed a more data-driven approach to reach its full potential. Initially, our performance measurement was minimal, and it took nearly six months just to gather enough data to set a baseline. This foundational work revealed a key insight: applications designed for instructors, such as Respondus 4, StudyMate Author, and the Test Bank Network, required relatively little support. These tools, although widely licensed across over 1,000 universities and K-12 institutions, served a niche audience of fewer than 100,000 educators.

In contrast, our student-facing products, LockDown Browser and Respondus Monitor, were on a different scale entirely. With millions of installations and over 30 million assessments conducted annually, it became clear that these applications accounted for 94% of our support tickets.

Targeting Support Where It’s Needed Most

With these insights in hand, we directed our focus towards enhancing support for LockDown Browser and Respondus Monitor. We began systematically collecting data from support tickets, including the operating system used, the Learning Management System (LMS) of the institution, problem types, resolutions, and more. This data collection became the foundation of our continuous improvement process.
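The kind of per-ticket aggregation described above can be sketched in a few lines. The field names and values below are illustrative only, not Respondus's actual ticket schema:

```python
from collections import Counter

# Hypothetical support tickets; the fields mirror those mentioned in
# the article (operating system, LMS, problem type).
tickets = [
    {"os": "Windows 10", "lms": "Canvas",     "problem": "network"},
    {"os": "macOS",      "lms": "Blackboard", "problem": "webcam"},
    {"os": "Windows 10", "lms": "Canvas",     "problem": "network"},
    {"os": "Windows 10", "lms": "Moodle",     "problem": "install"},
]

# Tally each field to surface the most common environments and issues,
# which is what a weekly review of this data would look at.
by_problem = Counter(t["problem"] for t in tickets)
by_lms = Counter(t["lms"] for t in tickets)

print(by_problem.most_common(1))  # the single most frequent problem type
print(by_lms.most_common(1))      # the LMS generating the most tickets
```

Even a simple tally like this makes it obvious where support effort and product fixes should be aimed first.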

Weekly analysis of this data led to significant changes. We expanded our support team, extended support hours, revamped our knowledge base, and reengineered the escalation process. More crucially, the data informed our product enhancement strategies, ensuring that we were addressing the most pressing issues identified by our users.

Achieving Measurable Success

The impact of these efforts was immediate and significant. For example, our average response time for support tickets dropped from 14 hours in early 2014 to just 2.5 hours by early 2016, and the median response time was lower still, at around 90 minutes. Similarly, the time required to fully resolve and close a ticket decreased from 7.2 days to just 2.3 days, with the median time being even shorter.
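The gap between the average and median figures is worth noting: a handful of slow outliers pull the mean upward while the median reflects the typical ticket. A quick illustration with made-up numbers (not actual Respondus data):

```python
from statistics import mean, median

# Illustrative response times in hours. One slow outlier (9.0) skews
# the mean well above the median, which is why reporting both gives a
# fuller picture of the support experience.
response_hours = [1.0, 1.5, 1.5, 2.0, 2.0, 9.0]

print(round(mean(response_hours), 1))  # mean, inflated by the outlier
print(median(response_hours))          # typical ticket experience
```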

Perhaps the most rewarding outcome has been the customer satisfaction ratings. On a 5-point scale, the average rating from August 2015 to May 2016 was an outstanding 4.8, reflecting the positive impact of our enhanced support processes.

Product Enhancements Driven by Data

Parallel to these support improvements, we used the data gathered to drive product enhancements. A pivotal change was the introduction of a new browser engine for the Windows version of LockDown Browser. In mid-2015, we transitioned from Internet Explorer to Google’s Chromium engine, a move that not only resolved numerous issues beyond our control—such as corrupted installations of IE and malware infections—but also provided us with direct access to the source code for faster issue resolution. This change led to a substantial reduction in support tickets, even as the usage of LockDown Browser and Respondus Monitor surged.

We are currently preparing a similar update for the Mac version of LockDown Browser, which will transition from Safari WebKit to Chromium, further enhancing stability and performance.

Engaging with Students for Continuous Improvement

Interestingly, as the number of support tickets decreased, we noticed a growing disconnect with our end-users: students. In educational technology, student feedback often gets filtered through IT staff or educators, which can delay or obscure the identification of issues. To address this, we implemented several student-focused features.

We introduced a “Help Center” button directly within the LockDown Browser toolbar. This feature allows students to run a comprehensive system check on their computer, network environment, and LMS, presenting the results in a user-friendly format that facilitates self-resolution of common issues, such as poor Internet connectivity.
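A minimal sketch of one piece of such a system check, basic network reachability, might look like the following. This is an assumption about the general approach, not the Help Center's actual implementation, which also covers the webcam, the LMS server, and more:

```python
import socket

def check_connectivity(host="8.8.8.8", port=53, timeout=3.0):
    """Rudimentary network check: can we open a TCP connection?

    This only tests basic reachability; a full diagnostic would also
    probe the institution's LMS, measure bandwidth, and test the webcam.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("online" if check_connectivity() else "offline")
```

Surfacing results like this in plain language lets students rule out common environmental problems, such as a dropped connection, before a ticket is ever filed.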

Moreover, we simplified the process for students to send this technical data to their institution’s help desk or directly to Respondus. Providing this information at the outset significantly streamlines the troubleshooting process, reducing the time to resolve support tickets.

We also added a student-centric knowledge base within LockDown Browser and Respondus Monitor, empowering students to resolve issues independently while providing us with valuable data on which articles are most frequently accessed.

Looking Ahead: A Commitment to Excellence

While we don’t plan to publish an annual report card on our support services, the feedback we’ve received from our clients has been overwhelmingly positive. Our efforts have not only reduced the burden on institutional help desks but have also improved the overall user experience for LockDown Browser and Respondus Monitor. As institutions continue to see triple-digit growth in the adoption of our products, we are committed to further enhancing the user experience and reducing support issues across the board.
