Surpass

Designing a Flexible Graph & Chart Authoring Tool for Surpass Assessment Platform

Project Overview: Modernising Authoring Tools for Digital Assessment

When browser support for Flash officially ended, the Surpass platform lost a critical part of its assessment toolkit: the ability for awarding bodies to author and deliver graph- and chart-based questions. These interactive visual tools were widely used in maths and science exams, yet no suitable replacement existed within the platform.

The goal was to design a fully accessible, HTML-based tool that supported both the item authoring process and the candidate test experience, enabling users to create and interact with graph and chart question types directly inside Surpass. The project had to move quickly, prioritising the most critical functionality while laying the groundwork for extensibility.

Introduction

Executive Summary: Balancing Immediate Needs with Long-Term Vision

From the outset, we recognised the importance of balancing two types of priorities:

  1. Immediate client needs — restoring lost functionality to avoid disruption to live exams
  2. Strategic growth — exploring new, richer interaction types that could attract new clients and modernise Surpass’s competitive offering

By scoping the MVP to core functionality — while allowing room for future features — we could deliver value quickly and keep the platform scalable and flexible.

Research and Workshops

Discovery & Workshops: Mapping What the Tool Needed to Do

We began with a series of cross-functional workshops designed to understand how graphing and charting questions were used — and where the biggest challenges were. These sessions included participants from:

  1. UX/UI design
  2. Product management
  3. Development
  4. Support teams
  5. Account management
  6. QA and technical architecture

Mapping What the Tool Needed to Do

Affinity mapping workshop to understand the complete feature set of the question item type.

Through affinity mapping and dot voting, we gathered insights into how the new tool would integrate across the entire Surpass ecosystem. We explored which product areas it would touch, what data it would need to manage, how questions would be marked, and where accessibility challenges might appear.

This collaborative process helped us identify both the critical functionality for launch and the future-facing opportunities for feature development.

Service Mapping: Zooming Out to See the Whole Journey

To complement the feature-level research, we ran a service mapping session that looked at the end-to-end journey of both authors and candidates. We identified key service touchpoints — where people interact with the product, how support gets triggered, and where different teams are involved.

Service Mapping

Mapping how the question item is used across the business.

This helped us uncover two important truths:

  1. Stage 2 of the authoring process is where item authors spend the most time — this needed to be smooth, predictable, and fast.
  2. Stage 4 of the candidate journey (test delivery) was where accessibility and usability mattered most — particularly under exam conditions.

This dual focus ensured we weren’t just designing for function — we were designing for real user contexts.

Technology Review: Evaluating Third-Party Integration vs. Building In-House

We ran a parallel technology review to evaluate existing charting and graphing libraries. Each team member contributed ideas for APIs or tools that could handle the complexity of interactive visual content.

Third-Party Integration vs. Building In-House

Understanding the technology required to develop the feature.

We evaluated options based on:

  1. Feature sets (e.g. curved lines, labels, zooming, shapes)
  2. Cost and licensing implications
  3. Integration complexity
  4. Accessibility support (e.g. screen reader compatibility, keyboard navigation)
  5. Roadmaps and long-term support

This analysis helped guide the dev team’s architecture decisions, and ultimately led to prototyping with a third-party library (Dresden Chav API), which allowed us to reduce build time while maintaining flexibility.
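
To keep that flexibility, a common pattern is to isolate the vendor library behind a thin adapter so that authoring and delivery code never call it directly. The TypeScript sketch below illustrates the idea; the interface and the commented vendor calls are assumptions for illustration, not the actual Dresden Chav API or the Surpass codebase.

```typescript
// Thin adapter isolating the third-party charting library from the rest
// of the tool. Everything vendor-specific stays inside one class, so
// swapping libraries later means writing a new adapter rather than
// rewriting authoring or delivery code.
interface ChartRenderer {
  render(container: HTMLElement): void;
  addPoint(x: number, y: number): void;
  destroy(): void;
}

class ThirdPartyChartRenderer implements ChartRenderer {
  // Holds the vendor chart instance once created.
  private chart: unknown = null;

  render(container: HTMLElement): void {
    // Hypothetical vendor call, e.g.:
    // this.chart = vendor.createChart(container, { interactive: true });
  }

  addPoint(x: number, y: number): void {
    // e.g. vendor.plot(this.chart, { x, y });
  }

  destroy(): void {
    // e.g. vendor.dispose(this.chart);
  }
}
```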

Authoring Experience: Designing a Clear and Confident Creation Flow

With research and tech validation in place, we turned to the UX and UI of the authoring experience. Our goal was to make graph authoring intuitive — even for users who weren’t especially tech-savvy.

Process mapping

Working through how a user might use the product

We explored different models (freeform vs. guided) and ultimately landed on a wizard-style authoring flow. This approach mirrored the mental model authors already had: working step-by-step to build a question. Key tasks included:

  1. Selecting a graph or chart type
  2. Adding points, lines, shapes, or bars
  3. Editing titles, axes, and labels
  4. Setting correct answers and marking weights
  5. Previewing and saving the question

We reused components from the existing Surpass design system to maintain familiarity. We also layered in instructional hints, tooltips, and inline previews to reduce friction and improve confidence at each step.
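
To make the flow concrete, a question built through these steps might serialise to a structure like the one below. This is a hypothetical sketch of the model the wizard implies, not the actual Surpass question schema.

```typescript
// Hypothetical question model implied by the wizard steps; field names
// and types are assumptions, not the real Surpass schema.
type GraphElement =
  | { kind: "point"; x: number; y: number }
  | { kind: "line"; from: [number, number]; to: [number, number]; curved?: boolean }
  | { kind: "shape"; vertices: [number, number][] };

interface GraphQuestion {
  graphType: "cartesian" | "coordinate" | "curve" | "freeform";
  title: string;
  axes: { xLabel: string; yLabel: string };
  elements: GraphElement[];      // what the author places on the canvas
  correctAnswer: GraphElement[]; // elements the candidate must reproduce
  markWeights: number[];         // marks awarded per answer element, by index
}
```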

Initial design and wireframes

Prototyping and UI Feedback: Collaborative Sketching & Dot Voting

Once the initial UX flows were mapped out, each team member created a sketch of how the graphing tool could live within the existing Surpass interface. We presented, discussed, and voted on the strongest ideas from each concept.

User Stories

UI design workshop

The final prototype integrated:

  1. A side panel for input and configuration
  2. A live preview of the graph or chart
  3. Reusable UI elements from other question types
  4. Accessible controls for labels, shapes, and lines
  5. Clear success/error states for feedback and validation

This hands-on prototyping phase gave the dev team a clear direction and reduced the ambiguity that often slows handoff between design and build.

Authoring Experience: Designing for Control, Speed, and Modularity

The team then added more detail, capturing all the functionality required for the authoring process.

We then mapped a detailed user journey of how an author would put together a question. In a workshop, we listed all the tasks an item author might need to perform, then grouped them in the order we expected authors to carry them out.

Authoring process

How the authoring part of the feature might work

Some of the key points from this session:

  1. Helped give a clearer guide for the UI and UX.
  2. Captured all functionality needed for the tool.
  3. Confirmed the tool would be embedded into both item authoring and the front-end online testing viewer.
  4. Split the graphing tool and the charting tool into separate, independent question item types.

First phase design

Graph Authoring UI/UX: Creating a Confident, Guided Authoring Flow

The graph authoring tool was the most complex piece of this project — requiring a flexible, dynamic UI that still felt simple for authors to use. We designed the experience to support step-by-step creation while maintaining a live preview of the question at every stage.

To do this, we adopted a wizard-style interface that reflected the way authors already approached question creation. This broke the process into manageable chunks, guiding the user through:

  1. Selecting a graph type (e.g. Cartesian, coordinate, curved line, freeform shape)
  2. Adding points, lines, and shapes to the graph
  3. Customising titles, labels, and axes
  4. Setting correct answers and marking weights
  5. Previewing the finished question in real time

We made sure the UI stayed consistent with the Surpass design system by reusing existing components like side panels, toggles, and tooltips — reducing learning curves and streamlining development.

To support both new and experienced users, we included:

  1. Contextual tooltips and hints
  2. Smart defaults (e.g. auto-labelling axes)
  3. Inline validation and warnings to prevent authoring errors
  4. A clear save and preview flow so authors could test before publishing

This approach gave authors full control, while reducing cognitive load and errors — especially for those unfamiliar with technical graphing interfaces.
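
As an illustration of how smart defaults and inline validation could work together, here is a minimal TypeScript sketch; the function names, rules, and messages are assumptions rather than the shipped implementation.

```typescript
// Sketch of smart defaults and inline validation for the authoring flow.
interface AxisConfig {
  label?: string;
  min: number;
  max: number;
}

// Smart default: auto-label an axis the author has left unnamed.
function withDefaultLabel(axis: AxisConfig, fallback: string): AxisConfig {
  return { ...axis, label: axis.label?.trim() || fallback };
}

// Inline validation: surface warnings before the author saves,
// preventing unanswerable or unmarkable questions.
function validateQuestion(question: {
  axes: AxisConfig[];
  correctAnswer: unknown[];
}): string[] {
  const warnings: string[] = [];
  if (question.correctAnswer.length === 0) {
    warnings.push("No correct answer is set, so the question cannot be marked.");
  }
  for (const axis of question.axes) {
    if (axis.min >= axis.max) {
      warnings.push(`Axis range ${axis.min} to ${axis.max} is invalid.`);
    }
  }
  return warnings;
}
```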

Tool Settings

Allows item authors to select which tools the candidate can see.

Graph settings

Graph settings configured by the item author, and what the candidate can see.

Point settings

Item author adding points to a graph.

Curve settings

Item author adding a curved line to a graph.

Straight line settings

Item author adding a straight line to a graph.

Shape settings

Item author adding a shape to the graph.

Candidate Experience

Graph Candidate UI/UX: Supporting Interaction and Accessibility Under Exam Conditions

The candidate experience for graph-based questions needed to be powerful but frictionless — allowing students to interact with graph elements confidently during time-pressured assessments.

We mirrored much of the authoring functionality, while locking down system-level controls to preserve test integrity. Candidates could:

  1. Add points, lines, shapes, and labels
  2. Zoom and pan across the graph canvas
  3. Modify labels or axis titles (where allowed by question design)
  4. Use a drag-and-drop interface to position objects precisely (see the coordinate-mapping sketch after this list)
  5. View a real-time visual of their response
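
Precise drag-and-drop placement depends on mapping pointer positions into graph coordinates, usually with snapping so candidates hit exact grid values under time pressure. The sketch below shows one way this could work; the names and the snap rule are illustrative assumptions, not the shipped code.

```typescript
// Convert a pointer event into graph coordinates so a dragged point
// lands where the candidate expects.
interface Viewport {
  xMin: number; xMax: number;   // visible graph range on the x axis
  yMin: number; yMax: number;   // visible graph range on the y axis
}

function toGraphCoords(
  event: PointerEvent,
  canvas: HTMLElement,
  view: Viewport,
  snapStep = 0.5,               // snap to a half-unit grid (assumed)
): { x: number; y: number } {
  const rect = canvas.getBoundingClientRect();
  // Normalise the pixel position to 0..1 within the canvas.
  const nx = (event.clientX - rect.left) / rect.width;
  const ny = (event.clientY - rect.top) / rect.height;
  // Map into graph space; y is inverted because screen y grows downward.
  const x = view.xMin + nx * (view.xMax - view.xMin);
  const y = view.yMax - ny * (view.yMax - view.yMin);
  // Snap so candidates hit exact grid intersections.
  return {
    x: Math.round(x / snapStep) * snapStep,
    y: Math.round(y / snapStep) * snapStep,
  };
}
```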

Graph settings

The candidate is able to adjust graph settings to answer a question.

Plotting points

The candidate can add a point to a graph.

Plotting curves

The candidate can add a curved line to a graph.

Plotting straight lines

The candidate can add a straight line to a graph.

Plotting shapes

The candidate can add a shape to a graph.

Warning modal

Warning modal if the candidate resets the graph information.

Since the tool was embedded directly into the Surpass test player, performance and accessibility were key.

We designed the interface to work fluidly with:

  1. Keyboard-only navigation
  2. Tab order optimisation for quick access between input fields
  3. Screen reader compatibility to support visually impaired learners
  4. Patterned visuals (e.g. dashed vs. solid lines, filled shapes) to reduce reliance on colour alone
  5. Light UI feedback that confirmed actions (e.g. “Point added”, “Label saved”) without distracting from the test

This careful balance of capability and clarity gave candidates a seamless, equitable experience across device types and abilities.
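
As one example of what this keyboard support can look like at the code level, the sketch below makes a plotted SVG point focusable, lets the arrow keys nudge it, and confirms the change through a live region. The ARIA wiring and names are assumptions for illustration, not the shipped markup.

```typescript
// Make a plotted point reachable and movable by keyboard, with a
// screen-reader announcement confirming each action.
function makePointAccessible(
  el: SVGCircleElement,
  live: HTMLElement,            // aria-live region for announcements
  step = 1,
): void {
  el.setAttribute("tabindex", "0");          // reachable via Tab
  el.setAttribute("role", "img");
  el.setAttribute("aria-label", "Plotted point");
  el.addEventListener("keydown", (e: KeyboardEvent) => {
    const x = Number(el.getAttribute("cx"));
    const y = Number(el.getAttribute("cy"));
    if (e.key === "ArrowLeft") el.setAttribute("cx", String(x - step));
    if (e.key === "ArrowRight") el.setAttribute("cx", String(x + step));
    if (e.key === "ArrowUp") el.setAttribute("cy", String(y - step));
    if (e.key === "ArrowDown") el.setAttribute("cy", String(y + step));
    if (e.key.startsWith("Arrow")) {
      e.preventDefault();       // keep the page from scrolling mid-answer
      el.setAttribute(
        "aria-label",
        `Point at ${el.getAttribute("cx")}, ${el.getAttribute("cy")}`,
      );
      live.textContent = "Point moved";  // light, non-distracting feedback
    }
  });
}
```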

Graphing tool prototype

We put together a prototype for the developers showing how the tool could work, integrating the third-party Dresden Chav API. It allowed us to show product owners, account managers, and developers how the tools would behave, and this kind of early, tangible demonstration sped up development and reduced ambiguity. We were also able to share the demo with clients to build confidence in the product.

Authoring

Bar Chart Authoring UI/UX: Rapid Configuration with Built-In Accessibility

To reduce risk and ship value faster, we made the strategic decision to separate bar chart authoring from graph authoring. This allowed us to build a more focused interface that shared patterns from the graph tool, while simplifying logic under the hood.

The bar chart tool was designed to let authors:

  1. Add up to 10 distinct bars per question
  2. Assign custom heights and values
  3. Reorder bars via drag-and-drop
  4. Add labels and legends with guidance
  5. Choose between horizontal and vertical orientation
  6. Configure axes and gridlines using existing UI panels

We reused core components from the graphing UI — sidebars, wizards, tooltips — but adapted them for a more data-entry-focused workflow.

Accessibility was considered from the start, including:

  1. Pattern-based fills for each bar to support colourblind users
  2. Accessible colour combinations and text contrasts
  3. Clear feedback states for authors when limits were exceeded (e.g. too many bars)

By creating a leaner UI for bar charts, we were able to reduce dev time, simplify QA, and release functionality that was immediately usable for high-volume clients.
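
For illustration, the configuration behind such a tool might look like the following, including the ten-bar limit and the author-facing feedback when it is exceeded. The field names and the validation helper are hypothetical, not the actual Surpass schema.

```typescript
// Hypothetical bar chart configuration reflecting the constraints above.
const MAX_BARS = 10;

interface BarConfig {
  label: string;
  value: number;        // bar height
  pattern: string;      // pattern fill id, so bars never rely on colour alone
  editable: boolean;    // whether the candidate may change this bar
}

interface BarChartQuestion {
  orientation: "horizontal" | "vertical";
  bars: BarConfig[];
}

// Clear author-facing feedback when the bar limit is exceeded.
function canAddBar(chart: BarChartQuestion): { ok: boolean; message?: string } {
  if (chart.bars.length >= MAX_BARS) {
    return { ok: false, message: `A question supports at most ${MAX_BARS} bars.` };
  }
  return { ok: true };
}
```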

Chart tools selection

Item authors can select what is shown on the charting tool.

Chart settings

Configuration allows authors to set the axes and divisions on the chart, add labels and titles, and show bar chart values and labels.

Bar Chart configuration

Configuration of the bar chart, including chart height, label, colour, order, and whether it is editable.

Chart Key configuration

Configuration of the chart key, including bar label and colour, and whether it is editable.

Bar chart on a graph

Item author view showing all bars plotted on a graph, with values, labels, and axis labels.

Candidate experience

Bar Chart Candidate UI/UX: Simplified Input, Strong Feedback

For candidates, the bar chart experience needed to be easy to understand, easy to interact with, and completely accessible — especially on timed assessments.

Building on the work done for the graph UI, the bar chart interface allowed candidates to:

  1. Adjust bar heights and positions directly
  2. Add or edit labels for both axes
  3. Add a key/legend if required by the question
  4. Choose between horizontal or vertical orientation (when allowed)
  5. Tab between values or fields for faster keyboard use

Accessibility support included:

  1. Tab navigation between bars and axes
  2. Use of patterns instead of colour to differentiate bars (see the pattern-fill sketch after this list)
  3. Fully responsive layout that worked across screen sizes and input devices
  4. Streamlined UI with minimal distractions — helping candidates stay focused and confident during the exam
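
As a sketch of how pattern-based differentiation can be implemented, the snippet below generates SVG hatch patterns that bars reference instead of relying on colour. The pattern ids, sizes, and stroke styles are assumptions, not the production styling.

```typescript
// Generate an SVG hatch pattern that bars can reference via
// fill="url(#id)", keeping bars distinguishable without colour.
const SVG_NS = "http://www.w3.org/2000/svg";

function createHatchPattern(defs: SVGDefsElement, id: string, angle: number): void {
  const pattern = document.createElementNS(SVG_NS, "pattern");
  pattern.setAttribute("id", id);
  pattern.setAttribute("width", "6");
  pattern.setAttribute("height", "6");
  pattern.setAttribute("patternUnits", "userSpaceOnUse");
  pattern.setAttribute("patternTransform", `rotate(${angle})`);

  const line = document.createElementNS(SVG_NS, "line");
  line.setAttribute("x1", "0");
  line.setAttribute("y1", "0");
  line.setAttribute("x2", "0");
  line.setAttribute("y2", "6");
  line.setAttribute("stroke", "currentColor"); // inherits accessible text colour
  line.setAttribute("stroke-width", "2");

  pattern.appendChild(line);
  defs.appendChild(pattern);
}

// Usage: give each bar a differently angled hatch, e.g. 0, 45, 90 degrees,
// then set bar.setAttribute("fill", "url(#hatch-45)").
```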

Bar chart graph

Candidate experience showing a bar chart on a graph.

Bar chart & key

Candidate with accessible bars and chart key.

Key configuration

Candidate can edit the key to answer a question.

Bar chart label configuration

Candidate can add chart values and bar names to answer a question.

Vertical bar chart configuration

Vertical bar chart; the candidate can add chart values and bar names.

Accessible bar chart

Accessible version of the bar chart.

Learnings

Outcomes & Learnings: What Worked and What We’d Do Again

Reflecting on the process, there were several things that made this project particularly effective. By taking a collaborative, user-first approach early on — and combining it with structured prototyping and quick decision-making — we were able to move fast without losing quality. These elements helped reduce ambiguity, improve alignment, and deliver something useful and usable within tight constraints.

  1. Running early workshops helped define scope quickly and surface risks
  2. Working closely with technical leads guided our tool selection and development path
  3. Involving support, account managers, and QA early meant fewer surprises later
  4. Reusing Surpass design system components saved time and improved consistency
  5. Sharing a working prototype helped secure internal and client buy-in

No project is without its challenges — and this one offered plenty of opportunities to adapt and improve. These lessons are things I’d carry into future work, especially when building tools that need to balance technical complexity with user simplicity.

  1. Accessibility needs to be designed from day one — not as a final check
  2. Small, scoped item types are faster to develop, test, and improve over time
  3. Real-world exam conditions (like time pressure or device variation) should shape every design decision
  4. Showing — not just telling — your ideas accelerates alignment and trust across teams

Acknowledgements

Big thanks to Kat Owens for her expert facilitation of the early workshops, Richard London for guidance throughout the project, and the wider team — Hannah, Jamie, Chris, Ruth, and Kailas — for their insight into how these item types would be used, supported, and scaled across the platform.

Copyright Neil Dickinson 2025