UX Heuristic Evaluation & Improvement Strategy
e-Learning Platform Leer je Erfgoed
Project Brief
  • Client
    ErfgoedAcademie
  • Role
    UX/UI Designer, UX Researcher, Digital Strategist
  • Deliverables
    Heuristic Analysis, Prioritisation Matrix, Annotated Screens, Redesign Mockups, Digital Strategy
  • Scope
    3-week heuristic audit, usability walkthroughs, and UX/UI proposals
Leer je Erfgoed is a Dutch e-learning platform used by volunteers and professionals in the cultural heritage sector. Built on a legacy LMS with custom code, it provides short modules on heritage care, storytelling, and cultural participation.

The platform faces significant usability challenges: inconsistent layout, outdated design, lack of mobile responsiveness, and limited flexibility due to its custom architecture.

I was asked by the ErfgoedAcademie to evaluate the platform, identify critical usability issues, and propose feasible improvements that align with system constraints and the available budget.

Initial Evaluation Hypotheses

Based on an expert review of the current platform, several testable hypotheses were formulated to guide the heuristic evaluation and subsequent user research. These statements indicate potential usability issues that require validation.

Navigation Clarity
H1
The current homepage structure may not sufficiently guide learners toward a clear starting point or intuitive navigation flow.
Readability and Visual Hierarchy
H2
The platform’s typography, spacing, and content density may reduce readability and increase cognitive load, particularly for new or older users.
Feedback and Progress Visibility
H3
The visibility and clarity of progress indicators and quiz feedback may be insufficient for learners to understand their status and correctness during tasks.
Interactivity with Hotspot Elements
H4
The design and technical implementation of hotspot interactions may hinder usability, especially on smaller screens or devices with limited responsiveness.
Accessibility Compliance
H5
The platform may not meet essential accessibility requirements, including sufficient contrast, alternative cues beyond colour, and consistent iconography.
Research Approach & Methods
The evaluation followed a structured but lightweight UX research framework appropriate for a legacy learning platform with limited design flexibility. The goal was to identify clear usability barriers and opportunities for improvement without requiring extensive redevelopment.
Heuristic Analysis

The platform was evaluated using Nielsen’s 10 usability heuristics, combined with basic accessibility and e-learning UX principles. For each key screen, the most critical issues were identified and rated on a severity scale from 1 (minor) to 4 (critical). Below, each screen highlights three prioritised issues, ordered from high to low severity, to illustrate the main usability risks and opportunities for improvement.

1. Dashboard Screen
Key Usability Findings
  • High Severity (4)
    The grid (four modules per row) collapses on smaller screens, making text overlays unreadable and buttons too small to tap. This significantly impacts accessibility and prevents task completion on mobile and tablet devices.
  • Medium Severity (3)
    Calls to action and duration labels rely on hover states and are not immediately visible, reducing clarity and discoverability for first-time users.
  • Low Severity (2)
    Minor inconsistencies in icon usage and hover feedback slightly reduce visual coherence but do not block interaction.
2. Interactive Module Screen
Key Usability Findings
  • High Severity (4)
    Hotspot panels overlap with the main visual content and are constrained by a fixed interaction container. On smaller screens, content risks being partially hidden or cut off, making it difficult to read additional information.
  • Medium Severity (3)
    Different hotspot icons and button styles are used across screens, reducing consistency and recognisability. Users must re-learn interaction patterns instead of relying on visual familiarity.
  • Low Severity (2)
    The relationship between hotspots, supporting text, and page navigation is not clearly communicated. While the interaction works, the lack of visual guidance slightly increases cognitive load.
3. Multiple-Choice Question Screen
Key Usability Findings
  • High Severity (4)
    Correctness feedback is communicated only through colour (e.g. blue for correct, pink for incorrect), without text or icon support. This creates an accessibility issue for colour-blind users and reduces clarity for all users, especially in low-contrast environments.
  • Medium Severity (3)
    Feedback messaging is visually detached from the selected answer. Users must infer why an answer is correct or incorrect, which weakens learning reinforcement and increases uncertainty.
  • Low Severity (2)
    Action buttons such as “Opnieuw” (“Again”) and “Volgende” (“Next”) are functional but visually understated. Their states do not clearly guide users through the learning flow, although navigation remains possible.
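The colour-only feedback finding can be made measurable. As an illustrative sketch, the snippet below implements the WCAG 2.x relative-luminance and contrast-ratio formulas; the hex values used are hypothetical examples, not colours sampled from the actual platform.

```python
def relative_luminance(hex_color: str) -> float:
    """WCAG 2.x relative luminance of an sRGB colour given as '#RRGGBB'."""
    def linearize(channel: int) -> float:
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Hypothetical example: white text on a mid-blue module tile.
ratio = contrast_ratio("#FFFFFF", "#3B6FB5")
print(f"{ratio:.2f}:1, passes AA for body text: {ratio >= 4.5}")
```

A check like this cannot replace icon or text cues for colour-blind users, but it gives developers a concrete pass/fail threshold (4.5:1 for AA body text) when adjusting the quiz palette.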
Survey Results — User Perception & Platform Value

To complement the heuristic evaluation, this case study draws on the results of a user survey conducted in 2025 by the ErfgoedAcademie, with a parallel survey run by the joint platform VBNE (Leer je Groen). The combined response reached over 270 users, consisting primarily of volunteers and a substantial group of heritage professionals.


The survey aimed to understand overall user experience, perceived usefulness, and the platform’s future relevance. It did not cover detailed usability issues or task-based testing, but instead captured users’ general perceptions of learning experience, accessibility, and value.

Key Survey Findings
Overall, user feedback was highly positive.
Respondents describe Leer je Erfgoed as:
  • Accessible and easy to use
  • Clearly structured and intuitive to navigate
  • Well-balanced in its use of text, visuals, and assignments
  • Suitable for both volunteers and professionals seeking introductory knowledge
Engagement and loyalty indicators were particularly strong. A large majority of respondents indicated they would recommend the platform to others and expressed a desire to remain connected in the future.

At the same time, the survey reveals an important tension: while the experience is positively rated, many users, especially professionals, express a need for greater depth, clearer learning paths, and more specialised content. There is also a recurring desire for improved mobile and tablet usability, more frequent updates, and clearer progression through modules.

These findings suggest that the platform’s foundation is trusted and well-received, while its long-term growth depends on refinement, scalability, and more targeted learning journeys.
Key Survey Metrics Highlights
  • 90%
    Would Recommend
    A strong majority of users indicate they would recommend Leer je Erfgoed to others, reflecting high trust and perceived value.
  • 100%
    Usability Satisfaction
    Respondents rate the platform’s usability positively, describing it as clear, accessible, and easy to navigate.
  • 65%
    Need for Mobile / Tablet UI Improvement
    Many respondents mention the mobile and tablet experience as a key area for future improvement, particularly for sustained use.
Interpreting the Survey Results
Because the survey focuses on overall satisfaction and perceived ease of use, it does not surface more structural or interaction-level usability issues.

Specifically, it does not capture:
  • Mobile-specific usability or responsiveness issues
  • Accessibility compliance (e.g. contrast, tap targets, keyboard navigation)
  • Cognitive load and content density
  • Interaction clarity in complex components such as hotspots or quizzes
This distinction is essential. Positive usability ratings indicate trust and appreciation, but do not automatically rule out usability or accessibility risks, particularly in legacy systems that have not been systematically reviewed against modern UX, accessibility, or responsive design standards.
Prioritisation & Strategic Direction
How improvement priorities were defined
Prioritisation was guided by a combination of usability severity, technical feasibility, and the available budget and delivery timeline.

Rather than pursuing a full redesign, the strategy focused on high-impact improvements that could realistically be implemented within the existing platform architecture.
Particular emphasis was placed on issues affecting accessibility, task completion, and clarity of the learning flow, especially on smaller screens.

Improvements with high user impact and low to moderate implementation effort were prioritised, particularly where changes could be applied consistently across multiple screens. Issues with lower severity or requiring substantial structural refactoring were documented as future opportunities, rather than addressed within the current scope.

The impact–effort matrix illustrates how usability issues were prioritised to maximise user benefit within technical and budget constraints.
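The prioritisation logic behind the matrix can be sketched in a few lines. The issues and scores below are hypothetical examples mirroring the findings above, not the project’s actual backlog:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    name: str
    impact: int  # user impact, 1 (minor) to 4 (critical severity)
    effort: int  # implementation effort, 1 (trivial) to 4 (structural refactor)

def quadrant(issue: Issue, threshold: int = 2) -> str:
    """Classify an issue into one of the four impact-effort quadrants."""
    high_impact = issue.impact > threshold
    high_effort = issue.effort > threshold
    if high_impact and not high_effort:
        return "quick win"          # implement first
    if high_impact and high_effort:
        return "major project"      # plan deliberately
    if not high_impact and not high_effort:
        return "fill-in"            # do when convenient
    return "future opportunity"     # document and defer

# Hypothetical backlog entries, scored roughly like the audit findings.
backlog = [
    Issue("Colour-only quiz feedback", impact=4, effort=1),
    Issue("Responsive dashboard grid", impact=4, effort=3),
    Issue("Inconsistent hotspot icons", impact=3, effort=2),
    Issue("Understated action buttons", impact=2, effort=1),
]
for issue in sorted(backlog, key=lambda i: (-i.impact, i.effort)):
    print(f"{issue.name}: {quadrant(issue)}")
```

In practice the scores came from the severity ratings and developer effort estimates rather than a script, but the same quadrant logic determined what was proposed now versus documented as a future opportunity.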
Design Proposals & Implementation Direction

Based on the prioritisation outcomes, a set of before/after mockups with annotations was created to translate usability findings into concrete, implementation-oriented suggestions. These mockups are exploratory rather than final and are intended to support discussion with developers by clarifying layout, hierarchy, and interaction improvements within the existing platform constraints.


Annotations highlight what should change and why, indicate which elements are intended as global updates, and help estimate technical effort without prescribing specific technical solutions.


The proposed improvements focus on clarity, accessibility, and consistency, prioritising changes that can be applied across multiple screens and deliver the highest user impact within a limited timeframe.

Conclusion & Next Steps

The evaluation showed that while incremental usability improvements are possible within the current platform, fundamental limitations of the legacy codebase make deeper redevelopment costly and difficult to scale. For this reason, the strategic focus shifted toward rebuilding on a new, AI-powered e-learning platform with greater flexibility and long-term potential.


These insights now serve as design and UX requirements for the new platform, helping ensure that future development is grounded in real user needs rather than assumptions. In this way, the evaluation becomes a strategic bridge—connecting a trusted but constrained legacy system to a more flexible, scalable learning environment designed to support long-term user growth and evolving educational goals.
