USER CENTRED DESIGN SERVICES

LIVE

USER TESTING, EXPERT INSIGHTS
AND CONTINUOUS IMPROVEMENT

BETA/LIVE PHASE - SUMMATIVE RESEARCH

METHODS OF CAPTURING USER FEEDBACK

Unlike the exploratory/formative research methods used in discovery/alpha, these methods are summative
– review/testing focuses on success factors, seeking satisfaction/approval from users and stakeholders

SHOW AND TELL (AGILE)

A regular meeting to demonstrate what has been produced during the sprint and obtain feedback/approval from stakeholders and users – typically used to show software that has been built, but it can be used to show any product stage.

USABILITY TESTING

A method of observing users as they attempt to complete tasks on the product whilst thinking aloud. The aim is to reveal areas of confusion and uncover opportunities to improve the overall user experience.

MODEL OFFICE

A simulated office where users who work together (even across different sites) carry out tasks using the product. It surfaces insights into human interaction – communication and processes – and, in turn, how the product could support these.

REMOTE TESTING

Allows global access to users in their natural environment. Users carry out tasks via voice/screen-sharing software whilst thinking aloud. Enables real-time recording of users' screen interactions and their chosen journeys.

By first testing any deployment privately with small representative groups, success is far more likely when the product is deployed to the full audience

EXPERT INSIGHTS

With the product (or most of its features) built, in-house or external experts can start to analyse and improve it

A/B (SPLIT)

Users are directed to version A or B of a product, which differ by a small number of elements (video, buttons, etc.). Based on a user goal, e.g. 'sign up to newsletter', traffic-tracking data is analysed to determine which version is most effective in achieving the goal.
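The routing step described above is often implemented by hashing a stable user identifier, so each visitor is consistently shown the same version across visits. A minimal sketch in Python – the function and experiment names are hypothetical, not part of any specific tool:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    # Hash the experiment name plus user id so assignment is stable:
    # the same user always lands in the same bucket for this experiment.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: decide which version of the sign-up page a visitor sees.
version = assign_variant("user-42", "newsletter-signup")
print(version)
```

Tracking then records, per variant, how many users completed the goal, and the conversion rates are compared.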

COMPETITOR ANALYSIS

Based on around 5-10 user goals, this assesses how the product measures up against categorised direct and indirect competitors. It considers elements such as design, content, features, user reviews, load time, customer service, shipping, social media and pricing.

MULTI-VARIATE

Uses the same core mechanism as A/B, but compares a larger number of elements and reveals more about how they interact with one another. Users are directed to a number of versions – tracking then measures the effectiveness of each design combination in achieving the goal.
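The number of versions in a multivariate test grows multiplicatively with the elements compared, which is why it needs more traffic than a simple A/B test. A small illustrative sketch – the element names are hypothetical:

```python
from itertools import product

# Three page elements, each with two variations under test.
headlines = ["Save time", "Do more"]
buttons = ["Sign up", "Get started"]
images = ["photo", "illustration"]

# Every combination of variations becomes one version to track.
combinations = list(product(headlines, buttons, images))
print(len(combinations))  # 2 x 2 x 2 = 8 versions
```

Each of the eight combinations must receive enough traffic for its goal-completion rate to be measured reliably.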

EXPERT REVIEW

When budgets/timescales don't allow for user research, an experienced expert walks through the product in the shoes of a user type, highlights any problems – design, usability (based on heuristics), accessibility, etc. – and recommends changes via a report or backlog.

Not every product is new when research is first employed – the product may already exist but seek improvement
– in this case summative methods may form the start of the research lifecycle

CONTINUOUS IMPROVEMENT

Unlike the traditional waterfall approach, the product is not put live and left in maintenance mode – with the user-centred and agile lifecycle comes continuous improvement: the ongoing effort to improve the product or service with quality as the prime focus. Depending on the development model, improvements are ‘incremental’ over time or ‘breakthrough’ all at once.
With the holistic service design and management approach employed, continuous improvement continuously evaluates business strategy, objectives and results, processes, policy and employee/supplier relations in the light of their efficiency, effectiveness and flexibility – with a view to improving the user/customer experience from the inside out.