Zalando 360 feedback process & tool
Lead UX / Service Designer - 4-month project
Skills & Techniques
Planning and hosting a design sprint workshop
User research interviews
Presenting ideas to senior management
User testing under lab conditions
UX strategy and helping shape future iterations
UX Researcher working with a dedicated research team
The largest fashion ecommerce marketplace in Europe. (If you are from the UK, I hadn't heard of it either before I worked here! But ask your European friends and they will invariably have heard of it.)
Zalando has seen rapid growth: founded in 2008 as a shoe-selling platform, it is now the largest company of its kind in Europe in terms of sales, and employs over 16,000 people.
I was brought in to help rethink and design the second iteration of a 360 feedback tool, which had been released a year earlier by a third party. It was designed to help employees develop within Zalando and build a feedback culture, but the process needed simplifying. There were six different ways to give feedback across a variety of platforms, some compulsory for performance evaluation and others for development feedback.
User interviews Two user researchers and I kicked off the project by interviewing 10 employees from different levels and job functions. There were many insights, including:
Feedback was almost always considered an offline behaviour.
Employees didn't know what feedback was used for their performance review, leading to suspicion and a demand for more transparency.
The process took too much time, pulling employees away from their actual jobs and creating resentment when it had to be done in their own time.
Workshop After conducting the interviews, it became obvious that user behaviour was not conforming to business ideals. For example, employees did not want to give their co-workers feedback through an online tool: this was seen as rude, ambiguous and ineffective as a team-building exercise. There was also a degree of suspicion about who would see that feedback, because previous editions of the tool had been very ambiguous about what was public and what was private. The business, however, strongly felt that all feedback should be recorded online. They believed people would be comfortable giving critical feedback if prompted by the right questions. They also felt that it was part of an employee's job to take part in the feedback process, and struggled to empathise with the time it took away from core responsibilities.
It was clear that we needed to build alignment and understanding between the business and its user base.
Due to time constraints, I proposed a three-day workshop. The participants included all user researchers, data analysts and the HR and product team responsible for the execution of the tool.
On the first day of the workshop, user needs and business objectives were presented by the relevant experts. While they went through their slides, I asked the workshop participants to write down any key points they saw as important. I prompted people to question their biases, examine what they thought the current state was, and voice what they had learnt; this was a good exercise for making everyone feel heard and for creating group discussions. The following day's output was a defined list of "How might we" statements that integrated insights from the previous day's discussions. I then asked the team to create some sketches using the crazy eights technique to help them visualise what they imagined the tool would look like. Most importantly, we began to see more empathy from the business towards their employees, and the business began to feel more involved in the design process. Without this workshop, I believe the tool would not have met user needs and would have been a disaster when it came to stakeholder management.
Designs and Iterations
Redesign process. Designing this tool meant we had to re-evaluate the process behind what was needed to assess an employee's performance. This highly political area meant that we had to review areas such as:
data comparison and privacy
data collection types and timing
login states and visibility according to a user's position
This was done through a series of meetings, workshops and stakeholder interviews. (Due to the sensitive nature of this project, I am unable to fully disclose the process and design changes made.)
IA Once aligned on the processes, I conducted a card-sorting exercise with a variety of employees from different job functions, as the tool had three login states with different access levels: employee, leader and fairness committee member. This exercise started the formation of the site's navigation.
Basic wires Working with a UI designer and developers, the basic formation of the concept grew through many iterations. From the very first sketches, I presented my designs twice a week to the HR team for feedback, and had informal chats and guerrilla testing sessions with the Zalando employees who would be using the tool. I presented my wires mainly in sketch form, as I felt that for this particular set of stakeholders it was important to maintain a flexible stance towards the sometimes inflexible business constraints.
Lab-condition user testing We held ten user-testing sessions under lab conditions to find out whether users understood the concept and were able to navigate through the process.
What performed well in the tool The solution refined six different ways to give feedback into two: Performance and Development. All users understood the concept within the user-testing session, and leaders particularly liked the new design, as it was seen as much simpler to navigate and the process would take much less time. Users understood what would be used for their evaluation feedback and what would be used for general feedback. Users also understood that they were encouraged to have a dialogue with their colleagues both online and offline.
Learnings and future iterations that I set in place as a consultant As I was still working at Zalando while the tool was rolled out, I could see the problem areas. The survey still takes too much time. Employees had the option to decline giving performance feedback if they felt they did not know the requester well enough or if they had too many requests; in practice, employees rarely declined, out of politeness, so the data captured may not be that valuable. Employees also still resented the time it took away from their actual work.
What has also been interesting is that, although in testing people understood what feedback is used for performance and development and remarked that it was fair (they cannot see performance feedback, due to positive bias), there has been negativity since the rollout. This could be for a variety of reasons, but what I would be keen to investigate further is how and when employees were educated and guided in using the tool, and what help they received when unsure. I also proposed an anonymous survey to gauge feeling towards the process across all levels of employees, as a starting point for another set of user interviews. Finally, I would conduct another workshop to investigate whether there are patterns in the perceived issues, or whether a performance tool will simply always attract polar opinions.
Whilst working within the apps team, I acted in a consultant capacity, giving feedback on the next iteration of the tool.
What I personally learnt
Evidence is a great way to get stakeholders on board. Insisting on a full house at user testing really helped people see the pain points and sped up the alignment process.
Keeping stakeholders involved throughout the process, with timely check-ins, ensured that key stakeholders stayed actively engaged and no sudden design changes were needed.
When designing a feedback tool, it is important to get feedback! I have realised that as a freelancer I don't have the benefit of consistent feedback, so seeking it is now part of my process in everything I do, whether that is running a workshop, ending a contract or debriefing after a tense meeting. Feedback is crucial for growth.