Employee experience is rapidly becoming one of the key topics on the CHRO agenda. Yet many of the conversations that I hear miss a critical factor: that creating valuable employee experiences is a systematic and data-driven process.
When I left a senior HR role in 2009 to build a business ‘to help make HR an empirically-driven function’, one of the key areas of information we started with was experience data. In the diagram above, taken from one of our earliest presentation decks, the components at the bottom right are all ways of measuring experience.
Our earliest proposition said that OrganizationView focused on three things:
- measurement & meaning – collecting data and making sense of it through analytics
- employee-centric design – as we said, ‘use a scientific approach to ensure technologies and services are closely aligned to users’ needs and behaviours’
- develop and deliver – moving analysis into production
Why such a focus on experience in 2009? Well, my background in the early noughties centred on systematically and deeply understanding user experience. Much of this was in the area of candidate experience. You can see some of it in a 2004 article by David Bowen in the Financial Times – subscription needed – which came after a long conversation we had about candidate needs, drawing on my research at the time. It’s about building career sites and recruitment systems around optimising the candidate experience.
As an aside, when I joined UBS in 2005 to launch their first global careers site, at the first meeting of the project team, while we were discussing governance, I added one rule: “if we can’t decide what to do we’ll test it with users in an experience lab.” We tested lots (UBS had two user-research labs and we also ran tests in London) and the bank came (joint) top of the FT web ratings in the careers section that year. We cut our marketing budget that year by over-investing in research.
Some of this philosophy came from working in a couple of firms where my close peers were working on projects with IDEO. We took this view, and many of the techniques, into recruitment, making it candidate-centric and based on experience and relationships. The key, though, was that the process was heavily research-centric. Experience design is highly aligned with empirical decision-making. It is systematic and based on data. A central theme is to actively and constantly listen to and understand the experiences of your stakeholders.
IDEO, in their 51 method cards, separate their ‘measurement’ approaches into four categories – Learn, Look, Ask and Try. All of them are ways of understanding how the user experiences a product or service, or the part of their life where the offering will fit. Some are very qualitative, some more quantitative. I believe all qualitative data can be made quantitative if you capture enough examples. Also, the first thing you do with qualitative data is to add meta-data, which makes it quantitative. In the end, data is just information.
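The point about meta-data is worth making concrete. A minimal sketch of the idea, using entirely hypothetical comments and a deliberately naive keyword-to-theme mapping (in practice this tagging step would be done by trained coders or a text-analytics model):

```python
from collections import Counter

# Hypothetical free-text survey comments (qualitative data)
comments = [
    "The onboarding process was confusing and slow",
    "My manager gave me great support during onboarding",
    "Slow IT setup meant I could not work for a week",
]

# Illustrative keyword-to-theme mapping (the meta-data layer)
themes = {
    "onboarding": "Onboarding",
    "manager": "Management",
    "support": "Management",
    "it": "IT / Tools",
    "slow": "Speed",
}

def tag(comment):
    """Attach theme meta-data to a qualitative comment."""
    words = comment.lower().split()
    return sorted({theme for keyword, theme in themes.items() if keyword in words})

tagged = [(c, tag(c)) for c in comments]

# Once each comment carries meta-data, the corpus is countable --
# i.e. the qualitative data has become quantitative
counts = Counter(t for _, theme_list in tagged for t in theme_list)
for theme, n in counts.most_common():
    print(theme, n)
```

The transformation is the important part: each free-text comment gains structured labels, and with enough examples the label frequencies become data you can analyse like any other metric.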
From Candidate Experience to Employee Experience
The roots of Employee Experience lie in Candidate Experience. From 2002 I smashed my head against the proverbial wall for a long time trying to evangelise why it was critical. The Talent Board folks did a much more effective job.
One of the slides we used to show in the early days was the following graphic, in which we compared the importance of experience as a driver of satisfaction in banking and at work. We used internal bank research (not UBS) with some re-cut data from the CEB. It turns out that in each case the components of the offer which could be classified as ‘experience’ account for about 70% of what drives satisfaction, and therefore engagement.
The way an employee thinks about their organization is the sum of their experiences. At different stages in their journey – from consideration, through selection, to employee and alumnus – their perception will change, and how it develops is the sum of their experiences at each stage. I discussed how this links with the EVP in early 2011.
Employee Experience and People Analytics
What we can establish is that experience design is both systematic and data-driven. Yes, it incorporates systems and user experience, but critically it also includes experiences that have nothing to do with systems. Even with systems you need to understand what people do before they come to the system and what they do after using it.
Our vision of People Analytics is that it should drive evidence-based decision-making about the workforce in organisations. We have always felt that this evidence is a mixture of quantitative and qualitative data. We believe that experience measurement is a core element of the role of a People Analytics team.
In the graph above we show that 70% of the drivers of satisfaction are experience-based. Yet in the current state of People Analytics, too many firms use only existing data from their HR systems to develop their models. None of that data is likely to describe experiences. They’re building models that try to squeeze out meaning without any signal about the most important part.
The analysts’ job is not to build accurate models; it is to answer critical questions with data. Given how important a driver experience is, many analyses therefore need to include experience data in their models, and the analyst needs a robust and automated way of capturing that data.
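What integrating experience data into a modelling dataset might look like can be sketched minimally. All record layouts, field names and values below are hypothetical, purely to show the join of system-of-record data with touchpoint measurements:

```python
# Hypothetical HRIS records -- the 'system of record' data many teams stop at
hris = [
    {"employee_id": 1, "tenure_years": 2.5, "left": False},
    {"employee_id": 2, "tenure_years": 0.8, "left": True},
    {"employee_id": 3, "tenure_years": 5.1, "left": False},
]

# Hypothetical experience measurements captured at a critical touchpoint
# (e.g. an onboarding survey at a 'moment that matters')
experience = {
    1: {"onboarding_score": 8},
    2: {"onboarding_score": 3},
    3: {"onboarding_score": 9},
}

# Join the two sources so a downstream model sees both the system data
# and the experience signal; missing surveys become an explicit None
model_rows = [
    {**row, **experience.get(row["employee_id"], {"onboarding_score": None})}
    for row in hris
]

for r in model_rows:
    print(r)
```

The shape of the result is the point: each modelling row now carries an experience field alongside the HRIS fields, rather than the model being fed system data alone.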
At heart, this was the basis on which we decided to build Workometry. Capturing open, reliable experience data at critical touchpoints – what some call ‘moments that matter’ – and doing so in a way that can be integrated into sophisticated models is critical to understanding and managing the employee experience.