Bye Bye, Gruntwork!

"Our annual performance review process was universally disliked because of the effort involved in it. In three cycles, we eliminated about 90% of the gruntwork from process in my organization. By relieving team members of the tedium, we can now focus on the meaningful and beneficial aspects of the reviews."

- Based on a customer testimonial.

The background

In our company, we have an HR-mandated process that we call the annual employee evaluations. These reviews must be completed before any compensation changes can be finalized, and managers are tasked with completing them for their direct reports. The process is paperless, runs as part of an enterprise system, and managers have access to any cloud tool imaginable. Yet it was a nightmare for managers, and not surprisingly there was wide variability in compliance with the suggested guidelines and in on-time completion.

Having gone through the process myself as a participant, a manager, and a group leader, I have endured the pain, faced the wrath, and grudgingly persuaded my managers to toe the line. The good managers sought out feedback from team members who interacted with the employee, reviewed the responses from those who cared to reply, and added their own feedback to the mix. The feedback was then summarized, edited, and entered into the system. Even well-intentioned managers admitted that they were faced with a choice between completing the reviews on time and getting the reviews right. The process took up too much of their time and would invariably get relegated to the bottom of their priority totem pole.

The only good part about this review is that there is no participant-selection step. Everybody has to be a part of it.

Our first 360 Review

In our team, we often need to run specialized, deep-dive analyses driven by specific needs. One such need a few years ago was to assess the current team as part of a strategic succession-planning initiative. We wanted to generate a reliable profile of team members based on data that was recent, observation-based, free of biases, objective, and atomic. The goal was to be able to answer succession-planning questions reliably.

Using the data from the HR Review was a non-starter, as it contained ratings and narratives that were not amenable to the kind of analyses we wanted to run. This was going to be a deep-dive exercise, but by its very nature it was going to be limited to a subset of individuals. While we didn't have a percentage goal, we certainly did not want to involve 100% of the team. We were hoping to focus on the top quartile or the top two quartiles of performers, along with other exclusion criteria based on tenure, headcount, and exposure.

We wanted to design a 360-degree review in which everyone who interacts with a participant provides their own feedback. We wanted to capture the data in its most atomic form so that it could be summarized, filtered, or aggregated as needed. Fairness, uniformity, and objectivity were key.

Overall, the steps involved in our review process were fairly typical. But given that the participants had to be selected, and given our desire to include the managers and directors throughout the process, we wanted to minimize the time spent selecting participants and assigning reviewers, all while maintaining maximum adherence to the key objectives.

And this is where Insight Magnet really made our job easier.

Selecting participants

Our starting point was the output and extracts from the corporate HR system and the annual HR Review. Then the fun began. In the first meeting, I could clearly see that managers wanted to maximize their team's representation in the pool. Before I could get back to my office, I had a few emails asking to include additional participants who had been excluded by the top-level criteria.

Instead of having managers submit their "lists," then having someone compile them, then having someone review and evaluate them against the criteria, I wanted a top-down approach where all you decide are the criteria, and the criteria then define the selections. I also insisted on tracking exceptions to ensure that they were kept to an absolute minimum.

We were able to do this in a few minutes. The HR extract was easily imported into Insight Magnet, which immediately flags discrepancies in the data set. You are then ready to query the imported user list using conditions built from the fields in the HR extract or the attributes available in Insight Magnet.

Choose participants based on level and job code.
Our HR data extract had multiple levels, some related to job title and some based on salary band. Insight Magnet can construct its own employee org chart, and you can assign levels top-down or bottom-up. You can also combine all the HR data attributes and then define your selection criteria. So we ended up with criteria that used the Insight Magnet org chart and fields from the HR extract. The ease of doing this enabled us to explore multiple scenarios with small tweaks to the criteria.
Filter out those who do not have a headcount.
A few more clicks did this. We could also set ranges on headcounts.
Apply additional conditions based on tenure, role, location, and other factors.
We could leverage all the fields imported from the HR system and use the analytics available in Insight Magnet to arrive at the required participant pool.
Manager review, comments, and finalization.
The managers could see the participants selected from their group and provide comments. We were able to finalize the participant pool in a very short time -- without lists being swapped, spreadsheets flying around, or emails asking for exceptions. It was all done in a top-down, metrics-driven approach that was uniform and objective.
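The criteria-driven selection described above can be sketched in plain Python. The field names, thresholds, and sample extract here are illustrative assumptions, not Insight Magnet's actual schema or API; the point is that the criteria, not individual managers' lists, define the selections.

```python
# Hypothetical sketch of top-down, criteria-driven participant selection.
# Field names and thresholds are illustrative, not Insight Magnet's schema.
import csv
from io import StringIO

HR_EXTRACT = """employee_id,name,level,job_code,headcount,tenure_years
101,Ana,3,ENG-MGR,5,4.0
102,Bob,3,ENG-MGR,0,2.5
103,Cara,2,ENG-DIR,12,7.0
104,Dev,4,ENG-IC,0,1.0
"""

def select_participants(rows, max_level=3, min_headcount=1, min_tenure=3.0):
    """Apply the criteria uniformly: the criteria define the selections."""
    selected = []
    for row in rows:
        if (int(row["level"]) <= max_level
                and int(row["headcount"]) >= min_headcount
                and float(row["tenure_years"]) >= min_tenure):
            selected.append(row["name"])
    return selected

rows = list(csv.DictReader(StringIO(HR_EXTRACT)))
print(select_participants(rows))  # Ana and Cara meet all three criteria
```

Because the pool is a pure function of the criteria, exploring a new scenario is just a tweak to the arguments, and any manual additions stand out as trackable exceptions.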

Assigning Reviewers

Ensuring uniformity and fairness

The reality is that any project will involve fun and not-so-fun parts. So if you can minimize the not-so-fun parts, the whole effort becomes far less of a burden.

Some gruntwork is unavoidable in any project, but getting annual employee reviews done involves a disproportionately high number of tedious tasks affecting everyone involved. We observed this in the early days of developing Insight Magnet. Some of our senior team members had experienced it first hand in their previous lives, either as managers reviewing their direct reports or as employees put through the dreaded process. Driving this component down to the bare minimum was a key design criterion for Insight Magnet. Our customers have managed to eliminate up to 90% of the previously needed gruntwork from their review process after about three iterations.

The steps we refer to here are not frivolous or unnecessary. They are absolutely needed for the success of the project, but the tedium involved and the time wasted make them unpleasant and soul-destroying for the person tasked with completing them. The amount of setup work depends directly on the objectives and the attention to detail. We often hear customers insist on uniformity in the execution of the review process. This can be achieved by eliminating or limiting the subjectivity in the decisions that go into the design of the review.

So let us assume that we are designing a 360-degree review, where participants are to be reviewed by all the other employees they interact with. How would we go about assigning reviewers to the selected participants? The age-old method is the dreaded email: send a message to the managers of the selected participants and ask them to select reviewers for their participants. A minimum or maximum number of reviewers can be suggested, and the lists provided by the managers become the selected reviewers. There are many issues that make a process like this inefficient, time consuming, and, in the end, not very fruitful toward the stated goal.
The person in charge of collecting and collating the reviewers has the thankless job of cracking the whip, the managers selecting the reviewers are free to use their own interpretation of who and how many reviewers each participant needs, and the resulting reviewer pools end up uneven across the organization.
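A rule-based alternative to the email round-trip can be sketched as follows. The interaction log and the minimum-reviewer threshold are hypothetical assumptions for illustration; the idea is that reviewers are derived mechanically from who actually interacted with each participant, and shortfalls surface as tracked exceptions rather than ad-hoc manager judgment.

```python
# Hypothetical sketch of rule-based reviewer assignment for a 360 review.
# The interaction data and threshold are assumptions, not a description
# of Insight Magnet's internals.

# (participant, colleague) pairs observed during the review period
interactions = [
    ("ana", "bob"), ("ana", "cara"), ("ana", "dev"),
    ("bob", "ana"),
]

MIN_REVIEWERS = 2  # below this, flag for follow-up instead of guessing

def assign_reviewers(interactions, min_reviewers=MIN_REVIEWERS):
    """Derive reviewers from interactions; flag shortfalls as exceptions."""
    assignments, flagged = {}, []
    participants = {p for p, _ in interactions}
    for p in sorted(participants):
        reviewers = sorted({c for pp, c in interactions if pp == p})
        assignments[p] = reviewers
        if len(reviewers) < min_reviewers:
            flagged.append(p)  # tracked exception, not a silent gap
    return assignments, flagged

assignments, flagged = assign_reviewers(interactions)
```

The same rule applies to every participant, which is exactly the uniformity the email-driven process fails to deliver.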

We wanted to make our review process objective, uniform, and unbiased. For a true 360-degree review, we wanted to ask for feedback on a participant from all the team members the participant interacted with. The feedback had to be based on first-hand observations or experience during the specified review period (one year, in our case). We also wanted the data to be captured in its atomic state, which meant that every person's input was captured as-is, without any aggregation, summarization, or editing.
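The atomic-capture idea can be sketched as a simple data model. The record fields and the sample entries are hypothetical, not Insight Magnet's format; the point is that each reviewer's input is stored as-is, and any summary is computed later as a view over the raw records.

```python
# Hypothetical sketch of atomic feedback capture: raw per-reviewer records
# are stored unedited, and aggregation is a separate, on-demand step.
from dataclasses import dataclass
from statistics import mean

@dataclass(frozen=True)
class FeedbackItem:
    participant: str
    reviewer: str
    question: str
    rating: int    # raw observation-based rating, never edited
    comment: str   # reviewer's words, captured as-is

items = [
    FeedbackItem("ana", "bob", "collaboration", 4, "Shares context early."),
    FeedbackItem("ana", "cara", "collaboration", 5, "Unblocked my project."),
]

def average_rating(items, participant, question):
    """Aggregation is a view over atomic records, not a replacement for them."""
    scores = [i.rating for i in items
              if i.participant == participant and i.question == question]
    return mean(scores) if scores else None

print(average_rating(items, "ana", "collaboration"))  # 4.5
```

Because nothing is summarized at capture time, the same records can later be filtered by reviewer, question, or period without losing any first-hand detail.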