Using UDS Rankings to accurately define development plans for employees

Mr. Mendez was the head of the group and P&L owner of a large engineering organization. It was a highly matrixed, multi-level organization of talented individuals with varying degrees of tenure at the company. Mr. Mendez wanted to put in place a program where a subset of his high-potential senior managers could be placed on a track toward higher responsibility within the organization. He wanted to identify the good and help them become great.


He started by seeking help from his HR partners, hoping to get most of what he was looking for from the annual HR evaluations. The annual employee evaluation was the single most hated activity within his engineering organization: managers dreaded the six-week period in which they had to complete lengthy evaluations for their direct reports. On taking a deeper look at the data, Mr. Mendez quickly realized that the HR data was unusable for his project. Here are some of his observations:

  • Single data point - While some managers reached out to other team members and peers of the employee, the evaluation for each employee was written by the employee’s manager. This involved summarizing, prioritizing, and potentially the personal bias of the manager writing the evaluation.
  • No access to details - Mr. Mendez wanted details, specifics, and anecdotes. He wanted a fuller picture of each employee so he could put in place the most effective development plan, targeted at the identified areas of improvement. This was not the primary objective of the annual evaluation, which was geared toward making compensation decisions.
  • Lack of Team View - The evaluations were written for individuals. He would have had to spend a considerable amount of time to get the team view he was looking for. And because each evaluation was a single data point, he was hesitant to use the ratings to compare across his team.

Mr. Mendez also had data from an earlier culture assessment and engagement survey, but those surveys compared and benchmarked his company against other companies in the US. That comparative data was too broad and lacked root-cause analysis.


We designed a survey based on the study objectives. For the senior manager pool, soft skills were extremely important, and Mr. Mendez wanted an assessment at a very fine-grained level. In particular, he wanted to emphasize communication as a core competency, considering its importance in his matrixed organization. We used a 10-point rating scale for all focus attributes and asked for detailed feedback on each employee. Given the nature of the organization, it was important to get feedback from direct reports, peers, managers, and across the dotted-line relationships where managers had to work with people from other teams. So a 360 survey was implemented in which all direct reports, supervisors, managers, and peers were invited to complete the evaluations. Mr. Mendez selected the pool of managers to be reviewed, and the system selected the reviewers based on the organizational structure of his team, thereby eliminating selection bias and ensuring uniformity across all employees evaluated.
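The orientation-based reviewer classification described above can be sketched in a few lines. This is a minimal illustration, not Insight Magnet’s actual implementation: the names and the `manager_of` reporting map are invented for the example. Given a reporting tree, each reviewer is classified relative to the subject as Up (in the subject’s management chain), Down (the subject is in the reviewer’s chain), or Sideways (everyone else).

```python
# Sketch: classify reviewers by orientation from a simple org chart.
# All names and the reporting structure are hypothetical.

def chain_up(person, manager_of):
    """Return the set of people above `person` in the reporting chain."""
    chain = set()
    while person in manager_of:
        person = manager_of[person]
        chain.add(person)
    return chain

def orientation(reviewer, subject, manager_of):
    """Classify a reviewer as Up, Down, or Sideways relative to the subject."""
    if reviewer in chain_up(subject, manager_of):
        return "Up"
    if subject in chain_up(reviewer, manager_of):
        return "Down"
    return "Sideways"

# Illustrative org chart: employee -> direct manager
manager_of = {
    "Andrew": "Priya",   # Priya is Andrew's supervisor
    "Priya": "Mendez",
    "Beth": "Andrew",    # Beth reports to Andrew
    "Carl": "Andrew",
    "Dana": "Priya",     # Dana is Andrew's peer
}

print(orientation("Priya", "Andrew", manager_of))  # Up
print(orientation("Beth", "Andrew", manager_of))   # Down
print(orientation("Dana", "Andrew", manager_of))   # Sideways
```

Because the classification is derived mechanically from the org chart, every evaluated employee gets a uniformly constructed reviewer pool.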

95% of the evaluations were completed. Respondents spent less than four minutes completing the numerical rating portion of the survey. The open-ended questions yielded a great deal of detailed feedback, and system metrics showed that respondents spent considerable time thinking about and writing it.


Here is how the rankings came in for effective communication. The table lists the evaluated employees in descending order of average score received. In addition to the individual scores and each employee’s rank within the group, Mr. Mendez could also see statistical measures conveying the range and spread of the ratings.
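A ranking table of this shape is computed directly from the raw responses. The sketch below uses entirely hypothetical names and scores (the real study had more reviewers per employee) to show the average, range, and spread columns:

```python
from statistics import mean, stdev

# Hypothetical raw data: employee -> ratings received on one attribute.
scores = {
    "Priya": [9, 9, 10, 8],
    "Andrew": [10, 9, 7, 6],
    "Dana": [8, 7, 8, 8],
}

# One row per employee: average, range (max - min), and standard deviation,
# sorted in descending order of average score.
table = sorted(
    ((name, mean(r), max(r) - min(r), stdev(r)) for name, r in scores.items()),
    key=lambda row: row[1],
    reverse=True,
)

for rank, (name, avg, rng, sd) in enumerate(table, start=1):
    print(f"{rank}. {name}: avg={avg:.2f} range={rng} sd={sd:.2f}")
```

Note how two employees can share a similar average while the range and standard deviation reveal very different levels of reviewer agreement.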

One of the managers on the fast-track shortlist was Andrew Long. Mr. Mendez was surprised to see him ranked 14th on the list. Andrew’s direct supervisor had the same impression, and they began to doubt the results.


Then they looked at the ranking on the same attribute, but this time with the Up, Down, Sideways (UDS) spread. The first column in this view is the same as seen previously -- rankings calculated from the average of all responses received. The other three columns are based on the orientation of the reviewer. The rankings in the “Down” column include all reviewers below Andrew in the organizational chart -- his direct reports in this case. “Up” includes Andrew’s managers, supervisors, and others above him in the organizational chart. “Sideways” includes Andrew’s peers and other dotted-line relationships outside the group.
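Once each response is tagged with the reviewer’s orientation, the UDS columns are simply per-segment averages alongside the overall one. A minimal sketch with made-up ratings shaped like Andrew’s case (strong Up, weaker Down and Sideways); the numbers are illustrative, not the actual survey data:

```python
from statistics import mean

# Illustrative (score, orientation) pairs for one employee on one attribute.
ratings = [
    (10, "Up"), (9, "Up"),
    (8, "Down"), (7, "Down"), (6, "Down"),
    (8, "Sideways"), (7, "Sideways"), (5, "Sideways"),
]

overall = mean(score for score, _ in ratings)   # the single-column view
by_orientation = {                              # the UDS split
    o: mean(s for s, d in ratings if d == o)
    for o in ("Up", "Down", "Sideways")
}
```

In this toy data the overall average of 7.5 hides an Up average of 9.5 against much lower Down and Sideways averages, which is exactly the pattern the UDS view surfaces.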

This view is very revealing. It shows the difference between how Andrew’s communication is perceived by coworkers above, below, and across from him. He seems to be very effective at communicating with higher-ups, but does not get the same outstanding ratings from his direct reports and peers. Mr. Mendez or Andrew’s manager would have found it very difficult to arrive at this conclusion on their own, considering the matrixed nature of the organization.


Mr. Mendez asked Andrew’s manager to drill down to the atomic data -- the individual evaluations of Andrew. He was able to see the details right away. The histogram on top shows the spread of the 8 ratings Andrew received: his highest rating was 10 and his lowest was 5. That is a wide range, as evidenced by a standard deviation of almost 2.
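The drill-down statistics are straightforward to reproduce. The eight ratings below are hypothetical, chosen only to match the shape described in the text (a high of 10, a low of 5, and a standard deviation near 2):

```python
from collections import Counter
from statistics import stdev

# Hypothetical set of 8 individual ratings on a 10-point scale.
ratings = [10, 10, 9, 8, 7, 7, 6, 5]

histogram = Counter(ratings)   # rating -> count: the data behind the histogram
high, low = max(ratings), min(ratings)
sd = stdev(ratings)            # sample standard deviation, ~1.8 here
```

The same handful of values drives both the histogram view and the summary statistics, so the drill-down never diverges from the rolled-up numbers.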

The obvious next question is who contributed to the low ratings, because that is what will help Andrew develop from good to great. The Sankey chart of the UDS split has the answers. It’s all green in the “Up” direction. Andrew had one rating of 5, and it came from Sideways. The ratings of 5, 6, and 7 that are dragging down Andrew’s overall score all came from “Down” and “Sideways”.

The Sankey view lets you choose your starting point. Previously we chose the numerical scores as the starting point. To see how Andrew was rated in each of the segments, you can also read the chart from right to left. A glance at the chart confirms what led us to this level of detail: the lower ratings in “Sideways” compared to “Up” and “Down”.

Andrew’s development plan included accolades for his effective communication up the chain and highlighted the development area of increasing the effectiveness of communication with peers and direct reports.


Analytics vs. Insights
This is one example of the difference between analytics and insights. Calculating an average score, ranking employees in descending order, and showing a tabular view of the data is analytics. Helping managers uncover the not-so-obvious issues is where Insight Magnet excels. We use all the tools and capabilities at our disposal to help users answer questions. Any assumption can be examined, because all the information is based on the collected data. Rolling up from collected data ensures objectivity and eliminates bias from the inferences.
Up, Down, Sideways views

As this case shows, based on the average rating alone, communication probably would not have made it onto the development plan. With the insights available, there is precise identification of Andrew’s excellent communication with higher-ups and constructive feedback for improving his communication with peers and direct reports. This contributes toward making Andrew’s development plan data-driven, laser-focused, and measurable.

Insight Magnet provides this level of detail and the ability to slice and dice by orientation, and users decide the applicability in their particular cases. The weight given to the UDS views can depend on the level of the employee and the goals of the study. Sideways communication could be emphasized over individual-contributor technical skills when a transition from Director to VP is being reviewed, while communication in the Up direction might be an area of emphasis for new hires in an organization.

Data driven Insights

It is important to have comprehensive and trustworthy data to make any process data-driven. By using the 360 survey, Mr. Mendez collected invaluable data on his talent pool. After initial hesitation, owing to the time and effort other employee surveys had demanded, he discovered that with the right tool and an optimal survey design, reviewers actually spend less time completing the surveys while providing valuable insight that would otherwise be unavailable. Instead of one data point, he had multiple data points on each employee.

The case also highlights the value of using data without manipulation or subjective bias. By using data as entered at the source, the company retains valuable talent-development information that is not tied to any individual. If individuals move on, are transferred, or assume new roles, the insights for talent development stay with the company and remain easily accessible.