Katherine Heffernan | 15 Jun 2023

Job evaluation remains a valued tool for ensuring equal pay and designing grading structures

Job evaluation (JE) is the process of systematically and equitably assessing and defining relativities between different roles within an organisation. It can be used by employers to provide a solid foundation on which robust pay and grading frameworks can be built, putting them in a better position to ensure equal pay. The outcome of a JE process may give employees a clearer view of where their role sits in relation to others, making it easier for them to plot their career paths within an organisation – with potential knock-on benefits for employee engagement and retention. Such exercises can also inform workforce and succession planning activities.

Job evaluation used to be the preserve of specialists, but following a series of high-profile equal pay cases in the 1990s and early 2000s, and the subsequent enactment of legislation on gender pay gap reporting, it has become more widely used and understood by HR and reward practitioners. The IDR survey of job evaluation, carried out in February and March this year, explored how and why employers use such tools in practice. The survey received responses from 45 organisations; the findings below are based on data provided by the 40 employers that use job evaluation. Of the remaining five, two anticipate implementing such a scheme in the future.

Types of job evaluation scheme used

The most common schemes used are analytical (or ‘points factor’) schemes and job classification schemes. Under analytical or ‘points factor’ approaches, used by 63% of organisations surveyed, common factors of jobs (such as knowledge or the requirement to communicate with others) are each assigned a points score, and the total of all the factor scores indicates where the job sits relative to others in the organisation. A further fifth of organisations use a job classification approach, whereby jobs are compared with a selection of representative ‘benchmark’ jobs to assist in deciding their grade. In the majority (70%) of cases, the same scheme is used to cover all roles.
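To make the points-factor arithmetic concrete, here is a minimal sketch in Python. The factor names, level counts and points values are illustrative assumptions only and do not reflect any proprietary scheme.

    # Minimal points-factor sketch: each factor is assessed at a level, each
    # level maps to a points value, and the factor points are summed to give
    # a total job score. All factor names and points values are illustrative.
    FACTOR_POINTS = {
        "knowledge":       [20, 40, 60, 80, 100],   # points for levels 1-5
        "communication":   [10, 20, 30, 40, 50],
        "problem_solving": [15, 30, 45, 60, 75],
        "accountability":  [25, 50, 75, 100, 125],
    }

    def score_job(levels: dict[str, int]) -> int:
        """Sum the points for the level assigned to each factor (levels are 1-based)."""
        return sum(FACTOR_POINTS[factor][level - 1] for factor, level in levels.items())

    # A role assessed at level 3 for knowledge and level 2 for the other factors:
    print(score_job({"knowledge": 3, "communication": 2,
                     "problem_solving": 2, "accountability": 2}))  # 60 + 20 + 30 + 50 = 160

The resulting totals only have meaning relative to one another: a job scoring 280 under the same scheme would simply sit above one scoring 160.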

In most cases (68% of respondents), employers use an ‘off-the-shelf’ proprietary job evaluation scheme, with no further customisation to the organisation’s particular requirements. Meanwhile, 15% have designed bespoke schemes in partnership with external consultants and a further two respondents (5%) have designed their schemes wholly in-house. Five respondents (13%) use hybrid job evaluation schemes that involve limited tailoring of off-the-shelf products (in one case, to assess financial and management responsibilities more fully). The most popular proprietary job evaluation schemes in our sample are Willis Towers Watson and Hay/Korn Ferry, in that order (42% and 33% respectively of respondents with off-the-shelf schemes).

Reasons for using job evaluation

Respondents use job evaluation schemes for various reasons (see chart), but those most commonly cited are benchmarking (85%); ensuring equal pay for jobs of equal value or conducting equal pay reviews (73%); informing new (58%) or revised (33%) pay and grading structures; assigning newly-created jobs to an existing pay and grading structure (55%); and creating job descriptions and role profiles (33%). While the process of job evaluation might offer scope for highlighting development needs, no respondent currently uses their scheme for such a purpose. However, 28% have used it to facilitate career development (by showing where jobs sit and thereby the scope for progression or lateral moves) and 13% have used JE to inform the development of competency frameworks, which in turn may be used to guide progression and career development.

Scheme design

Within our sample, job evaluation schemes most commonly contain seven factors, although 17% of respondents assess jobs against as many as nine or more factors. 

The most common aspects of work considered by the factors are problem-solving and accountability (both of which feature in 93% of schemes); communication and knowledge (90% in both cases); business impact (87%); and leadership (83%). By contrast, the emotional demands of roles feature in just two schemes (those of a manufacturer and a housing association).

Respondents were asked if all job evaluation components included in their scheme carried the same weight. Over two-fifths of respondents (41%) said all factors had equal weighting. Factors often carrying a larger weight in job evaluations include the number of direct reports and financial responsibility. In most (86%) cases, organisations use the same factors across all roles and departments.
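Where factors carry unequal weights, the raw factor scores are scaled before being summed. The short Python fragment below illustrates this; the specific weights and factor names are assumptions made for the sake of the example.

    # Weighted factor scoring sketch: raw factor scores are multiplied by
    # (assumed) weights before summing, so factors such as financial
    # responsibility or the number of direct reports count for more in the total.
    WEIGHTS = {
        "knowledge": 1.0,
        "communication": 1.0,
        "financial_responsibility": 1.5,   # assumed heavier weight
        "direct_reports": 1.5,             # assumed heavier weight
    }

    def weighted_score(raw_scores: dict[str, float]) -> float:
        """Apply each factor's weight and sum to a single weighted job score."""
        return sum(WEIGHTS[factor] * score for factor, score in raw_scores.items())

    print(weighted_score({"knowledge": 60, "communication": 30,
                          "financial_responsibility": 40, "direct_reports": 20}))
    # 60*1.0 + 30*1.0 + 40*1.5 + 20*1.5 = 180.0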

Scheme administration

Around two-thirds (65%) of respondents manage scheme administration and analysis wholly electronically, eg by means of software and/or online tools. A minority (15%) operate paper-based evaluation schemes, while the remaining 20% use a combination of these two approaches. Where respondents use job evaluation software (or some other online tool), it is relatively unusual for this to link to other HR programs/management systems, with just two organisations indicating that this was the case.

Managing a job evaluation exercise

On the whole, when organisations conduct a job evaluation exercise they include all jobs rather than a sample (79% of respondents). The most common means of gathering evidence about jobs is to study role profiles or job descriptions (93% of respondents); however, 8% also conduct interviews with jobholders and the same proportion ask staff to complete questionnaires about their roles. Just under two-thirds (65%) of organisations enlist line managers to validate the information on each job, while 48% additionally or instead use a project team or job evaluation panel (usually consisting primarily of reward or HR team members, sometimes with additional support from line managers) to do so.

Trade unions, where present (half the sample), generally appear to have a limited role in job evaluation. Of the 11 respondents that provided information about trade unions’ involvement in the process, just three actively consult with them on the scheme or when new roles are evaluated.

Working with the results of job evaluation

Of the 26 respondents that have used job evaluation to inform new and/or existing pay and grading structures, around a quarter (23%) look at how scores cluster around certain points and use these to inform grade boundaries. More commonly (half of these organisations), the number of grades is established in advance, with evaluated jobs then fitted into these accordingly. 
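Where the number of grades is fixed in advance, evaluated jobs are typically slotted into whichever points band their total score falls within. The sketch below shows one way this could work; the grade boundary values are purely illustrative assumptions.

    # Fitting evaluated jobs into a predefined grade structure: grade
    # boundaries (in total points) are set in advance and each job is placed
    # in the band containing its total score. Boundary values are assumed.
    import bisect

    GRADE_BOUNDARIES = [150, 250, 350, 450]    # upper limits of grades 1-4
    GRADE_LABELS = ["1", "2", "3", "4", "5"]   # grade 5 covers scores above 450

    def assign_grade(total_points: int) -> str:
        """Return the grade whose points band contains the job's total score."""
        return GRADE_LABELS[bisect.bisect_left(GRADE_BOUNDARIES, total_points)]

    print(assign_grade(160))  # grade "2": above 150 and at or below 250
    print(assign_grade(280))  # grade "3"

The alternative approach described above, deriving boundaries from clusters of scores, amounts to choosing the boundary values by looking at where evaluated scores naturally group.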

We asked respondents how they handle situations where roles end up being assigned a higher or lower value, affecting grading and therefore salary. Around a quarter (23%) of organisations reported that this had not arisen. A further 35% have ‘red-circled’ pay/grading or offered a period of pay protection for employees who were on a considerably higher salary than other employees doing equivalent work. Where respondents indicated how long this ‘red-circling’ applies for, the majority reported that the employee’s salary remains frozen until market pay catches up. At one organisation, the salary freeze remains in place until the employee leaves the company entirely.

The proportion of respondents that have addressed these circumstances by moving roles to a higher rate of pay was much lower, at 10%. Meanwhile, around a fifth (18%) of organisations reported that they had resized jobs as a result of job evaluation, while 13% helped staff to move to new roles on grades commensurate with their previous salary.

Only around a fifth (22%) of respondents operate a formal appeal process for job evaluation outcomes. In practice, the proportion of employees who have appealed decisions regarding the evaluation of their role appears to be low, with most organisations reporting that fewer than 1% of employees have disputed evaluation decisions. The success rate of appeals varied significantly among respondents, from 1% to 80%.

Ongoing use of job evaluation

Having conducted an initial job evaluation exercise, only 5% of respondents report that they have gone on to re-evaluate all roles (or all benchmark roles, where not every job is covered by the scheme). However, almost all (88%) have evaluated newly-created roles, while 13% have evaluated jobs at the instigation of employees.

About the survey

The IDR survey of job evaluation was conducted among private-sector employers in February and March 2023. It attracted 45 responses altogether, of which 36% came from the manufacturing and primary sector and the remainder from private services. The survey covered 698,759 employees altogether, with a median workforce size of 3,600.

We offer bespoke research and consultancy services on all areas of pay and reward

Our consultants are highly experienced at undertaking and delivering bespoke research covering a wide range of reward areas including:

  • Pay and benefits benchmarking
  • Total reward benchmarking
  • Job evaluation
  • Employment cost assessments
  • Reward policy and practice
  • Labour market trends

By choosing IDR you can guarantee independently sourced data and practical, easy-to-interpret results. Contact us to discuss your options.