52  HR Analytics: Talent, Retention, and Performance

52.1 Why HR Analytics Matters

For most of corporate history HR reported on people the way a librarian reports on books — counts, categories, updates, and nobody’s decision changed.

HR analytics is the discipline of moving the function from counting to deciding — connecting the people line of the P&L to the levers leadership can actually pull. Thomas H. Davenport et al. (2010) argued that talent analytics is not a sub-discipline of HR but a strategic capability, and that the companies competing on it look quite unlike the ones still running headcount reports.

For a BI analyst, HR clusters into three jobs. Talent acquisition and workforce planning answers who do we need, when, where, and at what cost? Retention and engagement answers who is at risk, why are they leaving, and what works to keep them? Performance and productivity answers who is delivering, who is stuck, and is the curve fair? Janet H. Marler & John W. Boudreau (2017), in their evidence-based review, find that HR analytics adds business value mainly when the visualisations are clear enough for line managers — not just HR — to act on.

Tip: The HR-dashboard contract

Three rules separate HR dashboards from every other kind:

  1. Privacy is the first design choice, not the last. Individual-level data must be aggregated, masked, or behind row-level security. A dashboard that shows a single employee’s salary or rating to anyone other than their manager is a compliance breach.
  2. Action over insight. 26 percent attrition in tech in Q3 is an insight; 5 specific managers with 3+ exits in 6 months is an action. The dashboard must navigate to the latter.
  3. Trend beats snapshot. People metrics move slowly. A snapshot rarely tells the story; a 12- or 24-month trend usually does.

52.2 Talent Acquisition and Workforce Planning

Workforce planning is the most under-instrumented part of HR in most organisations. Hiring decisions are usually made one role at a time, by one manager, in a hurry. Analytics shifts the conversation from fill this requisition to do we have a plan for the next 18 months, and is hiring tracking that plan?

Tip: The talent-pipeline funnel

flowchart LR
  A[Sourced<br/>candidates] --> B[Screened]
  B --> C[Phone interview]
  C --> D[Onsite or technical]
  D --> E[Offer extended]
  E --> F[Offer accepted]
  F --> G[Joined]
  style A fill:#E8F0FE,stroke:#1A73E8
  style G fill:#E6F4EA,stroke:#137333

The talent funnel has more steps than the marketing funnel, and each step has a meaningful pass-through rate. A funnel chart per role family — engineering, sales, plant supervision, customer support — shows where the leak is. A 60 percent offer-acceptance rate, for example, usually signals a compensation problem; an 18 percent screen-to-phone-interview rate usually signals a sourcing problem. Without the funnel, the team mistakes the symptom for the cause.
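Outside the BI tool, the pass-through arithmetic behind the funnel chart is easy to prototype. The stage counts below are invented for illustration, not benchmarks:

```python
# Sketch: stage-to-stage pass-through rates for a talent funnel.
# Stage names follow the flowchart above; counts are illustrative.
stages = [
    ("Sourced", 1200),
    ("Screened", 480),
    ("Phone interview", 210),
    ("Onsite or technical", 95),
    ("Offer extended", 40),
    ("Offer accepted", 24),
    ("Joined", 22),
]

def pass_through(funnel):
    """Return (from_stage, to_stage, rate) for each adjacent pair."""
    rates = []
    for (name_a, n_a), (name_b, n_b) in zip(funnel, funnel[1:]):
        rates.append((name_a, name_b, n_b / n_a))
    return rates

# The leak is the step with the lowest conversion rate.
weakest = min(pass_through(stages), key=lambda r: r[2])
print(weakest)  # the sourced -> screened step is the weakest here
```

The same per-stage ratios are what the funnel visual annotates; computing them per role family is a single group-by away.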

Tip: Three Pipeline-Time Metrics

| Metric | Definition | Owned by | Visualisation lens |
| --- | --- | --- | --- |
| Time-to-fill | Days from requisition open to offer accepted. | Recruiting | Histogram with median and 90th percentile. |
| Time-to-hire | Days from candidate first contact to offer accepted. | Recruiting and hiring manager | Box plot per role family. |
| Time-to-productivity | Days from join date to fully productive (per role definition). | Hiring manager and L&D | Cohort line chart of productivity ramp. |

The 90th percentile of time-to-fill is more decision-relevant than the median: it shows the long-tail roles that are blocking projects. A box plot per role family makes the long tail visible immediately.
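The median-versus-90th-percentile contrast can be made concrete with a small sketch; the time-to-fill samples below are invented:

```python
import statistics

# Illustrative time-to-fill values (days) for one role family.
time_to_fill = [22, 25, 28, 30, 31, 34, 35, 38, 41, 44,
                47, 52, 58, 63, 71, 84, 96, 120, 145, 190]

median = statistics.median(time_to_fill)
# statistics.quantiles with n=10 returns the 9 decile cut points;
# the last one is the 90th percentile.
p90 = statistics.quantiles(time_to_fill, n=10)[-1]

print(f"median = {median} days, p90 = {p90:.1f} days")
```

Here the median looks healthy while the 90th percentile is roughly three times larger, which is exactly the long tail the box plot is meant to expose.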

Tip: Workforce-plan vs actual

A workforce plan is a forecast of headcount by function, level, and location, by quarter, for the next 12-18 months. The dashboard view is a stacked area chart with planned headcount as the bands and actual headcount as a heavy overlay line. Where actual lines lag the plan, the recruiting team is late; where actual leads the plan, the function is over-hiring. Both deviations matter to finance.

Warning: Show the cost of the gap

A vacancy is not free; it costs lost productivity, overtime on the rest of the team, and sometimes contract-staff premium. Add a cost-of-vacancy card to the workforce-planning dashboard, computed as (loaded cost per head per day) × (days vacant) × (productivity factor). Translating the gap into rupees is what gets the recruiting budget approved.
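The cost-of-vacancy formula above can be sketched directly; all figures below are illustrative placeholders, not real loaded costs:

```python
# Sketch: (loaded cost per head per day) x (days vacant) x (productivity factor),
# summed over open roles. All numbers are illustrative.

def cost_of_vacancy(loaded_daily_cost, days_vacant, productivity_factor):
    return loaded_daily_cost * days_vacant * productivity_factor

open_roles = [
    # (role, loaded cost per day in rupees, days vacant, productivity factor)
    ("Senior engineer", 12000, 75, 0.8),
    ("Plant supervisor", 7000, 40, 1.0),
]

total = sum(cost_of_vacancy(c, d, p) for _, c, d, p in open_roles)
print(f"Cost of vacancy: {total:,.0f} rupees")
```

The productivity factor discounts roles where the rest of the team absorbs part of the work; setting it per role family is a judgement call to agree with finance.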

52.3 Retention and Engagement: The Attrition Story

Attrition is the most-watched HR number, and the most-misread. Headline attrition — we lost 18 percent last year — hides the only useful question: who left, by what segment, and was it the people we wanted to keep?

Tip: Six Cuts of Attrition, Each Surfacing a Different Decision

| Cut | Why it matters | Visualisation lens |
| --- | --- | --- |
| Voluntary vs involuntary | Different processes, different levers. | Stacked column trended over time. |
| Regretted vs non-regretted | Losing high performers is the actual cost. | Two-line trend with regretted attrition highlighted. |
| By tenure band | Early-tenure attrition is an onboarding signal. | Cohort retention curve. |
| By performance rating | High-performer attrition is the warning light. | Heatmap of rating × exit count. |
| By manager | Manager-level attrition reveals leadership issues. | Manager-level table sorted by attrition rate. |
| By location and function | Localised hot-spots may need targeted retention. | Choropleth or grouped bar. |
Tip: The cohort retention curve

The cohort retention curve plots, for each hiring cohort (say, all hires in Q1 FY24), the percentage still employed at 3, 6, 12, 18, and 24 months. Cohorts stack on the same chart. A typical pattern shows steeper drops at the 6-12 month mark for poorly-onboarded cohorts and a flatter curve for cohorts that benefited from a programme intervention. The chart is the single most useful HR-analytics visual because it converts the headline attrition number into a story about when and which cohort.

flowchart LR
  A[Hire cohort<br/>Q1 FY24] --> B[Track active status<br/>at 3, 6, 12, 18, 24 mo]
  B --> C[Compute<br/>% still active]
  C --> D[Plot per cohort<br/>on shared axis]
  D --> E[Compare cohorts<br/>spot the breakpoint]
  style E fill:#E6F4EA,stroke:#137333
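The computation the flowchart describes can be prototyped outside the BI tool; the cohort rows and dates below are hypothetical:

```python
from datetime import date

def months_between(start, end):
    # Whole-month difference; good enough for checkpoint bucketing.
    return (end.year - start.year) * 12 + (end.month - start.month)

def retention_curve(cohort, as_of, checkpoints=(3, 6, 12, 18, 24)):
    """cohort: list of (join_date, exit_date or None).
    Returns {months: fraction still active} per checkpoint."""
    curve = {}
    for m in checkpoints:
        # Only employees hired at least m months before as_of count at checkpoint m.
        eligible = [(j, x) for j, x in cohort if months_between(j, as_of) >= m]
        if not eligible:
            continue
        active = sum(1 for j, x in eligible
                     if x is None or months_between(j, x) >= m)
        curve[m] = active / len(eligible)
    return curve

# Hypothetical Q1 FY24 cohort.
q1_fy24 = [
    (date(2023, 4, 10), None),
    (date(2023, 4, 24), date(2023, 9, 1)),   # left before 6 months
    (date(2023, 5, 15), date(2024, 2, 1)),   # left before 12 months
    (date(2023, 6, 5), None),
]
curve = retention_curve(q1_fy24, as_of=date(2025, 6, 30))
print(curve)  # {3: 1.0, 6: 0.75, 12: 0.5, 18: 0.5, 24: 0.5}
```

Run per hiring cohort and plotted on a shared axis, these dictionaries become the retention curves the text describes; the eligibility filter is what keeps young cohorts from showing misleading checkpoints.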

Tip: Engagement surveys: scores and drivers

Engagement-survey dashboards are usually three views together. A score-trend line chart shows the headline engagement score by quarter. A driver heatmap shows the constituent factors (role clarity, manager quality, growth opportunity, recognition) by team or function, with cell colour = mean score. A comment cloud or topic chart shows what employees actually wrote in open-ended fields, summarised by NLP topics or keyword tags.

Janet H. Marler & John W. Boudreau (2017) caution that engagement-score dashboards often fail to drive action because line managers see only the headline number and not the actionable drivers. The dashboard design fix is the driver heatmap: it gives the manager a specific thing to work on, not a number to feel anxious about.

Tip: Predictive flight risk

Predictive flight-risk models combine tenure, performance trajectory, recent salary action, manager change, internal mobility history, and engagement-survey responses to score each employee’s probability of leaving in the next 6-12 months. Visualised at the team level (with individual scores hidden by role-based access), the chart is a stacked bar showing low / medium / high flight-risk shares per team. Managers see the count of high-risk people on their team without seeing who; HR business partners see the names with the appropriate access.
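The aggregation step that hides identity while keeping the team view actionable can be sketched as follows; the band thresholds and scores are illustrative, not from any real model:

```python
# Sketch: aggregate individual flight-risk scores into team-level bands
# so managers see counts, never identities. Thresholds are illustrative.
def band(score, low=0.3, high=0.6):
    if score >= high:
        return "high"
    if score >= low:
        return "medium"
    return "low"

def team_bands(scores):
    """scores: {employee_id: risk score}. Returns band -> count, no names."""
    counts = {"low": 0, "medium": 0, "high": 0}
    for s in scores.values():
        counts[band(s)] += 1
    return counts

team = {"E001": 0.12, "E002": 0.45, "E003": 0.71, "E004": 0.66, "E005": 0.28}
print(team_bands(team))  # {'low': 2, 'medium': 1, 'high': 2}
```

In the deployed dashboard the same aggregation happens in a measure, and the identity-bearing drill-through sits behind RLS as described below.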

Warning: Predictive flight risk is the highest-stakes HR-analytics product

A flight-risk model can be misused as a ranking for adverse action — a list to not invest in. The ethical deployment is the opposite — a list to invest in, support, and have a conversation with. Build access controls, audit logs, and manager training before the model goes live; without these, a well-intentioned dashboard can break trust irreversibly.

52.4 Performance and Productivity Analytics

Performance dashboards report whether the rating distribution is healthy, whether calibration is consistent across managers, and whether pay tracks performance — three questions with very different politics around them.

Tip: The rating distribution

A histogram or stacked bar of performance ratings by function, level, or manager surfaces three patterns immediately:

  • Compression — almost everyone in the middle band; nobody differentiated.
  • Inflation — the curve is heavy on the high end.
  • Skew by manager — some managers rate consistently higher or lower than peers, suggesting calibration issues.

Calibration dashboards visualise the third pattern with a small-multiples bar chart, one panel per manager, with a reference line at the function average. Outliers in either direction are flagged for the calibration session.

Tip: Pay-for-performance scatter

The pay-for-performance scatter plots performance rating on the x-axis and total compensation (base + variable) on the y-axis, with each point an employee. A regression line shows the average pay-for-performance slope. Points well above or below the line are anomalies — well-paid low performers, or low-paid high performers — and the dashboard is built to identify both. Both kinds of anomaly destroy trust if left unaddressed.

Tip: Five Performance-Analytics Visuals

| Visual | What it shows |
| --- | --- |
| Rating distribution histogram | Shape of the curve overall and by segment. |
| Calibration small-multiples | Whether managers rate similarly. |
| Pay-for-performance scatter | Whether pay tracks performance, with anomalies. |
| 9-box talent grid | Performance vs potential, four to nine cells. |
| Span-of-control table | How many reports each manager has, flagging spans too wide or too narrow. |

The 9-box places employees into a three-by-three grid: x-axis is performance (low, medium, high), y-axis is potential (low, medium, high). The cells become talent categories — star, contributor, under-performer, enigma. The grid is controversial because it embeds judgement, and the dashboard must surface that judgement transparently. Cell colour shows count; click drills to the names (with role-appropriate access). Trended over time, movement between cells tells the more useful story.
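The cell assignment itself is simple to sketch. The four corner labels come from the text; the corner-to-label mapping and the generic "core" label for the remaining cells are one common convention, assumed here:

```python
# Sketch: 9-box cell assignment from performance and potential bands.
# Corner labels from the text; "core" for the middle cells is an assumption.
LEVELS = ("low", "medium", "high")

def nine_box_cell(performance, potential):
    if performance not in LEVELS or potential not in LEVELS:
        raise ValueError("bands must be low, medium, or high")
    if performance == "high" and potential == "high":
        return "star"
    if performance == "high" and potential == "low":
        return "contributor"
    if performance == "low" and potential == "high":
        return "enigma"
    if performance == "low" and potential == "low":
        return "under-performer"
    return "core"

print(nine_box_cell("high", "high"))  # star
print(nine_box_cell("low", "high"))   # enigma
```

Trending the output per employee per review cycle gives the cell-movement view the text recommends.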

52.5 Common Pitfalls

Caution: What goes wrong
  1. Headline attrition without the regretted/non-regretted split. A 22 percent attrition number with mostly low performers leaving is healthy; the same number with high performers leaving is an emergency.
  2. Engagement scores as the only engagement view. A headline number that goes up or down 0.3 points without driver decomposition tells the line manager nothing they can act on.
  3. Predictive flight-risk used for ranking adverse action. The fastest way to destroy trust in HR analytics.
  4. Salary or rating data without RLS. A privacy and compliance breach waiting to happen.
  5. Snapshot dashboards for slow-moving people metrics. Without a 12- or 24-month trend, normal noise looks like a crisis.
  6. No financial translation. 18 percent attrition is a number; 18 percent attrition costing 32 crore in loaded backfill cost is a budget conversation.
  7. Calibration views that publish manager identity. A leaderboard of who rates highest creates strategic gaming. Anonymise managers, show the distribution, identify outliers privately.
  8. Time-to-fill medians without the 90th percentile. The median looks fine while the long tail strangles project plans.
  9. Survey response rates buried. A 12 percent response rate makes the headline score noise. Show response rate prominently.
  10. HR dashboards built only for HR. Janet H. Marler & John W. Boudreau (2017) emphasise that analytics adds value when line managers read it. Build manager-facing views, not just HR-facing ones.

52.6 Illustrative Cases

Note: Three case sketches

Yuvijen Telecom regretted-attrition pivot. The HR analytics team replaces the monthly attrition tile with a Power BI dashboard that splits voluntary attrition into regretted and non-regretted. The headline number — 19 percent — barely changes, but the regretted line was rising faster than the headline implied. Drilling into manager-level tables reveals four field-circles where regretted attrition was over 30 percent, all reporting to the same regional ops director. The board approves a leadership intervention; regretted attrition in those circles drops by 11 points in two quarters.

Yuvijen Stores onboarding cohort fix. Cohort retention curves built in Tableau show that hires from a particular sourcing channel had a 35 percent drop in the first 90 days, while other channels held above 90 percent. The team rebuilds onboarding for that channel — a buddy programme, structured training, weekly check-ins — and the next two cohorts converge with the rest of the company. The retention curve is the chart that won the budget.

Yuvijen Forge Components Ltd. pay-for-performance audit. An anonymous internal complaint about pay fairness triggers an audit. The compensation team builds a Power BI scatter of rating vs total compensation, faceted by gender, function, and tenure. The chart reveals two specific bands where female engineers in the 5-8 year tenure range fell below the regression line by 8 to 12 percent. A targeted correction is funded and implemented in the next compensation cycle; the dashboard is rerun every quarter and made available (in aggregate form) to the works council.

52.7 Hands-On Exercise: Build an HR Analytics Dashboard

Note: Three-page HR dashboard

Aim. Build a three-page HR-analytics dashboard in Power BI that ties pipeline, retention, and performance to the same workforce, with the privacy and access controls the function requires. Tableau equivalents are noted.

Scenario. You are the BI lead in HR at Yuvijen Telecom. The CHRO has asked for a dashboard that lets her see, by Monday morning each week, what the recruiting funnel looks like, which cohorts and managers are at attrition risk, and whether the rating-and-pay relationship is healthy. Manager-level access must be enforced.

Deliverable. A three-page Power BI report — Pipeline, Retention, Performance — with row-level security, an anonymisation layer for distribution to skip-level managers, and a manager-friendly view that hides individual-level data while still being actionable.

52.7.1 Step 1 — Load and model the data

Use Get Data in Power BI to load five CSVs:

  • headcount.csv — EmployeeID, Function, Level, Location, JoinDate, ExitDate, Manager, Tenure, Rating, Compensation.
  • requisitions.csv — RequisitionID, Function, Level, Location, OpenDate, FilledDate, ChannelOrigin.
  • pipeline.csv — RequisitionID, Stage, CandidateCount, StageDate.
  • engagement.csv — Quarter, Function, Manager, Score, ResponseRate, DriverScores (long format).
  • flightrisk.csv — EmployeeID, RiskScore, ScoreDate (output of the model team’s notebook).

Set data types on the columns. Build a DimDate calendar table and mark it as the date table. Build a DimEmployee table with a hashed EmployeeID_Hash column for views shown above the manager line. Build a ManagerSecurity mapping table that lists, for every manager UPN, the EmployeeIDs they are allowed to see.
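One way to produce the EmployeeID_Hash column is a keyed (salted) hash applied in the ETL layer; the key and helper below are illustrative, and the real secret must be managed outside the report:

```python
import hashlib
import hmac

# Placeholder only: in production the key comes from a managed secret store.
SECRET_KEY = b"replace-with-a-managed-secret"

def hash_employee_id(employee_id: str) -> str:
    # HMAC rather than a bare hash, so IDs cannot be brute-forced
    # from the small space of plausible employee numbers.
    return hmac.new(SECRET_KEY, employee_id.encode(), hashlib.sha256).hexdigest()[:16]

h1 = hash_employee_id("E10023")
h2 = hash_employee_id("E10023")
assert h1 == h2                         # deterministic, so joins still work
assert h1 != hash_employee_id("E10024") # distinct IDs stay distinct
```

Because the hash is deterministic, EmployeeID_Hash can still key relationships between tables; because it is keyed, nobody with dashboard access can reverse it back to a person.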

52.7.2 Step 2 — Page 1: Pipeline

Build four visuals.

Talent funnel. A funnel visual with the seven stages from Sourced to Joined. A slicer lets the recruiter filter by Function and Location. Annotate the stage-to-stage conversion percentages.

Time-to-fill box plot. A box-and-whisker visual (use the marketplace visual or a calculated equivalent) per Function, with median, 25-75 IQR, and 90th percentile clearly marked.

Workforce plan vs actual. A stacked area showing planned headcount by Function, with a heavy line overlay for actual.

Cost-of-vacancy ticker. A card visual with a daily roll-up measure such as SUMX(VacantRoles, VacantRoles[LoadedDailyCost] * VacantRoles[DaysVacant] * VacantRoles[ProductivityFactor]). The number is the COO's preferred metric.

Tableau alternative: funnel as a sorted bar; box plots native; stacked area native; card as a single big-number sheet.

52.7.3 Step 3 — Page 2: Retention

Build four visuals.

Cohort retention curve. A line chart with one line per hire cohort (quarterly). Y-axis: percentage still active. X-axis: months since joining.

Attrition decomposition. A 100 percent stacked column trended monthly, with bands for Regretted Voluntary, Non-regretted Voluntary, and Involuntary.

Manager attrition table. A table with columns Manager, Headcount, Voluntary Exits 12mo, Regretted Exits 12mo, Engagement Score, sorted by regretted exit rate. Conditional formatting flags the top decile.

Engagement driver heatmap. A matrix with rows = Function, columns = Driver (Role Clarity, Manager Quality, Growth, Recognition), values = mean score, conditionally formatted.

Add a Flight-risk band card showing the count of employees scored High by the model, broken into bands by Function. Use a measure that filters out individual identity at this aggregate level; identity is only visible on a drill-through page protected by RLS.

DAX measures:

RegrettedAttrition_12mo =
// Regretted voluntary exits in the trailing 365 days,
// divided by average daily headcount over the same window.
VAR AsOf = TODAY()
VAR WindowStart = AsOf - 365
VAR RegrettedExits =
    CALCULATE(
        DISTINCTCOUNT(headcount[EmployeeID]),
        FILTER(headcount,
            headcount[ExitDate] >= WindowStart
                && headcount[ExitDate] <= AsOf
                && headcount[ExitFlag] = "Regretted Voluntary")
    )
VAR AvgDailyHeadcount =
    AVERAGEX(
        CALCULATETABLE(
            VALUES(DimDate[Date]),
            DimDate[Date] >= WindowStart && DimDate[Date] <= AsOf
        ),
        VAR CurrentDay = DimDate[Date]
        RETURN
            CALCULATE(
                DISTINCTCOUNT(headcount[EmployeeID]),
                REMOVEFILTERS(DimDate),
                FILTER(headcount,
                    headcount[JoinDate] <= CurrentDay
                        && (ISBLANK(headcount[ExitDate])
                            || headcount[ExitDate] >= CurrentDay))
            )
    )
RETURN
DIVIDE(RegrettedExits, AvgDailyHeadcount)

Tableau alternative: cohort curve as a line with cohort dimension on Colour; stacked column native; tables and heatmaps native.

52.7.4 Step 4 — Page 3: Performance

Build three visuals.

Rating distribution. A 100 percent stacked column by Function, with Rating bands (1, 2, 3, 4, 5) on the colour series.

Calibration small-multiples. A small-multiples bar chart, one panel per Manager (anonymised as M001…), showing the share of each rating. A reference line marks the function average for each rating.

Pay-for-performance scatter. Scatter with Rating on x-axis, Total Compensation on y-axis, points coloured by Function. A trend line is fitted; outliers beyond a parameter-controlled threshold (e.g., more than 1.5 standard deviations from the trend) are highlighted in red.

Tableau alternative: stacked bars native; small-multiples via Trellis; scatter with reference line and calculated outlier flag.
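The parameter-controlled outlier flag can be prototyped outside Power BI before being translated to a DAX measure or a Tableau calculated field; the ratings and compensation values below are invented:

```python
import statistics

# Sketch: flag pay-for-performance outliers as points more than
# k standard deviations of residual from a least-squares trend line.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def flag_outliers(ratings, comp, k=1.5):
    slope, intercept = fit_line(ratings, comp)
    residuals = [y - (slope * x + intercept) for x, y in zip(ratings, comp)]
    sd = statistics.pstdev(residuals)
    return [abs(r) > k * sd for r in residuals]

ratings = [1, 2, 2, 3, 3, 4, 4, 5, 5, 3]
comp    = [8, 10, 11, 13, 14, 16, 17, 19, 20, 25]  # lakh/year; last point inflated
flags = flag_outliers(ratings, comp)
print(flags)  # only the inflated last point exceeds the 1.5-sd threshold
```

The threshold `k` is the parameter the report exposes; residual-based flagging finds both well-paid low performers and low-paid high performers, which is the point of the visual.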

52.7.5 Step 5 — Row-Level Security and audit logging

Implement RLS in Power BI:

  • Manager role. On headcount: headcount[EmployeeID] IN CALCULATETABLE(VALUES(ManagerSecurity[EmployeeID]), ManagerSecurity[ManagerUPN] = USERPRINCIPALNAME()). A manager sees only their direct and skip-level reports.
  • HRBP role. Sees their assigned function and location. RLS table maps HRBP UPN to allowed Function-Location combinations.
  • CHRO role. Sees aggregate views; individual identity is replaced by hashed EmployeeID_Hash everywhere except the specific drill-through page.
  • Compensation Audit role. Time-bound role granted only during compensation reviews, audited separately.

Test by viewing the report as each role to confirm scope. Enable workspace audit logging.

52.7.6 Step 6 — Manager-friendly view

Build a fourth, lighter page targeted at line managers. It contains three things only:

  • The manager’s own team’s regretted-attrition trend.
  • The manager’s own team’s engagement driver heatmap.
  • A next-conversation card that names — only visible to that manager — up to three high-flight-risk team members and suggests prep questions.

Distribute as a Power BI app, not as a workspace, so the audience experience is curated.

52.7.7 Step 7 — Privacy review and ethics sign-off

Before publication, run a privacy review with the legal and ethics teams: confirm that the flight-risk model output is used only for retention investment, never for adverse action; confirm that the calibration small-multiples mask manager identity in views above the people leadership line; confirm that compensation outliers are handled in a structured remediation process, not in a public ranking. Record the sign-off in a metadata page on the dashboard.

Tip: Connect to the Visualisation Layer

HR analytics relies on the visualisation idioms established earlier. Cohort retention curves are the same shape as the survival curves of Chapter 25. Funnels carry the recruiting story the same way they carry marketing campaigns in Chapter 49. Heatmaps from Chapter 12 surface engagement drivers and pay-for-performance distributions. Bullet charts from Chapter 12 replace the engagement-score gauges that survey vendors keep building into their dashboards. Box plots from Chapter 21 carry time-to-fill. The privacy and RLS discipline of Chapter 36 is more central here than anywhere else because the data is uniquely sensitive. The storytelling discipline of Chapter 48 is what turns a regretted-attrition trend into a recruiting-budget conversation.

Tip: Files and Screen Recordings

Power BI three-page HR dashboard with RLS (yuvijen-telecom-hr.pbix), Tableau equivalent (yuvijen-telecom-hr.twbx), workshop dataset (yuvijen-telecom-hr-data.xlsx), separate manager-facing app build (yuvijen-telecom-manager-view.pbix), and a screen recording of the dashboard tour (yuvijen-telecom-hr-walkthrough.mp4) will be embedded here.

Summary

| Concept | Description |
| --- | --- |
| **HR-Dashboard Contract** | |
| Privacy first | Individual-level data must be aggregated, masked, or behind RLS. |
| Action over insight | 26 percent attrition is an insight; 5 specific managers is an action. |
| Trend over snapshot | People metrics move slowly; 12-24 month trends tell the story. |
| **Three HR Jobs** | |
| Talent and workforce planning | Who do we need, when, where, and at what cost? |
| Retention and engagement | Who is at risk, why are they leaving, what works to keep them? |
| Performance and productivity | Who is delivering, who is stuck, is the curve fair? |
| **Pipeline Visuals** | |
| Talent funnel | Seven-stage funnel from sourced through joined, per role family. |
| Time-to-fill box plot | Box plot per role with median and 90th percentile clearly marked. |
| Workforce plan vs actual | Stacked area of planned headcount with actual line overlay. |
| Cost-of-vacancy ticker | Loaded daily cost times days vacant times productivity factor. |
| **Attrition Decomposition** | |
| Voluntary vs involuntary | Different processes, different levers — split first. |
| Regretted vs non-regretted | Losing high performers is the actual cost. |
| By tenure band | Early-tenure attrition is an onboarding signal. |
| By performance rating | High-performer attrition is the warning light. |
| By manager | Manager-level attrition reveals leadership issues. |
| By location and function | Choropleth or grouped bar reveals localised hotspots. |
| **Cohort Retention** | |
| Active-percentage tracking | Per-cohort active percentage at 3, 6, 12, 18, 24 months. |
| Cohorts on shared axis | Cohorts stacked on the same chart compare onboarding effects. |
| Onboarding diagnostic | Reveals where intervention worked and where it did not. |
| **Engagement Analytics** | |
| Score-trend line | Quarterly headline engagement score line chart. |
| Driver heatmap | Driver factors by team or function with cell colour as score. |
| Comment-topic view | NLP topic summary of free-text survey comments. |
| Response rate prominent | Response rate must be visible alongside the headline score. |
| Driver decomposition | Decomposition by driver gives the line manager a specific lever. |
| **Predictive Flight Risk** | |
| Retention-investment use | Use predictive flight risk to guide investment, never adverse action. |
| Aggregate manager view | Managers see counts of high-risk people, not identities. |
| Audit and ethics sign-off | Access controls, audit logs, and manager training before launch. |
| **Performance Visuals** | |
| Rating distribution histogram | Histogram by function, level, manager surfaces compression and inflation. |
| Calibration small-multiples | Anonymised manager panels show calibration outliers. |
| Pay-for-performance scatter | Rating on x-axis, total compensation on y-axis, with trend line and outlier flags. |
| 9-box grid | Three-by-three performance versus potential grid with movement over time. |
| Span-of-control table | Reports per first-line manager, flagging spans too wide for coaching. |
| **Common Pitfalls** | |
| Unsplit attrition | Headline attrition without the regretted split misleads decisions. |
| Engagement without drivers | Headline engagement number without driver decomposition is unactionable. |
| Flight-risk misuse | Predictive flight risk used as a ranking for adverse action breaks trust. |
| Salary without RLS | Salary or rating data without row-level security is a compliance breach. |
| Snapshot dashboards | Snapshot dashboards for slow-moving metrics turn noise into a crisis. |
| No financial translation | 18 percent attrition is a number; 32 crore loaded backfill is a budget. |
| Named calibration | A leaderboard of who rates highest creates strategic gaming. |
| Median-only time-to-fill | Median time-to-fill looks fine while the long tail blocks projects. |
| Buried response rate | A 12 percent response rate makes the headline score noise. |
| **Hands-On Dashboard** | |
| Page 1 — Pipeline | Talent funnel, time-to-fill box plot, workforce plan, cost-of-vacancy. |
| Page 2 — Retention | Cohort retention curve, attrition decomposition, manager table, driver heatmap. |
| Page 3 — Performance | Rating distribution, calibration small-multiples, pay-for-performance scatter. |
| Role-based RLS | Manager, HRBP, CHRO, and Compensation-Audit roles with separate scopes. |
| Manager-friendly view | Manager-only app with regretted-attrition trend, drivers, conversation prompts. |
| Privacy and ethics sign-off | Recorded sign-off in metadata page before publication. |