I have become increasingly concerned with the new-jobs calculation in the BLS Monthly Jobs Report; the report is not at all what it is cracked up to be.

My argument covers the report's structure, how the data is gathered, and corroborating measures that test its validity.

Let’s start with the components of the report:

  • The Establishment Survey of Participating Businesses
  • The Household Survey of Random Calls
  • The Birth/Death Model calculation input

The establishment survey is a voluntary system in which participating organizations submit monthly reports on employment and job openings for select roles to the BLS.

The Birth/Death model is designed to predict the number of companies that “died” and released their employees and the number of new companies that were “born” and began hiring employees. It relies on historical data and is a projection.

BLS collects household surveys either in person or by telephone. The most basic question is, “How many adults in the household are employed?”

Unfortunately, these inputs have degraded over time, yielding increasingly inaccurate results. The proof is in the figures below, and the picture is not pretty.

Approximately 120,000 organizations currently participate in the establishment survey, and about 43% provide current employment data monthly.

Ten years ago, the monthly participation rate in CES was 63%. This steep drop of 20 percentage points, roughly a one-third relative decline, has reduced the quality of this measurement.
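To put those figures in perspective, here is a quick back-of-the-envelope calculation using the numbers cited above (the assumption that the panel was also roughly 120,000 establishments ten years ago is mine, for illustration only):

```python
# Back-of-the-envelope check on the CES participation figures cited above.
panel_size = 120_000   # establishments in the survey (assumed constant)
rate_now = 0.43        # share reporting current data monthly today
rate_then = 0.63       # share reporting monthly ten years ago

monthly_reporters_now = panel_size * rate_now
monthly_reporters_then = panel_size * rate_then

pp_drop = (rate_then - rate_now) * 100                    # percentage-point drop
relative_drop = (rate_then - rate_now) / rate_then * 100  # relative decline

print(f"Monthly reporters then: {monthly_reporters_then:,.0f}")  # 75,600
print(f"Monthly reporters now:  {monthly_reporters_now:,.0f}")   # 51,600
print(f"Drop: {pp_drop:.0f} percentage points, {relative_drop:.0f}% relative")
```

The distinction matters: a "20% drop" understates the change. Losing 20 percentage points off a 63% base means the survey now hears back from about a third fewer establishments each month than it once did.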

This issue also flows into the Birth/Death data. The model's output used to account for less than 30% of the total reported job gains; now it represents nearly 50%, adding significant distortion, or "noise," to the monthly jobs report.
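A short sketch makes the shift concrete. The 200,000 headline figure below is hypothetical (it is not from any specific report); only the 30% and 50% shares come from the discussion above:

```python
# Illustrative only: how a growing Birth/Death share changes what portion
# of a headline jobs number is modeled rather than surveyed.
headline_gain = 200_000   # hypothetical monthly headline job gain

model_share_then = 0.30   # model's share of the total a decade ago
model_share_now = 0.50    # model's share today

modeled_then = headline_gain * model_share_then
modeled_now = headline_gain * model_share_now

print(f"Modeled (not surveyed) jobs then: {modeled_then:,.0f}")  # 60,000
print(f"Modeled (not surveyed) jobs now:  {modeled_now:,.0f}")   # 100,000
```

On the same headline number, the projected component grows from 60,000 to 100,000 jobs, meaning half the reported gain rests on a historical extrapolation rather than on responses from actual establishments.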

The household survey has its own problems, with response rates falling below 50% in most months. One recurring issue is that many mobile carriers flag the call as coming from an Unknown Number, which many people will not answer. Phone calls are not the only method the BLS uses to gather household information, but they are a very important one.

Accordingly, the monthly jobs report’s data quality has measurably eroded over the last decade. The question is, to what degree?

I look to several measures for corroboration. These include BLS revisions of prior months' data, wages, the workforce participation rate, the number of workers picking up part-time jobs for economic reasons, the JOLTS quit rate, and the unemployment rate. Unfortunately, they tell a different story.

  • Negative revisions to the initial Jobs Reports for the prior three months total 20,000
  • The workforce participation rate fell to 62.5%
  • The number of workers taking part-time jobs for economic reasons increased by 1M YoY
  • The BLS JOLTS quit rate dropped or stayed steady in most states
  • The Unemployment Rate hit 4% for the first time in 27 months

If the jobs report is correct, I must ask the following questions:

  • Why are the revisions to the Jobs Report down and not up?
  • How did the Workforce Participation rate decline instead of increase?
  • Why are millions working part-time jobs on top of their primary jobs? Are those jobs double counted?
  • Who took the new jobs if the quit rate dropped or did not move?
  • Why are wages down if the jobs market is so hot?
  • How did the unemployment rate rise in the face of more available jobs?

Although these measures are imprecise, none of them positively correlates with a robust jobs market. I concede that there may be plausible reasons for these discrepancies, but they do not add up for me.

I do not believe the BLS is “cooking” the numbers or is incompetent. However, I believe the methodology behind the monthly jobs report has become compromised for various reasons, and I no longer have high confidence in its validity.

Organizational and HR leaders should be cautious about using the monthly jobs report as a highly accurate reading of the actual jobs market, the relative availability of human capital, or a basis for planning labor acquisition costs.