Tag Archives: Methodology

New Recommendations on Improving Data on Contingent and Alternative Work Arrangements

The workplace is changing. We have seen more evidence of that in recent months as workplaces have adapted to the COVID-19 pandemic. Even before the pandemic, many of us wanted to learn more about telework, flexible work hours, and independent contracting. We also wanted to know more about intermittent or short-term work found through mobile devices, unpredictable work schedules, and other employment relationships we might not think of as traditional. It’s our job at BLS to keep up with these new work relationships and figure out how to measure them.

In 2018, we released data collected in 2017 about people in contingent and alternative work arrangements. Contingent workers are people who do not expect their jobs to last or who report their jobs are temporary. Alternative work arrangements include independent contractors, on-call workers, temporary help agency workers, and workers provided by contract firms. We also published data in 2018 about electronically mediated work. All of these data reflect the rapidly changing workplace.

Those reports received a lot of attention, but policymakers, employers, researchers, and others told us they want to know more about these nontraditional workers. We need to understand people in jobs that often involve doing short-term tasks, such as ridesharing or data-entry services. Our 2017 survey included a few questions about these arrangements, but this work can be complex and varied. That makes it hard to measure nontraditional work arrangements with just a few questions.

To effectively analyze these hard-to-measure work arrangements, BLS sought out experts on nontraditional work. In 2019, we contracted with the Committee on National Statistics to explore what we should measure if we had the funding to collect and publish more data about these workers. We asked the committee not to recommend changes to the main Current Population Survey, the large monthly survey of U.S. households from which we measure the unemployment rate and other important labor market measures. The committee had free rein, however, to recommend topics we should examine in any future edition of the Contingent Worker Supplement to the Current Population Survey. We also asked the committee to recommend changes to the survey design and methods of data collection if we were to conduct the supplement again.

The Committee on National Statistics is a federally supported independent organization whose mission is to improve the statistical methods and information that guide public policies. The committee moved quickly to form a group of experts on the relevant topics. I asked these experts to review the Contingent Worker Supplement and consider other sources of information on nontraditional work arrangements. The group was impressive and included a former BLS Commissioner, a former Administrator of the U.S. Department of Labor Wage and Hour Division, and several experts in economics and survey methods. They all volunteered their time to help us improve the Contingent Worker Supplement.

The group held public meetings and a workshop, hearing from experts, data users, and policymakers to understand what data would be the most valuable. At the end of their year-long review, in July 2020, they produced a report with specific recommendations about measurement objectives and data collection.

BLS thanks the Committee on National Statistics and the expert panel for the time and effort they put into the report. Their recommendations thoughtfully balanced the desire to measure everything about this important topic with the limited time and information survey respondents can give us. In the coming months, we will study the report. It will guide us as we consider how to update the Contingent Worker Supplement to reflect the variety of work arrangements in the U.S. labor market.

Update on the Misclassification that Affected the Unemployment Rate

How hard can it be to figure out whether a person is employed or unemployed? Turns out, it can be hard. When BLS put out the employment and unemployment numbers for March, April, and May 2020, we also provided information about misclassification of some people. I want to spend some time to explain this issue, how it affected the data, and how we are addressing it.

In the monthly Current Population Survey of U.S. households, people age 16 and older are placed into one of three categories:

  • Employed — they worked at least one hour “for pay or profit” during the past week.
  • Unemployed — they did not work but actively looked for work during the past 4 weeks OR they were on temporary layoff and expect to return to work.
  • Not in the labor force — everyone else (including students, retirees, those who have given up their job search, and others).

Again, how hard can this be? It starts to get tricky when we talk to people who say they have a steady job but did not work any hours during the past week. In normal times, this might include people on vacation, home sick, or on jury duty. And we would continue to count them as employed. But during the COVID-19 pandemic, the collapse of labor markets created challenges the likes of which BLS has never encountered. People who reported zero hours of work offered such explanations as “I work at a sports arena and everything is postponed” or “the restaurant I work at is closed.” These people should be counted as unemployed on temporary layoff. As it turns out, a large number of people—we estimate about 4.9 million in May—were misclassified.
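To make those categories concrete, here is a deliberately simplified sketch of the classification logic described above. The function and its parameter names are illustrative only (they are not CPS questionnaire items or production code), and it ignores many real-world details, but it shows why the recorded reason for an absence matters so much.

```python
def classify(worked_last_week, has_job_but_absent, absence_reason,
             expects_recall, looked_for_work_past_4_weeks):
    """Toy sketch of the three CPS labor force categories for people age 16 and older."""
    if worked_last_week:
        return "employed"                    # at least one hour "for pay or profit"
    if has_job_but_absent:
        if absence_reason == "on layoff (temporary or indefinite)":
            if expects_recall:
                return "unemployed"          # unemployed on temporary layoff
        else:
            return "employed"                # vacation, illness, jury duty, "other reasons", ...
    if looked_for_work_past_4_weeks:
        return "unemployed"                  # actively searched during the past 4 weeks
    return "not in the labor force"          # students, retirees, discouraged workers, ...

# A restaurant worker whose workplace is closed because of the pandemic should have the
# absence recorded as a temporary layoff, which routes them to "unemployed" ...
print(classify(False, True, "on layoff (temporary or indefinite)", True, False))
# ... while recording the same absence under "other reasons" leaves them counted as employed.
print(classify(False, True, "other reasons", False, False))
```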

With the onset of the COVID-19 pandemic, the unemployment rate—at a 50-year low of 3.5 percent in February—rose sharply to 4.4 percent in March and to 14.7 percent in April, before easing to 13.3 percent in May. Despite the stark difference from February, we believe the unemployment rate likely was higher than reported in March, April, and May. As stated in our Employment Situation news releases for each of those months, some people in the Current Population Survey (also known as the CPS or household survey) were classified as employed but probably should have been classified as unemployed.

How did the misclassification happen?

We uncovered the misclassification because we saw a sharp rise in the number of people who were employed but were absent from their jobs for the entire reference week for “other reasons.” The misclassification hinges on how survey interviewers record answers to a question on why people who had a job were absent from work the previous week.

According to special pandemic-related interviewer instructions for this question, answers from people who said they were absent because of pandemic-related business closures should have been recorded as “on layoff (temporary or indefinite).” Instead, many of these answers were recorded as “other reasons.” Recording these answers as “on layoff (temporary or indefinite)” ensures that people are asked the follow-up questions needed to classify them as unemployed. It does not necessarily mean they would be classified as unemployed on temporary layoff, but I’ll get into that in a moment.

When interviewers record a response of “other reasons” to this question, they also add a few words describing that other reason. BLS reviewed these descriptions to better understand the large increase in the number of people absent from work for “other reasons.” Our analysis suggests this group of people included many who were on layoff because of the pandemic. They would have been classified as unemployed on temporary layoff had their answers been recorded correctly.
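The kind of review described here could, in principle, be partly automated. The sketch below is only an illustration of the idea; the keyword list and function are hypothetical, not the procedure BLS actually used.

```python
# Illustrative only: flag "other reasons" free-text descriptions that look pandemic related.
PANDEMIC_HINTS = ("covid", "coronavirus", "pandemic", "closed", "shut down", "postponed")

def looks_like_pandemic_layoff(description):
    text = description.lower()
    return any(hint in text for hint in PANDEMIC_HINTS)

descriptions = [
    "I work at a sports arena and everything is postponed",
    "the restaurant I work at is closed",
    "taking care of a sick relative",
]
# The first two descriptions would be flagged as likely pandemic-related layoffs.
print([d for d in descriptions if looks_like_pandemic_layoff(d)])
```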

What are BLS and the Census Bureau doing to address the misclassification?

BLS and our partners at the U.S. Census Bureau take misclassification very seriously. We’re taking more steps to fix this problem. (The Census Bureau is responsible for collecting the household survey data, and BLS is responsible for analyzing and publishing the labor market data from the survey.) Both agencies are continuing to investigate why the misclassification occurred.

Before the March data collection, we anticipated some issues with certain questions in the survey because of the unprecedented nature of this national crisis. As a result, interviewers received special instructions on how to record answers to the temporarily absent question if a person said they had a job but did not work because of the pandemic. Nevertheless, we determined that not all of the responses to this question in March were coded according to the special instructions. Therefore, before the April data collection, all interviewers received an email with more detailed instructions and examples, along with a reference table to help them code responses to this question. However, the misclassification was still evident in the April data. Before the May data collection, every field supervisor held a conference call with the interviewers they manage. In these calls, the supervisors reviewed the detailed instructions, provided examples to clarify them, and answered interviewers’ questions.

Although we noticed some improvement in May, the misclassification persisted. Therefore, we have taken more steps to correct the problem. Before the June data collection, the Census Bureau provided interviewers with additional training to review the guidance, along with extra training aids. The electronic survey questionnaire also now includes new special instructions that will be more accessible during survey interviews.

Why doesn’t BLS adjust the unemployment rate to account for the misclassification?

As I explained above, we know some workers classified as absent from work for “other reasons” are misclassified. People have asked why we don’t just reclassify these people from employed to unemployed. The answer is that there is no easy correction we could have made. Changing a person’s labor force classification would involve more than changing the response to the question about why they were absent from their jobs.

Although we believe many responses to the question on why people were absent from their jobs appear to have been incorrectly recorded, we do not have enough information to reclassify each person’s labor force status. To begin with, we don’t know the exact information provided by the person responding to the survey. We know the brief descriptions included in the “other reasons” category often appear to go against the guidance provided to the survey interviewers. But we don’t have all of the information the respondent might have provided during the interview.

Also, we don’t know the answers to the questions respondents would have been asked if their answers to the question on the reason not at work had been coded differently. This is because people whose answers were recorded as absent from work for “other reasons” were not asked the follow-up questions needed to determine whether they should be classified as unemployed. Specifically, we don’t know whether they expected to be recalled to work and whether they could return to work if recalled. Therefore, shifting people’s answers from “other reasons” to “on layoff (temporary or indefinite)” would not have been enough to change their classification from employed to unemployed. We would have had to assume how they would have responded to the follow-up questions. Had we changed answers based on wrong assumptions, we would have introduced more error.

In addition, our usual practice is to accept data from the household survey as recorded. In the 80-year history of the household survey, we do not know of any actions taken on an ad hoc basis to change respondents’ answers to the labor force questions. Any ad hoc adjustment we could have made would have relied on assumptions instead of data. If BLS were to make ad hoc changes, it could also appear we were manipulating the data. That’s something we’ll never do.

How much did the misclassification affect the unemployment rate?

We don’t know the exact extent of this misclassification. To figure out what the unemployment rate might have been if there were no misclassification, we have to make some assumptions. These assumptions involve deciding (1) how many people in the “other reasons” category actually were misclassified, (2) how many people who were misclassified expected to be recalled, and (3) how many people who were misclassified were available to return to work.

In the material that accompanied our Employment Situation news releases for March, April, and May, we provided an estimate of the potential size of the misclassification and its impact on the unemployment rate. Here we assumed all of the increase in the number of employed people who were not at work for “other reasons,” when compared with the average for recent years, was due solely to misclassification. We also assumed all of these people expected to be recalled and were available to return to work.

For example, there were 5.4 million workers with a job but not at work who were included in the “other reasons” category in May 2020. That was about 4.9 million higher than the average for May 2016–19. If we assume this 4.9 million increase was entirely due to misclassification and all of these misclassified workers expected to be recalled and were available for work, the unemployment rate for May would have been 16.4 percent. (For more information about this, see items 12 and 13 in our note for May. We made similar calculations for March and April.)
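As a rough check of that arithmetic, the sketch below reproduces the upper-bound calculation. The unemployment level and labor force size are approximate figures from the published May 2020 Employment Situation (about 21.0 million unemployed and 158.2 million in the labor force); they are assumptions supplied here for illustration, not numbers taken from this post.

```python
# Upper-bound sketch for May 2020 (all figures in millions).
unemployed = 21.0        # published count of unemployed, May 2020 (approximate, assumed)
labor_force = 158.2      # published civilian labor force, May 2020 (approximate, assumed)
misclassified = 4.9      # excess "with a job, not at work, other reasons" (from the text)

official_rate = 100 * unemployed / labor_force
# Reclassifying workers from employed to unemployed leaves the labor force unchanged.
upper_bound_rate = 100 * (unemployed + misclassified) / labor_force

print(f"Official rate: {official_rate:.1f}%")      # about 13.3 percent
print(f"Upper bound:   {upper_bound_rate:.1f}%")   # about 16.4 percent
```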

These broad assumptions represent the upper bound of our estimate of misclassification. They result in the largest number of people being classified as unemployed and the largest increase in the unemployment rate. However, these assumptions probably overstate the size of the misclassification. It is unlikely that everyone who was misclassified expected to be recalled and was available to return to work. It is also unlikely that all of the increase in the number of employed people not at work for “other reasons” was due to misclassification. Some people may be correctly classified in the “other reasons” category. For example, someone who owns a business (and does not have another job) is classified as employed in the household survey. Business owners who are absent from work due to labor market downturns (or in this case, pandemic-related business closures) should be classified as employed but absent from work for “other reasons.”

Regardless of the assumptions we might make about misclassification, the trend in the unemployment rate over the period in question is the same; the rate increased in March and April and eased in May. BLS will continue to investigate the issue, attempting both to ensure that data are correctly recorded in future months and to provide more information about the effect of misclassification on the unemployment rate.

When Worlds Converge: Statistics Agencies Learning from Each Other during the Pandemic

We never know when our worlds are going to converge. I have used this blog to tell you about how BLS operations are continuing—and changing—due to the COVID-19 pandemic. I also plan to tell you about our international activities and will continue writing about the BLS Consumer Price Index (CPI) and other programs. Today, all three of these topics converge into one.

The COVID-19 pandemic has compelled BLS and statistical agencies worldwide to examine our processes and concepts to ensure the information we collect and publish reflects current conditions. For BLS, this means suspending all in-person data collection and relying on other methods, including telephone, internet, and email. Adding to our toolbox, BLS is now piloting video data collection. To be flexible, we have changed some collection procedures to accommodate current conditions. For example, we are now doing all of our work at home instead of in our offices. We are learning more every day about teleworking more effectively, and we are training our staff as we learn.

Once we collect the data, we are examining how we need to adapt our processing and publication. Will our typical procedures to account for missing data still apply? Will seasonal patterns in the data change due to COVID-19? Will we be able to publish the level of detail our data users have come to expect? These and more are open questions. We will make informed decisions as we learn more about the pandemic’s impact on our data and operations. What I do know is that BLS has a long practice of sharing its procedures and methods, including any changes. We already have extensive information about COVID-19 on the BLS website, and we continue to update that information. We also provide program-specific information with each data release to alert users to any unique circumstances in the data.

Since BLS has long been known for producing gold-standard data, information about our procedures and methods is also of great interest to our international colleagues. In fact, BLS has helped statistical organizations throughout the world with the collection, processing, analysis, publication, and use of economic and labor statistics for more than 70 years. We provide this assistance primarily through our Division of International Technical Cooperation, which strengthens statistical development by organizing seminars, consultations, and meetings for international visitors with BLS staff. This division also serves as the main point of contact for the many international statistical organizations that compile information, publish comparable statistics worldwide, share concepts and definitions, and work to incorporate improvements and innovations.

A hallmark of our international activities has been onsite seminars at BLS, often attended by a multinational group of statistical experts and those working to become experts. At these seminars, BLS technical staff present details on every aspect of statistical programs, including concept development, sampling, data collection, estimation procedures, publishing, and more. In recent years, funding, travel restrictions, and other limitations have reduced the number of in-person events, replaced to some extent by virtual events. And of course, the current COVID-19 pandemic and related travel restrictions mean all such events are now being held virtually. But they still go on.

Recently, our international operations converged with our COVID-19 response when the International Technical Cooperation staff set up a virtual meeting between BLS staff primarily from our Consumer Price Index program and their counterparts at India’s Ministry of Statistics and Programme Implementation (MOSPI). They met to discuss challenges in producing consumer price data during the ongoing pandemic. The discussion was largely about methodology: what to do with missing prices and how to adjust weights to reflect real-time shifts in spending that consumers are making in response to the pandemic. It is helpful to hear from worldwide colleagues who are facing similar challenges. These issues are unprecedented, and we know the potential solutions for one country may not be ideal for the nuanced conditions in another country.

In India, for instance, commerce has been limited to essential commodities—food, fuel, and medicine. This will likely leave MOSPI unable to publish some indexes. While this is unfortunate, it is fairly straightforward in the present: they can’t publish what they don’t have. It gets more complicated a year from now. What does it mean to have an annual price change when the denominator is missing? The BLS CPI deals with this by having a fairly robust imputation system—basically “borrowing” price change from similar areas and items—but we will be monitoring the situation closely to make sure our assumptions about what is similar remain valid.
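A stripped-down version of that “borrowing” idea looks like the sketch below. This is an illustration of cell-relative imputation in general, not the production CPI methodology: the missing item’s one-month price relative is imputed as the geometric mean of the relatives observed for similar items.

```python
from math import prod

def impute_relative(observed_relatives):
    """Geometric mean of observed price relatives (current price / previous price)."""
    return prod(observed_relatives) ** (1.0 / len(observed_relatives))

# Relatives observed this month for items judged similar to the missing one (illustrative values):
similar_items = [1.02, 0.99, 1.04]
borrowed_change = impute_relative(similar_items)
print(round(borrowed_change, 3))   # about 1.016, i.e., a borrowed price increase of roughly 1.6 percent
```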

One advantage BLS has over MOSPI is that we are able to collect data by telephone, email, or the web. MOSPI has traditionally done only in-person collection. Both agencies are transitioning to different modes of collection, but we have significantly more experience doing so.

Sharing information with our international colleagues about the CPI, other programs, and our COVID-19 experience is a key part of the BLS mission. These worlds continue to converge, not just during organized meetings but also on websites and wikis maintained by statistical organizations and through participation in expert groups and conferences. For example, the United Nations Economic Commission for Europe hosts a “statswiki” that currently has pages dedicated to COVID-19 and Official Statistics. It is a small world after all, and the worldwide social distancing we are all experiencing makes it clear that we are all in this together. And together, BLS and our international colleagues, reacting to COVID-19 and making adjustments to consumer price indexes and other statistics, will continue to provide vital information that tracks changes in the world economy.

Projected Occupational Openings: Where Do They Come From?

Toward the beginning of each school year, BLS issues a new set of Employment Projections, looking at projected growth and decline in occupations over the next decade. These estimates are important for understanding structural changes in the workforce over time. But to identify opportunities for new workers, we need to look beyond occupational growth and decline, to a concept we call “occupational openings.”

Occupational openings are the sum of the following:

  • Projected job growth (or decline)
  • Occupational separations — workers leaving an occupation, which includes:
    • Labor force exits — workers who leave the labor force entirely, perhaps to retire
    • Occupational transfers — workers who leave one occupation and transfer to a different occupation.

This video explains the concept of occupational openings further.

BLS publishes the projected number of occupational openings for over 800 occupations. Not surprisingly, some of the largest occupations in the country have some of the largest numbers of openings. For example, certain food service jobs, which include fast food workers, are projected to have nearly 800,000 openings per year over the next decade. I guess this isn’t a surprise in an occupation with over 3.7 million workers.

But when we delve into the information on occupational openings a little further, more stories emerge. Some related occupations have very different patterns of openings. And some occupations have similar levels of openings for different reasons. Let’s take a look at a few examples.

In 2018, there were over 800,000 lawyers in the U.S. and a projected 45,000 annual openings for lawyers, about 5.5 percent of employment. At the same time, there were fewer than half as many paralegals and legal assistants (325,000), with projected annual openings of about 40,000, or 12.4 percent of employment. These two related occupations had similar numbers of projected openings, but those openings represented different proportions of current employment. Such differences reflect required education, demographics, compensation, and other variables. Lawyers tend to have professional degrees that are specialized for that occupation and are therefore more closely tied to their occupation than paralegals, who have more diverse educational backgrounds. You can find out more about how worker characteristics affect these numbers in the Monthly Labor Review.

Now let’s look at the sources of occupational openings. In this first example, we compare two occupational groups: installation, maintenance, and repair occupations and healthcare support occupations. These are broad categories that include a number of different individual occupations.

Average annual occupational openings for installation, maintenance, and repair occupations and healthcare support occupations, 2018–28

Editor’s note: Data for this chart are available in the table below.

In this example, both occupational groups have projected annual openings of a little over 600,000 per year, yet they come from different sources. Two-thirds of the openings among installation occupations result from workers leaving to go to other occupations; in contrast, just under half the openings among healthcare support occupations are from people moving to other occupations. Looking at projected job growth, BLS projects that healthcare support occupations, the fastest growing occupational group, will add more than three times as many new jobs as installation occupations, annually over the next decade (78,520 versus 23,320).
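A quick arithmetic check using the figures from the first table at the end of this post shows where those shares come from. The numbers are taken from that table; the code itself is only an illustration.

```python
# Average annual figures, 2018-28, from the table below.
groups = {
    "Installation, maintenance, and repair": {"growth": 23_320, "exits": 195_700, "transfers": 413_900},
    "Healthcare support":                    {"growth": 78_520, "exits": 235_500, "transfers": 299_600},
}

for name, g in groups.items():
    openings = g["growth"] + g["exits"] + g["transfers"]          # growth plus separations
    transfer_share = g["transfers"] / openings
    print(f"{name}: {openings:,} openings per year, {transfer_share:.0%} from transfers")

# Installation: 632,920 openings, about 65 percent (roughly two-thirds) from transfers.
# Healthcare support: 613,620 openings, about 49 percent (just under half) from transfers.
```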

Now let’s look at two individual occupations — web developers and court, municipal, and license clerks. These are very different jobs, but both are projected to have about 15,000 annual openings over the next decade. Here, too, occupational openings come from very different places, as this chart shows:

Average annual occupational openings for web developers and court, municipal, and license clerks, 2018–28

Editor’s note: Data for this chart are available in the table below.

In this case, around 67 percent of openings for web developer jobs come from workers transferring to other jobs, compared with about 49 percent for clerks. But a greater share of clerks are exiting the labor force. Once again, differences are due to a variety of factors, although the age of workers is a significant one in this case — web developers have a median age of 38.3, while clerks tend to be older, with a median age of 49.1. Younger workers are more likely to transfer occupations, while older workers are more likely to exit the labor force, often to retire.

So what does all this really mean? If nothing else, you can see that the thousands of individual data elements available through the BLS Employment Projections program tell a thousand different stories, and more. Whether large or small, growing or declining, there’s information about hundreds of occupations that can be helpful to students looking for careers, counselors helping those students and others, workers wanting to change jobs, employers thinking about their future, policymakers considering where to put job training resources, and on and on. These examples just scratch the surface of what BLS Employment Projections information can tell us. Take a look for yourself.

Average annual occupational openings, 2018–28

Occupation                                           Employment growth      Exits    Transfers
Installation, maintenance, and repair occupations               23,320    195,700      413,900
Healthcare support occupations                                  78,520    235,500      299,600

Average annual occupational openings, 2018–28

Occupation                                           Employment growth      Exits    Transfers
Web developers                                                   2,090      2,900       10,100
Court, municipal, and license clerks                               670      7,000        7,300

BLS Learns from Civic Digital Fellows

In the few months that I’ve had the pleasure of occupying the Commissioner’s seat at the Bureau of Labor Statistics, it’s been clear that I’m surrounded by a smart, dedicated, and innovative staff who collect and publish high-quality information while working to improve our products and services to meet the needs of customers today and tomorrow. And soon after I arrived, we added to that high-quality staff by welcoming a cadre of Civic Digital Fellows to join us for the summer.

Now in its third year, the Civic Digital Fellowship program was designed by college students for college students who want to put their data science skills to use helping federal agencies solve problems, introduce innovations, and modernize functions. This year, the program brought 55 fellows to DC and placed them in 6 agencies: the Census Bureau, Citizenship and Immigration Services, the General Services Administration, Health and Human Services, the National Institutes of Health, and BLS. From their website:

“A first-of-its-kind technology, data science, and design internship program for innovative students to solve pressing problems in federal agencies.”

BLS hosted 9 Civic Digital Fellows for summer 2019. Here are some of their activities.

  • Classification of data is a big job at BLS. Almost all of our statistics are grouped by some classification system, such as industry, occupation, product code, or type of workplace injury. Often the source data for this information is unstructured text, which must then be translated into codes. This can be a tedious, manual task, but not for Civic Digital Fellows. Andres worked on a machine learning project that took employer files and classified detailed product names (such as cereal, meat, and milk from a grocery store) into categories used in the Producer Price Index. Vinesh took employer payroll listings with very specific job titles and identified occupational classifications used in the Occupational Employment Statistics program. And Michell used machine learning to translate purchases recorded by households in the Consumer Expenditure Diary Survey into codes for specific goods and services. (A minimal sketch of this kind of text autocoding appears after this list.)
  • We are always looking to improve the experience of customers who use BLS information, and the Civic Digital Fellows provided a leg up on some of those activities. Daniel used R and Python to create a dashboard that pulled together customer experience information, including phone calls and emails, internet page views, social media comments, and responses to satisfaction surveys. Olivia used natural language processing to develop a text generation application to automatically write text for BLS news releases. Her system expands on previous efforts by identifying and describing trends in data over time.
  • BLS staff spend a lot of time reviewing data before the information ends up being published. While such review is more automated than in the past, the Civic Digital Fellows showed us some techniques that can revolutionize the process. Avena used Random Forest techniques to help determine which individual prices collected for the Consumer Price Index may need additional review.
  • Finally, BLS is always on the lookout for additional sources of data, to provide new products and services, improve quality, or reduce burden on respondents (employers and households). Christina experimented with unit value data to determine the effect on export price movements in the International Price Program. Somya and Rebecca worked on separate projects that both used external data sources to improve and expand autocoding within the Occupational Requirements Survey. Somya looked at data from a private vendor to help classify jobs, while Rebecca looked at data from a government source to help classify work tasks.
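For readers curious about what this kind of text autocoding looks like in practice, here is a minimal sketch assuming scikit-learn is available. The categories, training examples, and model choice are hypothetical; this is not the fellows’ actual code, just the general shape of the approach.

```python
# Minimal sketch: map free-text product descriptions to index categories.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_text = ["corn flakes 18 oz", "whole milk gallon", "ground beef 80/20", "oat cereal family size"]
train_labels = ["cereal", "milk", "meat", "cereal"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),   # character n-grams handle messy text
    LogisticRegression(max_iter=1000),
)
model.fit(train_text, train_labels)

print(model.predict(["2% milk half gallon"]))   # likely ["milk"]
```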

The Civic Digital Fellows who worked at BLS in summer 2019

Our cadre of fellows has completed their work at BLS, with some entering grad school and others the working world. But they left a lasting legacy. They’ve gotten some publicity for their efforts. Following their well-attended “demo day” in the lobby at BLS headquarters, some of their presentations and computer programs are available to the world on GitHub.

I think what most impressed me about this impressive bunch of fellows was the way they grasped the issues facing BLS and focused their work on making improvements. I will paraphrase one fellow who said “I don’t want to just do machine learning. I want to apply my skills to solve a problem.” Another heaped praise on BLS supervisors for “letting her run” with a project with few constraints. We are following up on all of the summer projects and have plans for further research and implementation.

We ended the summer by providing the fellows with some information about federal job opportunities. I have no doubt that these bright young minds will have many opportunities, but I also saw an interest in putting their skills to work on real issues facing government agencies like BLS. I look forward to seeing them shine, whether at BLS or wherever they end up. I know they will be successful.

And, we are already making plans to host another group of Civic Digital Fellows next summer.