
Why This Counts: Breaking Down Multifactor Productivity

Productivity measures tell us how much better we are at using available resources today compared to years past. All of us probably think about our own productivity levels every day, either in the workplace or at home. I find my own productivity is best in the morning, right after that first cup of coffee!

On a larger scale, here at the U.S. Bureau of Labor Statistics, we produce two types of productivity measures: labor productivity and multifactor productivity, which we will call “MFP” for short. An earlier Why This Counts blog post focused on labor productivity and its impact on our lives. In this blog we will focus on why MFP measures matter to you.

Why do we need two types of productivity measures?

Labor productivity compares the amount of goods and services produced—what we call output—to the number of labor hours used to produce those goods and services.

Multifactor productivity differs from labor productivity by comparing output not just to hours worked, but to a combination of inputs.

What are these combined inputs?

For any given industry, the combined inputs include labor, capital, energy, materials, and purchased services. MFP tells us how much more output can be produced without increasing any of these inputs. The more efficiently an industry uses its combination of inputs to create output, the faster MFP will grow. MFP gives us a broader understanding of how we are all able to do more with less.
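
To make the difference concrete, here is a minimal sketch using made-up index values. Actual BLS measures are built from Törnqvist index formulas and detailed cost-share weights, so treat this only as an illustration of the two ratios:

```python
# Illustrative only: hypothetical index values (base year = 100), not BLS data.
output = 110.0           # index of goods and services produced
hours = 102.0            # index of labor hours worked
combined_inputs = 105.0  # index combining labor, capital, energy, materials,
                         # and purchased services

labor_productivity = 100 * output / hours                  # output per hour worked
multifactor_productivity = 100 * output / combined_inputs  # output per unit of combined inputs

print(round(labor_productivity, 1))        # 107.8
print(round(multifactor_productivity, 1))  # 104.8
```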

Does MFP tell us anything about the impact of technology?

It does. But we cannot untangle the impact of technology from other factors. MFP describes the growth in output that is not a result of using more of the inputs that we can measure. In other words, MFP represents what is left, the sources of growth that we cannot measure. These include not just technology improvements but also changes in factors such as management practices and the scale or organization of production. Put simply, MFP uses what we do know to learn more about what we want to know.

What can MFP tell us about labor productivity?

Labor productivity goes up when output grows faster than hours. But what exactly causes output to grow faster than hours? Labor productivity can grow because workers have more capital or other inputs or their job skills have improved. Labor productivity also may grow because technology has advanced, management practices have improved, or there have been returns to scale or other unmeasured influences on production. MFP statistics help us capture these influences and measure their impact on labor productivity growth.
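
As a rough illustration of how these pieces fit together, labor productivity growth can be decomposed into MFP growth plus the contributions of measured inputs per hour worked. The numbers below are invented for the example, not published BLS estimates:

```python
# Hypothetical annual growth contributions, in percentage points.
mfp_growth = 1.0                 # the unexplained residual: technology, management
                                 # practices, scale, and other unmeasured factors
capital_intensity_contrib = 0.8  # workers have more capital and other inputs per hour
labor_composition_contrib = 0.2  # shifts toward more experienced or educated workers

labor_productivity_growth = (mfp_growth
                             + capital_intensity_contrib
                             + labor_composition_contrib)
print(labor_productivity_growth)  # 2.0 percent per year
```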

How are MFP statistics used?

We can identify the sources of economic growth by comparing MFP with the inputs of production. This is true for individual industries and the nation as a whole.

For example, a lot has been written about the decline of manufacturing in the United States. Manufacturing MFP increased by an average of 2.0 percent per year between 1992 and 2004. In contrast, it declined by an average of 0.3 percent per year from 2004 through 2016. A recently published article uses detailed industry data to analyze the sources of this productivity slowdown.

MFP is a valuable tool for exploring historical growth patterns, setting policies, and charting the potential for future economic growth. Businesses, industry analysts, and government policymakers use MFP statistics to make better decisions.

Where can I go to learn more?

Check out the most recent annual news release to see the data firsthand!

If you have a specific question, you might find it answered in our Frequently Asked Questions. Or you can always contact MFP staff through email or call (202) 691-5606.

Just like your own productivity at work and at home, the nation's productivity growth can lead to improvements in our standard of living and economic well-being. Productivity is an important economic indicator that is often overlooked. We hope this blog has helped you learn more about the value of MFP!

BLS Measures Electronically Mediated Work

Are you a ride-share driver using a mobile app (like Uber or Lyft) to find customers? Maybe you do household chores or yardwork for others by finding short-term jobs through a website (such as TaskRabbit or Handy) that arranges the payment for your work. Or perhaps you perform online tasks, like taking surveys or adding descriptive keywords to photos or documents through a platform (like Amazon Mechanical Turk or Clickworker). If so, you are an electronically mediated worker. That’s a term BLS uses to identify people who do short jobs or tasks they find through websites or mobile apps that connect them with customers and arrange payment for the tasks. Have you ever wondered how many people do this kind of work?

BLS decided to find out. In the May 2017 Contingent Worker Supplement to the Current Population Survey, we asked people four new questions designed to measure electronically mediated employment.

Measuring electronically mediated work is difficult

After studying respondents’ answers to the new questions and other information we collected about them, we realized the new questions didn’t work as intended. Most people who responded “yes” to the questions clearly had not found their work through a website or app. For example, a vice president of a major bank, a local police officer, and a surgeon at a large hospital all said they had done electronically mediated work on their main job. Many people seemed to think we were asking whether they used a computer or mobile app on their job. That could apply to many jobs that aren’t electronically mediated.

But it wasn’t all for naught. After extensive evaluation, we concluded we could use the other information in the survey about respondents’ jobs to identify and recode erroneous answers. That allowed us to produce meaningful estimates of electronically mediated employment.

So, who does electronically mediated work?

Based on our recoded data, we found that 1.6 million people did electronically mediated work in May 2017. These workers accounted for 1.0 percent of total employment. Compared with workers overall, electronically mediated workers were more likely to be ages 25 to 54 and less likely to be age 55 or older. Electronically mediated workers also were slightly more likely to be Black, and slightly less likely to be White, than workers in general. In addition, electronically mediated workers were more likely than workers overall to work part time (28 percent versus 18 percent).

Workers in the transportation and utilities industry were the most likely to have done electronically mediated work, with 5 percent of workers in this industry having done such work. Self-employed workers were more likely than wage and salary workers to do electronically mediated work (4 percent versus 1 percent).

What’s next?

We currently don’t have plans to collect information on electronically mediated work again. And even if we did, we wouldn’t want to use the same four questions. At the least, we would need to substantially revise the questions so they are easier for people to understand and answer correctly.

Taking a broader look, we are working with the Committee on National Statistics to learn more about what we should measure if we field the survey again. The committee is a federally supported independent organization whose mission is to improve the statistical methods and information on which public policies are based.

How can I get more information?

The data are available on our website, along with an article that details how we developed the questions, evaluated the responses, recoded erroneous answers, and analyzed the final estimates.

If you have a specific question, you might find it answered in our Frequently Asked Questions. Or you can contact our staff.

A Clearer Look at Response Rates in BLS Surveys

People know BLS for our high-quality data on employment, unemployment, price trends, pay and benefits, workplace safety, productivity, and other topics. We strive to be transparent in how we produce those data. We provide detailed information on our methods for collecting and publishing the data. This allows businesses, policymakers, workers, jobseekers, students, investors, and others to make informed decisions about how to use and interpret the data.

We couldn’t produce any of these statistics without the generous cooperation of the people and businesses who voluntarily respond to our surveys. We are so grateful for the public service they provide.

To improve transparency about the quality of our data, we recently added a new webpage on response rates to our surveys and programs. We previously published response rates for many of our surveys in different places on our website. Until now there hasn’t been a way to view those response rates together in one location.

What is a response rate, and why should I care?

A response rate is the percentage of eligible sample units that completed the survey. To calculate it, we start with the total number of people, households, or businesses we tried to survey (the sample) and exclude those that weren't eligible (for example, houses that were vacant or businesses that had closed). Response rates are an important quality measure for survey data. High response rates mean most of the sample completed the survey, and we can be confident the statistics represent the target population. Low response rates mean the opposite, and data users may want to consider other sources of information.
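
As a minimal sketch (not BLS production code, and with hypothetical counts), the calculation looks like this:

```python
def response_rate(completed: int, sampled: int, ineligible: int) -> float:
    """Percent of eligible sample units that completed the survey."""
    eligible = sampled - ineligible
    return 100 * completed / eligible

# Hypothetical example: 10,000 units sampled, 500 found ineligible
# (vacant houses or closed businesses), 6,650 completed responses.
print(f"{response_rate(6_650, 10_000, 500):.1f}%")  # 70.0%
```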

Do response rates tell the whole story?

A low response rate may mean the data don't represent the target population well, but not necessarily. The error that arises when respondents differ systematically from the people who don't respond is called nonresponse bias. Some important research by Robert M. Groves and Emilia Peytcheva, published in the January 2008 issue of Public Opinion Quarterly, looked at the connection between response rates and nonresponse bias in 59 studies. The authors found that high response rates can reduce the risk of bias, but there is not a strong correlation between response rate and nonresponse bias. Some surveys had a very low response rate but did not have evidence of high nonresponse bias. Other surveys had high nonresponse bias despite high response rates.
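
A common textbook approximation makes the point: the bias in a respondent-based estimate depends on both the nonresponse rate and how different respondents are from nonrespondents. The sketch below uses invented numbers, not figures from the Groves and Peytcheva study:

```python
def nonresponse_bias(resp_mean: float, nonresp_mean: float,
                     response_rate: float) -> float:
    """Approximate bias of the respondent mean: (1 - response rate) times the
    difference between respondent and nonrespondent means."""
    return (1 - response_rate) * (resp_mean - nonresp_mean)

# A low response rate is not very damaging when respondents and nonrespondents
# look alike, while a high response rate does not guarantee small bias.
print(round(nonresponse_bias(52.0, 51.5, response_rate=0.40), 2))  # 0.3
print(round(nonresponse_bias(52.0, 44.0, response_rate=0.80), 2))  # 1.6
```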

This means we should look at response rates with other measures of data quality and bias. BLS has studied nonresponse bias for many years. We have links to many of those studies in our library of statistical working papers.

What should I be looking for on the new page?

With response rates from multiple surveys in a single place, you can look for patterns across surveys and across time. For example, across every graph we see that response rates are declining over time. This is happening for nearly all surveys, government and private, on economic and other topics. It is simply getting harder to persuade respondents to answer our surveys.

It is also interesting to compare response rates across individual BLS surveys. We see that some surveys have higher response rates than others. To understand why, it helps to look at the differences between the surveys. Each survey has specific collection procedures that affect response rates. For example, the high response rate for the Annual Refiling Survey (shown as ARS in the second chart) may catch your eye. When you see that it has a 12-month collection period and is mandatory in 26 states, the rate makes more sense.

We also can see how survey-specific changes have affected a survey’s response rate. For example, we see a drop in the response rate for the Telephone Point of Purchase Survey around 2012. This drop likely resulted from a change in sample design, as the survey moved from a sample of landline telephones to a dual-frame sample with both landlines and cell phones. Because the response rate for this survey continues to decline, we are developing a different approach for collecting the needed data.

What should I know before jumping into the new page?

There’s a lot of information! We’ve tried to make it as user friendly as possible, including a glossary page with definitions of terms and a page that shows how each survey calculates its response rate. On the graphs, you can isolate a single survey by hovering over its line. You can also download the data shown in each graph to examine it more closely.

We hope you will find this page helpful for understanding the quality of BLS data. Please let us know how you like it!

Ensuring Gold-Standard Data in the Eye of a Storm

“Hurricanes Harvey, Irma and Maria were the most notable storms of 2017, leaving paths of death and destruction in their wake.”
Colorado State University’s Tropical Meteorology Project 2017 summary report

Colorado State University’s Tropical Meteorology Project forecasts (as of May 31) that the 2018 hurricane season will be about average, with 13 named storms, 6 hurricanes, and 2 major hurricanes expected. Is BLS ready?

How does BLS deal with hurricanes?

Since June starts hurricane season, we want to share with you one example of how last year’s storms affected our data. We present a case study using our national employment survey, the Current Employment Statistics program. This program provides monthly estimates we publish in The Employment Situation—sometimes called the “jobs report.”

We have procedures in place for natural disasters. Below we highlight some of the challenges last year’s storms created and how we handled them. We do everything possible to provide you with gold-standard data to help you make smart decisions!

2017 Hurricane Destruction

Two major hurricanes—Harvey and Irma—blasted the U.S. mainland in August and September 2017. Hurricane Maria devastated Puerto Rico and the U.S. Virgin Islands later in September.

  • Harvey first made landfall in Texas on August 25. The Federal Emergency Management Agency (FEMA) declared 39 Texas counties eligible for federal disaster assistance after Harvey. Harvey also caused heavy damage in Louisiana.
  • Irma hit the Florida Keys on September 10 and then later hit Florida’s southern coast. FEMA declared 48 Florida counties eligible for federal disaster assistance. Before Irma hit the lower Florida Keys, the hurricane already had caused severe damage in St. Thomas and St. John in the U.S. Virgin Islands and in Puerto Rico.
  • Hurricane Maria made landfall in St. Croix in the U.S. Virgin Islands and in Puerto Rico on Wednesday, September 20, causing catastrophic damage. These areas already had suffered damage from Hurricane Irma earlier in the month.

Some things to know about the monthly employment survey

The monthly employment survey is a sample of nonfarm businesses and government agencies. The reference period is the pay period that includes the 12th of the month. The sample has just over 23,000 active reporting units in the disaster areas, representing about 6 percent of the entire active sample.

What does it mean to be employed? If the employer pays someone for any part of the reference pay period, that person is counted as employed.

How did BLS collect data in these disaster areas?

Our biggest challenge is to collect representative sample data so we can publish high-quality estimates. In the “old days,” the survey was a mail survey (yes, I mean snail mail), but no more! Now we collect data electronically by several different methods. These are the most common:

  • About half the total sample uses electronic data interchange. That’s a centralized electronic data reporting system for multi-establishment firms. The firm provides an electronic file directly from its payroll system to BLS for all establishments included in the report. Most of the firms that report this way are located outside the hurricane-affected areas, although they may report on establishments within those areas.
  • About 23 percent of establishments use computer-assisted telephone interviews.
  • Another 16 percent report using our Internet Data Collection Facility.

Using these methods, we were able to collect data from most sampled businesses in these areas using normal procedures.

What about the emergency workers working in the disaster areas? How are they counted?

  • We count emergency workers where their employer is located, not where they are working.
  • We don’t count volunteers as employed because they are not paid.
  • Activated National Guard troops are considered active duty military and are outside the scope of the survey.

Did the estimation procedures change?

Once we collect the data from businesses in the affected areas, we consider whether we need to change our estimation procedures to adjust for missing data. The survey staff determined that we didn’t need to change our methods because the collection rates in the affected areas were within normal ranges.

How did the hurricanes affect national employment data for September 2017?

Hurricanes Harvey and Irma reduced the estimate of national payroll employment for September 2017. We can’t measure the effects precisely because the survey is not designed to isolate the effects of catastrophic events. National nonfarm employment changed little (+14,000) in September 2017, after increasing by an average of 189,000 per month over the prior 12 months. A steep employment decline in food services and drinking places and below-trend growth in some industries likely reflected the impact of Hurricanes Harvey and Irma.

What about Puerto Rico and the U.S. Virgin Islands?

National nonfarm employment estimates do not include Puerto Rico or the U.S. Virgin Islands.

Because of the devastation caused by Hurricanes Irma and Maria, Puerto Rico and the U.S. Virgin Islands could not conduct normal data collection for their establishment surveys. The September estimates for Puerto Rico and the Virgin Islands were delayed. The October and November estimates for the Virgin Islands also were delayed. Puerto Rico and the Virgin Islands eventually were able to produce estimates for September, October, and November 2017.

Want more information?

For more information on the impact of Harvey, Irma, and Maria, check out these pages:

What else does BLS know about hurricanes?

The Quarterly Census of Employment and Wages produces maps of businesses and employment in flood zones for states on the Atlantic and Gulf Coasts that are vulnerable to hurricanes and tropical storms. You can read more about those maps in another recent blog.

We hope the 2018 hurricane season won’t bring the loss of life and destruction of property that we saw in 2017. Regardless of what the season brings, BLS will be ready to continue providing gold-standard data about the labor market and economy.

Using Seasonally Adjusted Data or Not: a Case Study

The Current Employment Statistics survey helps us track employment trends in the economy. The headline figures, such as the 164,000 increase in payroll employment in April, are seasonally adjusted. Seasonal adjustment smooths out increases or decreases that occur around the same time each year to make it easier to see the underlying movements in the data.

Consider the construction industry, where employment varies throughout the year, often because of the weather. The chart below shows employment each month in 2017, both seasonally adjusted and not seasonally adjusted. The not seasonally adjusted level ranged from about 6.4 million to 7.2 million jobs, but it is hard to see a trend. The seasonally adjusted level was consistently between 6.8 million and 7.1 million jobs. When we remove the seasonal variation, we can see a slight increase in construction employment over the year.

Construction employment in 2017, seasonally adjusted and not seasonally adjusted

Editor’s note: Data for this chart are available in the table below.
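
BLS seasonally adjusts its published series with the X-13ARIMA-SEATS program, which is far more sophisticated than anything shown here. As a rough illustration of what removing a seasonal pattern does, the sketch below applies a simple moving-average decomposition from the statsmodels library to a synthetic monthly series invented for this example:

```python
# Illustration only: synthetic data and a simple decomposition, not BLS methodology.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly "construction employment" with a mild upward trend and a
# summer peak, in thousands of jobs.
dates = pd.date_range("2015-01-01", periods=36, freq="MS")
trend = np.linspace(6_800, 7_100, 36)
seasonal = 250 * np.sin(2 * np.pi * (dates.month - 4) / 12)
not_seasonally_adjusted = pd.Series(trend + seasonal, index=dates)

# Estimate the recurring seasonal pattern and subtract it from the series.
result = seasonal_decompose(not_seasonally_adjusted, model="additive", period=12)
seasonally_adjusted = not_seasonally_adjusted - result.seasonal

print(seasonally_adjusted.round(0).tail())  # seasonal swings removed; trend remains
```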

While seasonally adjusted data help us see long-term trends, there are times when short-term trends can provide some insight. One example is holiday-season hiring. Certain industries, such as retail trade and parcel delivery services, ramp up hiring in the fall to prepare for increased business during the holiday season. We can see this holiday-related employment buildup with data that are not seasonally adjusted. For example, employment in selected retail trade industries increased by 609,000 from October to December 2017, less than the 650,000 jobs gained over the same months of 2016.

Note: Selected retail trade industries include furniture and home furnishings stores; electronics and appliance stores; health and personal care stores; clothing and clothing accessories stores; sporting goods, hobby, book and music stores; general merchandise stores; miscellaneous store retailers; and nonstore retailers.

Seasonal holiday employment buildup in selected retail trade industries, 2012–17 (not seasonally adjusted)

Editor’s note: Data for this chart are available in the table below.

We have to be careful when we use data that are not seasonally adjusted. For example, sometimes there are 4 weeks between monthly surveys and sometimes there are 5 weeks. Seasonal adjustment accounts for these differences. When using not seasonally adjusted data, users must be aware that an extra week between surveys can exaggerate seasonal employment increases or decreases. For example, in 2017, there were 5 weeks between surveys in November, just as there were in 2012 and 2013.

Looking across the October-to-December period, the seasonal employment buildup in retail trade slowed each year following a large increase from 2012 to 2013. In each of the next four holiday seasons, job gains over the 3-month (13-week) period were less than the prior year. But 2017 included some anomalies – a strong November (72 percent of the seasonal total), followed by a weak December (7 percent of the seasonal total).

Share of seasonal holiday employment buildup in each month, selected retail trade industries, 2012–17 (not seasonally adjusted)

Editor’s note: Data for this chart are available in the table below.
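
The shares in the chart above are simply each month's job gain divided by the 3-month total. Here is the 2017 arithmetic, using the not seasonally adjusted figures from the tables at the end of this post:

```python
# 2017 holiday buildup in selected retail trade industries, from the table below.
monthly_gains = {"October": 128_000, "November": 438_000, "December": 43_000}
seasonal_total = sum(monthly_gains.values())  # 609,000 jobs

for month, gain in monthly_gains.items():
    print(f"{month}: {100 * gain / seasonal_total:.1f}% of the seasonal buildup")
# October: 21.0%, November: 71.9%, December: 7.1%
```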

Examining the not seasonally adjusted data may provide some insights into changing hiring patterns, especially in seasonal industries. The 2017 retail trade data suggest declining holiday employment buildup but also earlier holiday employment buildup. Will this pattern continue? We’ll know more when Current Employment Statistics data come out later this year.

We can analyze other industries with seasonal patterns in a similar way. One industry is transportation, and specifically couriers and messengers, which includes parcel delivery services. As the trend in online shopping continues, employment in parcel delivery services has increased, especially during the holiday season. Other seasonal industries include ski resorts in the winter, gardening shops in the spring, and amusement parks in the summer. We can also use not seasonally adjusted data to look at layoff patterns in seasonal industries, such as certain retail industries after the holiday season.

All these data are available from the Current Employment Statistics program.

Construction employment in 2017
Month | Seasonally adjusted | Not seasonally adjusted
Jan | 6,873,000 | 6,459,000
Feb | 6,919,000 | 6,527,000
Mar | 6,922,000 | 6,634,000
Apr | 6,917,000 | 6,816,000
May | 6,924,000 | 6,990,000
Jun | 6,940,000 | 7,157,000
Jul | 6,934,000 | 7,197,000
Aug | 6,962,000 | 7,228,000
Sep | 6,971,000 | 7,177,000
Oct | 6,988,000 | 7,182,000
Nov | 7,030,000 | 7,117,000
Dec | 7,072,000 | 6,970,000

Seasonal holiday employment buildup in selected retail trade industries, 2012–17 (not seasonally adjusted)
Year | October | November | December
2012 | 132,000 | 456,000 | 103,000
2013 | 142,000 | 435,000 | 184,000
2014 | 169,000 | 392,000 | 158,000
2015 | 175,000 | 389,000 | 127,000
2016 | 148,000 | 358,000 | 144,000
2017 | 128,000 | 438,000 | 43,000

Share of seasonal holiday employment buildup in each month, selected retail trade industries, 2012–17 (not seasonally adjusted)
Year | October | November | December
2012 | 19.1% | 66.0% | 14.9%
2013 | 18.7% | 57.2% | 24.2%
2014 | 23.5% | 54.5% | 22.0%
2015 | 25.3% | 56.3% | 18.4%
2016 | 22.8% | 55.1% | 22.2%
2017 | 21.0% | 71.9% | 7.1%