
Technology gains hard to analyze

by Adam M. Zaretsky

for the Business Journal

For many Americans, the proliferation of the personal computer has transformed the workplace more than any other innovation.

Over the past 15 years or so, this transformation, sometimes called the "Information Revolution," has caused many firms to rethink their organizational structures and management procedures.

The Information Revolution is also commonly credited with creating huge gains in workplace productivity, which, in turn, have led to higher wages.

The catch, though, is that these huge gains in productivity have not shown up in the national data. Rather, year-over-year gains in overall productivity, measured as output per hour of all persons in the business sector, have failed to suggest that anything unique was occurring in the workplace during this business expansion relative to previous expansions.

In 1996 and 1997, for example, output per hour increased about 2 percent a year; in 1995, it declined 0.1 percent. In fact, since the end of the 1990-91 recession, productivity growth has so far peaked at 3.3 percent in 1992. After the 1981-82 recession, productivity growth peaked at 3.2 percent in 1983. In the 1950s and '60s, in contrast, growth rates above 4 percent were quite common.
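To make the measure being discussed concrete, the short Python sketch below computes output per hour for the business sector and its year-over-year growth rate. The output and hours figures are made-up round numbers chosen for illustration, not the actual government data cited in the article.

    # Hypothetical figures for illustration only -- not actual published data.
    # Productivity here is output per hour of all persons in the business sector;
    # growth is the year-over-year percentage change in that ratio.
    output = {1995: 6_100.0, 1996: 6_340.0, 1997: 6_620.0}   # real output
    hours = {1995: 1_860.0, 1996: 1_895.0, 1997: 1_940.0}    # total hours worked

    productivity = {yr: output[yr] / hours[yr] for yr in output}

    years = sorted(productivity)
    for prev, curr in zip(years, years[1:]):
        growth = (productivity[curr] / productivity[prev] - 1) * 100
        print(f"{curr}: output per hour grew {growth:.1f}% over {prev}")

With these assumed numbers, the sketch prints growth of about 2 percent a year, roughly the pace described above.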

The missing pieces.

The absence of huge productivity gains has created what economists call the productivity paradox. Basically, the paradox is that the official statistics have not borne out the productivity improvements expected from new technology.

The United States is not unique in this respect. As economists Erwin Diewert and Kevin Fox have observed, there has been a "measured productivity slowdown in industrialized countries in the last 25 years, the very time when we would have expected to see large increases in productivity growth due to rapid technological change."

So, what happened?

Part of the explanation is that there have been payoffs to firms from computer investment, but these payoffs are hard to measure.

For example, Zvi Griliches has argued that in measuring productivity, "(computer) investment has gone into our 'unmeasurable sectors,' and thus its productivity effects, which are likely to be quite real, are largely invisible in the data."

A different argument, advanced by Paul David, compares the onset of computers with the advent of electricity, which took about 40 years before its impact on productivity was observed.

Jack Triplett counters that the rapid fall in the price of computing power indicates a different diffusion process for computers than for electrification, making the analogy weak. He writes: "In the computer diffusion process, the initial applications supplanted older technologies for computing. Water and steam power long survived the introduction of electricity; but old pre-computer-age devices for doing calculations disappeared long ago."

Another explanation for the paradox, put forth by David Romer in 1988, is that, because investment's share of the gross domestic product is relatively small, large changes in investment translate into only small changes in output. And since computers represent a modest part of total investment, huge increases in computer investment result in only meager increases in measured output and, hence, measured productivity.
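The point is essentially arithmetic: a component's contribution to overall growth is roughly its share of output times its own growth rate. The back-of-the-envelope sketch below uses an assumed computer share and an assumed investment surge (neither figure comes from the article) to show how a large jump in computer investment can still move total output very little.

    # Share-weighted contribution argument; all numbers are assumptions.
    computer_share_of_gdp = 0.02      # computers: a small slice of total output
    computer_invest_growth = 0.30     # a 30% surge in computer investment

    # Approximate contribution to overall output growth:
    contribution = computer_share_of_gdp * computer_invest_growth
    print(f"Contribution to GDP growth: about {contribution * 100:.1f} percentage points")
    # -> roughly 0.6 percentage points, despite the 30% surge in computer spending.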

Diewert and Fox believe this analysis doesn't work for computers because they are inherently different from other types of capital. They write: "(Computers) can be used to control other capital (and labor), so that the other capital (and labor) is used more efficiently, for example, the management of a warehouse or coordinating the movement of trucks and airplanes."

In other words, computers may actually substitute for other capital, including human capital, thereby replacing, rather than adding to, some of the productivity gains.

Is this where it's been hiding?

The productivity paradox has not affected all sectors of the economy, though. U.S. manufacturing, for instance, has experienced relatively strong annual productivity growth over the past few years. In fact, output per hour in this sector has grown more than 4 percent a year since 1995, a sustained rate of increase unequaled since the end of World War II.

Perhaps, then, the expected productivity gains are more isolated than anticipated, occurring mostly in those sectors that are extremely capital-intensive, like manufacturing. What hasn't accompanied this relatively strong growth in manufacturing productivity, however, is a commensurate increase in real wages.

While output per hour has been growing at more than 4 percent a year, real compensation per hour at manufacturing firms has been growing at less than 1 percent a year. It wasn't until the beginning of 1998 that year-over-year growth in real compensation per hour spiked up to almost 4 percent.

This leads to yet another conundrum, since traditional labor market theory predicts that productivity gains should drive wage increases. Why? Because theory says workers should receive a wage that exactly compensates them for their added value to total output, otherwise known as their marginal revenue product.

This is calculated by determining how much output workers can produce in an hour (their marginal product) and then figuring out how much extra revenue that output will bring the firm (its marginal revenue), hence the term marginal revenue product. The chain of events, then, would be: investment in computers leads to increases in output per hour (higher productivity), which, in turn, leads to higher wages.
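A minimal sketch of that calculation, using hypothetical numbers (the units-per-hour and revenue-per-unit figures below are assumptions for illustration, not data from the article):

    # Marginal revenue product: the extra revenue one more hour of labor brings in.
    marginal_product = 12        # extra units of output produced per added hour
    marginal_revenue = 2.50      # extra revenue per unit sold, in dollars

    # Theory predicts the hourly wage should settle at the marginal revenue product.
    marginal_revenue_product = marginal_product * marginal_revenue
    print(f"Predicted hourly wage: ${marginal_revenue_product:.2f}")   # $30.00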

For the U.S. manufacturing sector, the chain appears to be holding, although the last link seems to be weaker than the first. Perhaps, then, investment in computers and information technology has had other, not as easily observed, outcomes.

Hidden consequences.

This is exactly the proposition that David Autor, Lawrence Katz and Alan Krueger examined. In their 1997 article, they argued that the rapid spread of computer technology in the workplace may explain as much as 30 percent to 50 percent of the increase in the growth rate of demand for more-skilled workers since 1970.

The three economists found that the demand for college-level workers grew more rapidly on average from 1970 to 1995 than from 1940 to 1970. This increased demand was initially met with a sufficient supply of college-educated workers. That supply slowed at the beginning of the 1980s, however, eventually causing a shortage that led to a widening of the wage gap between those with and without college degrees.

An even more striking finding by the authors was that industries displaying the largest increases in skill requirements (legal services, advertising and public administration, for example) were the biggest users of computers. Relative to other industries, these have exhibited greater growth in employee computer use and more capital investment in computers, both per worker and as a share of total investment.

In addition, these high computer-use sectors appear to have reorganized their workplaces in a manner that disproportionately employs more educated and higher paid workers.

But simply reorganizing the workplace to accommodate higher-skilled workers isn't the end of the story. How firms reorganize is also important. In a 1997 article, Sandra Black and Lisa Lynch asserted that manufacturing firms that gave workers a significant decision-making role were markedly more productive than firms that did not.

Black and Lynch also found that productivity was higher in plants in which a high proportion of nonmanagerial workers used computers, and in plants where workers had high average levels of education.

The authors also noted, however, that not all restructuring plans are equal; for example, neither profit-sharing plans designated exclusively for managers nor total quality management programs did anything to improve plant productivity. Profit-sharing plans that included all workers, though, did improve productivity. Black and Lynch's bottom line, then, is that practices that encourage workers to think and interact to improve the production process are strongly linked to increased productivity.

All told, the recent research offers various explanations that support the belief that productivity has been increasing because of computer investment, despite what the data show.

The Information Revolution has forced many firms to reconsider their production processes more carefully, often resulting in reorganization. It has also altered their demands for labor, requiring them to recruit better-educated workers than they had previously.

Meanwhile, researchers are striving to prove that, upon closer examination, the productivity paradox is not a paradox at all, but, instead, a puzzle in which firms are putting the pieces together faster than the official statistics can.

(Adam M. Zaretsky is an economist at the Federal Reserve Bank of St. Louis. Gilberto Espinoza provided research assistance. This article first appeared in the October 1998 edition of The Regional Economist.)

