By Theo Anthony. Edited by Arjun Chandrasekar.
Overview
In many skilled careers, information technology (IT) is used far more intensively than in unskilled professions. As a result, technological change has, in recent years, raised the productivity of a large share of the workforce, and disproportionately that of skilled workers. In practice, only a decrease in the relative supply of skilled workers in a given profession, or an acceleration of technological change, is likely to shift the wages of skilled labor relative to unskilled labor.
Figure 1: Long-run labor productivity in Economically Developed Countries (EDCs) and Advanced Countries (ACs)
SBTC
Skill-Biased Technological Change, more commonly referred to as SBTC, has noticeably increased the productivity of skilled workers in the 21st century. Figure 1 illustrates how labor productivity has evolved in Economically Developed Countries (EDCs) and Advanced Countries (ACs) over the long term.
Analysis
SBTC increases a firm’s demand for skilled workers to a greater extent than its demand for unskilled workers. The substitution effect describes how technological change shifts each firm’s relative demand for skilled versus unskilled labor. If some firms experience a technological shock that raises their demand for high-skill workers, so that the demand for high-skill workers rises relative to their supply in the economy, then the wages of high-skill workers increase relative to those of low-skill workers.
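One conventional way to formalize this relative supply-and-demand logic, drawn from the standard SBTC literature rather than from this article, is a two-factor CES production function in skilled labor H and unskilled labor L. The elasticity of substitution σ and the productivity terms A_H and A_L below are illustrative notation, not estimates:

\[
  Y = \left[ (A_H H)^{\frac{\sigma - 1}{\sigma}} + (A_L L)^{\frac{\sigma - 1}{\sigma}} \right]^{\frac{\sigma}{\sigma - 1}},
  \qquad
  \frac{w_H}{w_L} = \left( \frac{A_H}{A_L} \right)^{\frac{\sigma - 1}{\sigma}} \left( \frac{H}{L} \right)^{-\frac{1}{\sigma}} .
\]

Read this way, a technological shock that raises A_H/A_L (skill-biased change) pushes the skill premium w_H/w_L up whenever σ > 1, and a fall in the relative supply H/L raises the premium for any σ, which is exactly the mechanism described above.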
Linking this back to the supply of and demand for different types of labor, the effect on the wages of high-skill workers in high-tech firms relative to workers in other firms depends on how the total demand for skilled workers changes. Historically, skilled workers who use more advanced technology in their work have earned more. A study by Krueger (1993) found that the share of workers using a computer at work rose by about 13 percentage points between 1984 and 1989, and he estimated that workers who used a computer earned 10-15% more than those who did not.
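For context, Krueger's premium comes from a cross-sectional log-wage regression with an indicator for computer use; the simplified specification below is a sketch of that approach, and the control vector X_i (education, experience, and so on) is shorthand rather than his exact variable list:

\[
  \ln w_i = \alpha + \beta \,\mathrm{Computer}_i + \gamma' X_i + \varepsilon_i ,
\]

where Computer_i equals one if worker i uses a computer at work, and an estimated β of roughly 0.10 to 0.15 corresponds to the 10-15% premium cited above.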
Figure 2: Relative size of the IT sector and timeline of key events in the development of personal computers
Figure 2 quantifies recent technological change by measuring the relative size of the IT sector within the overall economy. It also presents a timeline of key events associated with the development of personal computers, plotted along with two simple measures of the extent of computer-related technological change. Although electronic computing devices were developed during World War II, and the Apple II was released in 1977, many observers date the beginning of the “computer revolution” to the introduction of the IBM-PC in 1981.
The use of the internet grew very rapidly after the introduction of Netscape’s Navigator browser in 1994: the number of internet hosts rose from about 1 million in 1992 to 20 million in 1997, and to 100 million in 2000. Qualitative information on the timing of technological change is potentially helpful in drawing connections between specific innovations and changes in wage inequality. For example, the sharp rise in wage inequality between 1980 and 1985 points to technological innovations that occurred very early in the computer revolution as the key skill-biased events.
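As a rough check on how fast that diffusion was, the host counts quoted above imply compound annual growth rates (a back-of-the-envelope calculation, not a figure from the article) of

\[
  \left(\tfrac{20}{1}\right)^{1/5} - 1 \approx 0.82
  \qquad\text{and}\qquad
  \left(\tfrac{100}{20}\right)^{1/3} - 1 \approx 0.71 ,
\]

that is, roughly 82% per year from 1992 to 1997 and about 71% per year from 1997 to 2000.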
This is the end of part 1 of this article, and part 2 can be found here!