Wednesday, September 20, 2000
By Andrew Cassel
A job crisis? IT all depends
Walk into any information-technology company these days and you'll almost
certainly hear complaints about how hard it is to find qualified help.
Software programmers, network designers, system analysts - all those arcane
job categories that now dominate the help-wanted ads - there just aren't
enough of them.
If the situation doesn't change, moreover, a lack of tech workers could
choke America's chip- and software-led economic boom.
Or so they say.
Prescriptions for meeting this looming crisis range from more
computer-science courses in colleges and high schools to new rules to
encourage immigration by foreign-trained technology workers. But Peter
Cappelli, a Wharton School management professor, thinks employers can
find a solution closer to home.
Cappelli, a human-resources specialist, examined the high-tech employment
issue recently for the consulting firm McKinsey & Co. He found, first, great
disagreement over whether there is any problem at all.
Is there a shortage?
Recent studies have found technology worker "shortages" ranging from more
than a half-million to zero. "Industry representatives argue that there is a
labor shortage, while outside experts and labor-market analysts see, at
most, a temporarily tight labor market," Cappelli writes.
Why the discrepancy? Partly because information technology is different
from making widgets or raising hogs. And partly because it's not. Most
industries live with some variation of what Iowa farmers call the "hog
cycle" - not a Harley-Davidson, but the up-and-down nature of supply and
demand. When pork prices go up, farmers raise more hogs; when they drop,
they raise fewer. But it takes time for little piggies to get to market,
so supply and demand never quite even out.
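Economists call this lagged back-and-forth the cobweb model: producers set next season's supply based on this season's price, so the market keeps overshooting. A minimal sketch, with purely illustrative numbers (none come from the column):

```python
# Cobweb-model sketch of the "hog cycle": supply responds to LAST
# period's price, so price and supply oscillate instead of settling.
# Demand and supply parameters below are illustrative assumptions.

def hog_cycle(periods=8, price=60.0):
    a, b = 100.0, 1.0   # demand curve: quantity demanded = a - b * price
    c, d = 20.0, 1.0    # supply curve: quantity supplied = c + d * last price
    history = []
    for _ in range(periods):
        supply = c + d * price      # farmers react to the previous price
        price = (a - supply) / b    # market clears at whatever price moves it
        history.append(round(price, 1))
    return history

print(hog_cycle())
```

With these numbers the price ping-pongs between a high and a low forever; making producers less responsive (a smaller `d`) damps the swings toward equilibrium, but as long as production takes time, supply and demand never line up exactly.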
Similarly, rising demand for computer geeks makes salaries go up, which
attracts more students to the field. But training takes several years, and
when the new crop of brains is ready, demand already might be in decline.
This is, in fact, what happened in the 1980s, Cappelli writes.
Technology salaries fell in real terms from about 1985 through the early
'90s, causing interest in the field to drop just as demand for workers
was about to surge again.
Unlike hog farming, however, the skills technology employers need change
rapidly. Skills that were prized in the 1980s - knowledge of the DOS
operating system, for instance - were less useful in the Internet era.
Experience became a
liability rather than an asset, creating problems both for older workers
and for the employers trying to replace them.
A retention problem
Indeed, much of the "shortage" of workers in technology businesses is
really a problem of high turnover, according to Cappelli. One survey
found only 19 percent of computer science grads still working in the
field 20 years later. For all the dot-coms luring recruits with stock
options and new BMWs, the industry has trouble keeping workers.
Why? Bad management, Cappelli argues. "Aside from pay, many IT jobs, but
especially computer-programming jobs, would qualify as lousy jobs," he
writes. Programmers are planted in cubicles, given pieces of large
projects to work on, and then largely ignored. Training, motivation and
rewards for performance are often low priorities.
Constant recruiting often serves as a substitute for retention, but even
there many firms are inept, Cappelli says. "Most employers don't know
how to hire the right people," he writes. It's not all their fault;
since computer skills are so new, it's tough to screen applicants.
As a result, companies are "quite probably underpaying their best
applicants - dramatically so - and overpaying their poor ones," he
notes. "They just cannot tell which is which."