Are there really 100s of thousands of unfilled positions in tech or is that an urban legend

(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: lawrence@krubner.com, or follow me on Twitter.

Nikema wrote:

Are there really 100s of thousands of unfilled positions in tech or is that an urban legend? There’s a lot of gatekeeping bullshit for an industry that claims to need new people. Also, if that’s the case what’s stopping companies from providing training and financial support…

In 1998 I had an older friend, an experienced developer, who did freelance work for $125 an hour. Adjusting for inflation, that's almost $240 today. Today I know senior devs who get $125 an hour, but adjusting for inflation, that's roughly a 50% cut. Is demand high? Where are the wage hikes?
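The inflation adjustment above is easy to check. A minimal sketch, assuming approximate annual-average CPI-U values (163.0 for 1998 and roughly 313 for 2024; the exact result depends on which CPI series and comparison year you pick):

```python
def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Convert a past dollar amount into today's dollars."""
    return amount * cpi_now / cpi_then

# Assumed CPI-U annual averages (approximate).
CPI_1998 = 163.0
CPI_NOW = 313.0  # roughly the 2024 annual average

rate_1998 = 125.0  # $/hour freelance rate in 1998
rate_1998_today = adjust_for_inflation(rate_1998, CPI_1998, CPI_NOW)
print(f"$125/hr in 1998 is about ${rate_1998_today:.0f}/hr today")

# A senior dev earning $125/hr today has taken a real cut of:
real_cut = 1 - 125.0 / rate_1998_today
print(f"real pay cut: {real_cut:.0%}")
```

With these assumed CPI figures the 1998 rate comes out to about $240 in today's dollars, so earning a nominal $125 today is a real cut of just under 50%, matching the claim in the text.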

At the high end the situation is complicated. I have friends who are devops experts who charge $200 an hour, but there were Oracle experts who were charging $200 an hour in 1995. I’m not seeing the big wage hikes I’d expect in an industry that is suffering a shortage of skills.

What’s sometimes overlooked is that in 1990 most programming jobs were tied to industrial centers, so deindustrialization cost tens of thousands of dev jobs in places like Ohio, Indiana, and Illinois. The growth of jobs for the Web and mobile apps is partly offset by the loss of so many older computer programming jobs.

Outsourcing has also exerted major downward pressure on wages for the last 25 years. Businesses in the middle of the country can now send their accounting grunt work to India or Vietnam. Programming jobs are concentrating into a few strongholds, especially California and New York.

Some argue that it is easier to get started as a developer nowadays, thanks to dev bootcamps, Stack Overflow, YouTube, Google, Coursera, etc. That’s a different way of making the same point: whether you say supply is ample, demand is lacking, or getting started is easier, the result is the same. The point, as Nikema said, is that this industry claims it faces a skills shortage, but all the evidence suggests otherwise.

Also, is it true that it is easier to start now? “Ease of getting started” was huge in the 1990s. Microsoft did a great job with Visual Basic – tens of thousands of people learned to build forms for databases. And building dynamic websites simply meant learning basic PHP. There was no git back then, no Docker, no frameworks, and you didn’t have to learn the CLI. Everything was much more basic, it was much easier to build a successful Web company, and there was less competition.

The growing complexity heaped on new software developers, much of which seems unnecessary, fits Nikema’s point that “There’s a lot of gatekeeping bullshit.”

Someone replied to Nikema that demand for developers was higher now. That is mostly incorrect. Read this government report about the trends for all IT jobs.

The technology boom of the 1990s led to a steep increase in IT workers. The number of IT workers more than doubled to 3.4 million between 1990 and 2000 (2.5 percent of the labor force). Careers involving computers expanded into new areas. After … the technology bubble burst in 2000, causing the demand for IT workers to increase at a slower pace than in previous decades. By 2014, 4.6 million IT workers accounted for 2.9 percent of the labor force

This chart is from the report:

You can see there was significant job growth in the 70s, 80s, and 90s, followed by stagnation after 2000, then a recent uptick. The issue is complicated because the BLS groups many occupations into the broad category of IT. Still, you can see that during the 80s and 90s the USA became a digital society, whereas things have mostly stagnated since then. The IT category grew from 2.5% to 2.9% of the labor force over 14 years that include the Great Recession and the loss of many industrial jobs. Does that mean demand is greater now? What’s clear is that growth is very slow now compared to the 80s and 90s. IT jobs more than doubled in the 90s; if we had another decade like that, then obviously wages would rise dramatically, even if the barriers to entry are lower now than before.

As to the uptick from 2.5% to 2.9%, how much of that is the decline of other industries during the Great Recession? As a thought experiment, imagine a society with 100 million jobs, 2.5 million of them in IT. IT jobs represent 2.5% of the workforce. Now imagine the worst recession in a century strikes and the economy loses 14 million jobs, none of them in IT. The economy now has 86 million jobs, and it still has 2.5 million IT jobs, but that is now 2.9% of the total. Obviously, in real life, the USA did not lose 14% of its jobs, but it lost a lot of jobs after the crisis hit in 2008, so when you see IT go from 2.5% to 2.9% between 2000 and 2014, remember that some of that seeming uptick is because non-IT jobs were lost.
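The arithmetic of that thought experiment is easy to verify. A minimal sketch (the job counts are the hypothetical figures from the text, not real BLS data):

```python
# Hypothetical economy from the thought experiment above.
total_jobs = 100_000_000
it_jobs = 2_500_000

share_before = it_jobs / total_jobs   # 2.5% of the workforce

# The recession wipes out 14 million jobs, all of them outside IT.
jobs_lost = 14_000_000
total_after = total_jobs - jobs_lost  # 86 million jobs remain

share_after = it_jobs / total_after   # same IT jobs, bigger share

print(f"before: {share_before:.1%}, after: {share_after:.1%}")
```

With zero growth in IT jobs, the IT share of the workforce still rises from 2.5% to about 2.9% purely because the denominator shrank.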

Oddly enough, some people in that Twitter thread suggested that wages were held back because companies didn’t have enough money to hire more people. Is it meaningful to say that companies want something but can’t afford it? We all want things we are unable to pay for; that’s true for corporations and for individuals, during booms and during busts. Is that a reasonable explanation for wage stagnation? If wanting to hire more people than can be afforded counted, that would describe every startup that has ever existed.

Aside from the issue of weak demand, and general stagnation, there is also the issue of increasing monopoly in the economy, which serves to hold back wages. This is often talked about in the context of Walmart and McDonald’s, but it’s probably also happening in tech, as Microsoft and Apple and Google use up the oxygen. See:

Monopsony, Rigidity, and the Wage Puzzle

Source