If you have a small development team of a dozen people or fewer, you can probably do fine developing all of your software onshore. It might even be the right call, provided you can afford it, since at that point you might just be figuring out what your product really is, and it’s better to work with developers who sit with “the business” and can react quickly to what the business is seeing in the marketplace. Or you might need some specialized skills that just aren’t available offshore. If you’re working with the latest of the latest technologies, or something a bit arcane, you might have trouble finding that skill set offshore. Or, if you need some awareness of the business or social environment in your home country, then people at home will have a better intuitive sense of what you’re trying to accomplish. For example, if you’re building a social media product for bar-hoppers in US cities, you’re probably best off developing version 1.0 as a “Lean Startup” with 2-3 developers in a major US city, since you’re going to have a tough time explaining to offshore developers exactly what your product vision is. And if you try to explain it, they’ll most likely give you a product that doesn’t fit the bill, and tell you that “you kept changing the requirements” (and they’d probably be correct about that).
But at a certain point in your company’s life, once you are above a certain scale, you’ve most likely got to do some work offshore. Of course there’s a cost argument. Loaded costs for developers at an outsourcing shop in India are around 50% of the costs for comparable people in the US. And if you have the time and knowledge to form a “captive” offshore development shop, you might save more than that. But apart from cost, it seems there just aren’t enough software developers in the US to meet the demand anymore. There’s a contrarian point of view on this, and you can hear it from commentators like Ron Hira, who claims that there is actually no shortage of developers in the US, and that when companies go offshore they are just chasing low costs; there’s no supply issue when it comes to developer talent. In some of his other articles, Hira makes some good points about the H-1B visa program in the US which, while generally positive for the US, has been abused by some of the IT outsourcing firms to bring in undistinguished talent and pay them below US market rates. But in this case, I think Hira’s work suffers because he lumps all IT workers into one big bucket. I’m not talking about a shortage of generic IT workers. If you need to hire highly skilled software developers, who are current in the latest web and mobile technologies, and who are the best in the world at what they do, the data show that there is a shortage in the US.
The Wall Street Journal recently reported that there are over 150,000 openings for software developers in the US right now. Intuitively that sounds a bit high to me, but there’s no doubt that there are major shortages in some key areas, such as mobile development—who isn’t thinking about their mobile strategy these days? And in certain regions of the US, there just aren’t enough people engaged in software development, so hiring even a few experienced Java developers turns out to be quite difficult. If you’re looking for these people in quantity in the US, and your company is not called Facebook or Google, you might be fighting a losing battle. So you might be better off offshore, where you can raise your wages a bit above market and attract some strong talent.
Is that the future of work in the US? Is all of the software going to be built offshore, while those of us living in the US will all be “idea people” (and burger flippers and retail clerks, I suppose)? Do we need to encourage more American kids to study “hard” subjects like science, engineering and math as their college majors, in order to make us more competitive as a nation? One would think that the market would signal the need for more developers through rising salaries, and that more people would respond to those signals by studying disciplines like computer engineering and computer science in college. Good luck with that. The problem is that the dogs (students?) just aren’t eating that dog food. A recent study reported in the NY Times found that students drop out of these “hard” subjects in droves. It’s not exactly clear why this is the case, but it seems to have something to do with these courses of study being something like the Bataan death march. For most American kids, the prospect of a $60K+ starting salary for an entry-level developer job just doesn’t justify having a shitty undergraduate experience, just when they are spreading their wings and getting out of the house. Apparently President Obama set a goal of graduating an additional 10,000 engineers per year. But if you believe the experts (at least the one guy quoted in the Times article), that’s just not gonna happen. And even if we did reach 10,000 additional engineers per year, once you peel away the civil and chemical engineers and others who can’t help close that 150,000-person software developer gap, the gap still wouldn’t close for years.
I don’t really have a realistic “three-step plan” to solve this software developer gap in the US, and I’m not aware that anyone else does either. But I think it’s a real gap, so for the foreseeable future the only way US companies are going to get the developers they need is through a mixture of offshore and onshore software development. We might as well just admit this and figure out how to be great at managing mixed teams of onshore and offshore developers, probably using Agile-type processes. I’m not of the belief that there’s no future for US developers. Just the opposite—there’s plenty of demand for them. But there just aren’t enough developers with the skills we need onshore, so companies above a certain scale need to think offshore.