Programming Jobs Losing Luster in U.S.

Filed under: News

When Andrew Mo arrived as an eager freshman in the fall of 2001, his career trajectory seemed preordained: He'd learn C++ and Java while earning a computer science degree at Stanford University, then land a Silicon Valley technology job.

The 22-year-old Shanghai native graduated this month with a major in computer science and a minor in economics. But he no longer plans to write code for a living, or even work at a tech company.

Mo begins work in the fall as a management consultant with The Boston Consulting Group, helping to lead projects at multinational companies. Consulting, he says, will insulate him from the offshore outsourcing that's sending thousands of once-desirable computer programming jobs overseas.

More important, Mo believes his consulting gig is more lucrative, rewarding and imaginative than a traditional tech job. He characterized his summer programming internships as "too focused or localized, even meaningless."

"A consulting job injects you into companies at a higher level," he said. "You don't feel like you're doing basic stuff."

Mo's decision to reboot his nascent career reflects a subtle but potentially significant industry shift. As tens of thousands of engineering jobs migrate to developing countries, many new entrants into the U.S. work force see info tech jobs as monotonous, uncreative and easily farmed out - the equivalent of 1980s manufacturing jobs.

The research firm Gartner Inc. predicts that up to 15 percent of tech workers will drop out of the profession by 2010, not including those who retire or die. Most will leave because they can't get jobs or can get more money or job satisfaction elsewhere. Within the same period, worldwide demand for technology developers - a job category ranging from programmers to people who maintain everything from mainframes to employee laptops - is forecast to shrink by 30 percent.

Gartner researchers say most people affiliated with corporate information technology departments will assume "business-facing" roles, focused not so much on gadgets and algorithms but corporate strategy, personnel and financial analysis.

"If you're only interested in deep coding and you want to remain in your cubicle all day, there are a shrinking number of jobs for you," said Diane Morello, Gartner vice president of research. "Employers are starting to want versatilists - people who have deep experience with enterprise-wide applications and can parlay it into some larger cross-company projects out there."

Career experts say the decline of traditional tech jobs for U.S. workers isn't likely to reverse anytime soon.

The U.S. software industry lost 16 percent of its jobs from March 2001 to March 2004, the Washington-based Economic Policy Institute found. The Bureau of Labor Statistics reported that information technology industries laid off more than 7,000 American workers in the first quarter of 2005.

"Obviously the past four or five years have been really rough for tech job seekers, and that's not going to change - there are absolutely no signs that there's a huge boom about to happen where techies will get big salary hikes or there will be lots of new positions opening for them," said Allan Hoffman, the tech job expert at career site Monster.com.

Not everyone from the class of 2005 thinks programming is passe, and companies are always eager to hire Americans who can write great code - the type of work that, in recent years, produced innovations including file-sharing software at Napster and search engine tech at Google.

But even the most dedicated techies are entering the profession with less zeal than their predecessors.

The erosion of "deep code" and other technology jobs in the next decade is creating a high-stakes game of musical chairs for geeks, Silicon Valley recruiters say.

Dimming career prospects have been particularly ego-bruising for people who entered the profession during the late '90s, when employers doled out multiple job offers, generous starting salaries, and signing bonuses including stock options and Porsches.

"The current situation is getting back to the '70s and '80s, where IT workers were the basement cubicle geeks and they weren't very well off," said Matthew Moran, author of the six-month-old book "Information Technology Career Builder's Toolkit: A Complete Guide to Building Your Information Technology Career in Any Economy."

"They were making an honest living but weren't anything more than middle-class people just getting by," Moran said.

Thousands of U.S. companies have opened branches or hired contractors in India, China and Russia, transforming a cost-saving trick into a long-term business strategy. Offshoring may be a major factor eroding American students' enthusiasm for engineering careers: it taps a vast supply of low-wage labor in eastern Europe and Asia and is driving down worldwide wages.

The average computer programmer in India costs roughly $20 per hour in wages and benefits, compared to $65 per hour for an American with a comparable degree and experience, according to the consulting firm Cap Gemini Ernst & Young.

According to the most recent data from the National Science Foundation, 1.2 million of the world's 2.8 million university degrees in science and engineering in 2000 were earned by Asian students in Asian universities, with only 400,000 granted in the United States.

U.S. graduates probably shouldn't think of computer programming or chemical engineering as long-term careers but it's "not all gloom and doom," said Albert C. Gray, executive director of the National Society of Professional Engineers.

He says prospects are good for aeronautic, civil and biomedical engineers, the people who design and build artificial organs, life support devices and machines to nurture premature infants.

"In this country, we need to train our engineers to be at the leading edge," Gray said. "That's the only place there's still going to be engineering work here."

At Stanford, career experts are urging engineering and science majors to get internships and jobs outside of their comfort zones - in marketing, finance, sales and even consulting.

They suggest students develop foreign language skills to land jobs as cross-cultural project managers - the person who coordinates software development between work teams in Silicon Valley and the emerging tech hub of Bangalore, India, for example.

Stanford listed 268 job postings in its computer science jobs database in the spring quarter - roughly double the number from last year.

But that doesn't necessarily indicate a plethora of traditional tech jobs. About half of the new postings sought applicants who speak at least two languages, and many were for management-track positions, said Beverley Principal, assistant director of employment services at Stanford.

"When they're first hired at the entry level, just out of school, people can't always become a manager or team leader," Principal said. "But many employers see these people moving into management roles within two years. They need to know how to step into these roles quickly."

Associated Press

More in Tux Machines

LibreOffice Office Suite Celebrates 6 Years of Activity with LibreOffice 5.2.2

Today, September 29, 2016, Italo Vignoli from The Document Foundation informs Softpedia via an email announcement about the general availability of the second point release of the LibreOffice 5.2 open-source and cross-platform office suite. On September 28, the LibreOffice project celebrated its 6th anniversary, and what better way to celebrate than to push a new update of the popular open source and cross-platform office suite used by millions of computer users worldwide. Therefore, we would like to inform our readers about the general availability of LibreOffice 5.2.2, which comes just three weeks after the release of LibreOffice 5.2.1. "Just one day after the project 6th anniversary, The Document Foundation (TDF) announces the availability of LibreOffice 5.2.2, the second minor release of the LibreOffice 5.2 family," says Italo Vignoli. "LibreOffice 5.2.2, targeted at technology enthusiasts, early adopters and power users, provides a number of fixes over the major release announced in August." Read more

OSS Leftovers

  • But is it safe? Uncork a bottle of vintage open-source FUD
    Most of the open source questioners come from larger organisations. Banks very rarely pop up here, and governments have long been hip to using open source. Both have ancient, proprietary systems in place here and there that are finally crumbling to dust and need replacing fast. Their concerns are more often around risk management and picking the right projects. It's usually organisations whose business is dealing with actual three-dimensional objects that ask about open source. Manufacturing, industrials, oil and gas, mining, and others who have typically looked at IT as, at best, a helper for their business rather than a core product enabler. These industries are witnessing the lightning-fast injection of software into their products - that whole "Internet of Things" jag we keep hearing about. Companies here are being forced to look at both using open source in their products and shipping open source as part of their business. The technical and pricing requirements for IoT-scale software are a perfect fit for open source, especially that pricing bit. On the other end - peddling open source themselves - companies that are looking to build and sell software-driven "platforms" are finding that partners and developers are not so keen to join closed-source ecosystems. These two pulls create some weird clunking in the heads of management at these companies, who aren't used to working with a sandals-and-rainbows frame of mind. They have a scepticism born of their inexperience with open source. Let's address some of their trepidation.
  • Real business innovation begins with open practices
    To business leaders, "open source" often sounds too altruistic—and altruism is in short supply on the average balance sheet. But using and contributing to open source makes hard-nosed business sense, particularly as a way of increasing innovation. Today's firms all face increased competition and dynamic markets. Yesterday's big bang can easily become today's cautionary tale. Strategically, the only viable response to this disruption is constantly striving to serve customers better through sustained and continuous innovation. But delivering innovation is hard; the key is to embrace open and collaborative innovation across organizational walls—open innovation. Open source communities' values and practices generate open innovation, and working in open source is a practical, pragmatic way of delivering innovation. To avoid the all-too-real risk of buzzword bingo, we can consider two definitions of "innovation": creating value (that serves customer needs) to sell for a profit; or reducing what a firm pays for services.
  • This Week In Servo 79
    In the last week, we landed 96 PRs in the Servo organization’s repositories. Promise support has arrived in Servo, thanks to hard work by jdm, dati91, and mmatyas! This does not fully implement microtasks, but unblocks the uses of Promises in many places (e.g., the WebBluetooth test suite). Emilio rewrote the bindings generation code for rust-bindgen, dramatically improving the flow of the code and output generated when producing Rust bindings for C and C++ code. The TPAC WebBluetooth standards meeting talked a bit about the great progress by the team at the University of Szeged in the context of Servo.
  • Servo Web Engine Now Supports Promises, Continues Churning Along
    It's been nearly two months since last writing about Mozilla's Servo web layout engine (in early August, back when WebRender2 landed) but development has kept up and they continue enabling more features for this next-generation alternative to Gecko. The latest is that Servo now supports JavaScript promises. If you are unfamiliar with the promise support, see this guide. The latest Servo code has improvements around its Rust binding generator for C and C++ code plus other changes.
  • Riak TS for time series analysis at scale
    Until recently, doing time series analysis at scale was expensive and almost exclusively the domain of large enterprises. What made time series a hard and expensive problem to tackle? Until the advent of the NoSQL database, scaling up to meet increasing velocity and volumes of data generally meant scaling hardware vertically by adding CPUs, memory, or additional hard drives. When combined with database licensing models that charged per processor core, the cost of scaling was simply out of reach for most. Fortunately, the open source community is democratising large scale data analysis rapidly, and I am lucky enough to work at a company making contributions in this space. In my talk at All Things Open this year, I'll introduce Riak TS, a key-value database optimized to store and retrieve time series data for massive data sets, and demonstrate how to use it in conjunction with three other open source tools—Python, Pandas, and Jupyter—to build a completely open source time series analysis platform. And it doesn't take all that long.
  • Free Software Directory meeting recap for September 23rd, 2016
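
The Riak TS item above describes combining Python, Pandas, and Jupyter into a fully open source time series analysis stack. As a minimal sketch of the pandas side of that workflow (the sensor data below is synthetic, and a real pipeline would pull readings from a database such as Riak TS rather than generating them in memory):

```python
import numpy as np
import pandas as pd

# One day of synthetic per-minute "sensor" readings around 20 degrees.
idx = pd.date_range("2016-09-29", periods=1440, freq="min")
rng = np.random.default_rng(42)
readings = pd.Series(20 + rng.normal(0, 0.5, size=len(idx)), index=idx)

# Downsample to hourly averages, then smooth with a rolling mean -
# the bread-and-butter operations of time series analysis at any scale.
hourly = readings.resample("h").mean()
smoothed = hourly.rolling(window=3, min_periods=1).mean()

print(len(hourly))  # 24 hourly buckets for one day of data
```

The same resample/rolling calls work unchanged whether the series holds a day of readings or years of them; only the storage layer underneath needs to scale.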

Security News

  • security things in Linux v4.5
  • Time to Kill Security Questions—or Answer Them With Lies
    The notion of using robust, random passwords has become all but mainstream—by now anyone with an inkling of security sense knows that “password1” and “1234567” aren’t doing them any favors. But even as password security improves, there’s something even more problematic that underlies them: security questions. Last week Yahoo revealed that it had been massively hacked, with at least 500 million of its users’ data compromised by state sponsored intruders. And included in the company’s list of breached data weren’t just the usual hashed passwords and email addresses, but the security questions and answers that victims had chosen as a backup means of resetting their passwords—supposedly secret information like your favorite place to vacation or the street you grew up on. Yahoo’s data debacle highlights how those innocuous-seeming questions remain a weak link in our online authentication systems. Ask the security community about security questions, and they’ll tell you that they should be abolished—and that until they are, you should never answer them honestly. From their dangerous guessability to the difficulty of changing them after a major breach like Yahoo’s, security questions have proven to be deeply inadequate as contingency mechanisms for passwords. They’re meant to be a reliable last-ditch recovery feature: Even if you forget a complicated password, the thinking goes, you won’t forget your mother’s maiden name or the city you were born in. But by relying on factual data that was never meant to be kept secret in the first place—web and social media searches can often reveal where someone grew up or what the make of their first car was—the approach puts accounts at risk. And since your first pet’s name never changes, your answers to security questions can be instantly compromised across many digital services if they are revealed through digital snooping or a data breach.
  • LibreSSL and the latest OpenSSL security advisory
    Just a quick note that LibreSSL is not impacted by either of the issues mentioned in the latest OpenSSL security advisory - both of the issues exist in code that was added to OpenSSL in the last release, which is not present in LibreSSL.
  • Record-breaking DDoS reportedly delivered by >145k hacked cameras
    Last week, security news site KrebsOnSecurity went dark for more than 24 hours following what was believed to be a record 620 gigabit-per-second denial of service attack brought on by an ensemble of routers, security cameras, or other so-called Internet of Things devices. Now, there's word of a similar attack on a French Web host that peaked at a staggering 1.1 terabits per second, more than 60 percent bigger. The attacks were first reported on September 19 by Octave Klaba, the founder and CTO of OVH. The first one reached 1.1 Tbps while a follow-on was 901 Gbps. Then, last Friday, he reported more attacks that were in the same almost incomprehensible range. He said the distributed denial-of-service (DDoS) attacks were delivered through a collection of hacked Internet-connected cameras and digital video recorders. With each one having the ability to bombard targets with 1 Mbps to 30 Mbps, he estimated the botnet had a capacity of 1.5 Tbps. On Monday, Klaba reported that more than 6,800 new cameras had joined the botnet and said further that over the previous 48 hours the hosting service was subjected to dozens of attacks, some ranging from 100 Gbps to 800 Gbps. On Wednesday, he said more than 15,000 new devices had participated in attacks over the past 48 hours.

Android Leftovers