The Information Technology Job Market: What Has Happened and What Will Happen
Everybody rejoiced when the nerdy teenager Peter Parker, played by a gentle Tobey Maguire, blossomed quite visibly into manhood with great power and great responsibility, like a male Cinderella, in the recent Marvel Comics-based movie Spider-Man. When I was growing up in the 40s, 50s, and early 60s, I successfully fought for the notion that an awkward and physically backward boy could legitimize himself by succeeding academically. I rather got away with it, but judging from the controversy over bullying in many of our public school systems, many teens do not. Yet a new icon has appeared: a lonely but winsome teen who stays home and pounds away on his computer, developing his own kind of competitiveness out of a technical, often mainly mechanical, basically masculine curiosity.
The results of this trend vary. Some teens fear and resent authority or the notion of conventional competitiveness, so they distinguish themselves by writing and propagating viruses and worms, or by hacking into classified websites (sometimes this requires real mental gymnastics to deal with buffer overflows, packets, and bit-streams). At the other extreme, some start real businesses and become millionaires. These enterprises range from firewall companies to legitimate music review services. Not all were physical sissies like me. Shawn Fanning, who taught himself peer-to-peer computing and started Napster (and I leave aside the legal controversy over music copyrights for the moment), might have been a baseball player.
Nevertheless we have a new stereotype to ponder, the “geek.” Not that this is negative, pejorative, dilettantish, or anti-social at all. Navy officer Paul Thomasson, who challenged the military gay ban while stationed at the Pentagon, characterized himself as an all-purpose “geekolator.” A tech company in Minneapolis, The Geek Squad, boasts that its employees would rather play with computers (hardware as much as programming) on Saturday nights than go on dates. My financial planner tells me that his personal computer desktop was set up by his I.T. guy. There is something reassuring about working with technology, with things, and particularly with abstractions. For one thing, you don’t have to peddle anything to other people.
Geek culture, however, is still relatively new to the information technology world, even if that culture is now distorting it. It is useful to go back into history and look at the culture of information technology as a “profession” and to understand how it evolved. I can weigh in on this with thirty-five years’ worth of personal experience.
Perhaps the ultimate personal encounter with this kind of geek mentality occurred in the summer of 2002 when a woman asked me if I could fix the time-of-day counter on her cell phone. Without directions (“documentation”), I could not. Her phone was very different from mine. “But a systems analyst does computers,” she protested. “Systems analysts figure out how computers work and fix them.” And a cell phone is effectively a computer, a PDA perhaps, even if it’s wireless. This reminded me of an occasion twenty years before when a woman at work asked if I could change her flat tire because a man was (in her mind) supposed to do such things for a woman.
The idea of computers as the basis of a new profession probably goes back to World War II, when British and American intelligence developed enormous machines to break Nazi codes (as in the movie Enigma). Government and business quickly recognized the potential economic value of large-scale computing in defense, space, and later commerce, and the engineering work for it grew rapidly after World War II, as did programming languages. By the 1950s, businesses often had crude computer systems, sometimes EAM equipment with plugboards, to do accounting and inventory. Schools used such equipment for class schedules and grades. IBM, Burroughs, Univac, NCR, Control Data, and RCA (and especially Cray) all worked on larger computers, which at first were divided into “scientific” and “business” categories.
By the mid-1960s the “general purpose” IBM 360 series had evolved and would be answered by other vendors. Scientific computing, such as mission command and control, was really quite well developed in the 1960s. Higher-level languages like FORTRAN and COBOL came into considerable use, and jobs for programmers at both the machine and the conceptual level developed. Even COBOL, an English-like language designed for business use, has its origins in the work of Dr. Grace Hopper of the Naval Reserve.
I dove into this through summer jobs with the Navy in the 1960s. The academic background for computer programmers then was usually mathematics or electrical engineering. “Computer science” as an academic discipline and major came into being in the late 1960s, just a bit too late for me.
The winding down of Vietnam and defense cuts, followed by the oil and Mideast crises, produced a recession in the 1972-1974 period (and actually a mini-dip around 1970), but the computer field continued to grow. Major commercial employers, especially banks and insurance companies as well as manufacturers, invested heavily in bread-and-butter mainframe batch and online applications in the 1970s and early 1980s, especially in areas like accounting, inventory, and investment, as well as the health care and welfare programs usually developed by government contractors. In the earliest days defense culture heavily influenced information technology values, and some companies like EDS frankly and openly preferred to hire ex-military people. Many companies enforced strict dress codes in order to maintain a public image of professionalism.
This culture would gradually change as civilian commercial applications became more important. The growing commercialism would help IBM edge out all of its competitors in the mainframe business. By the late 1970s, having an IBM background clearly gave an advantage in the job market. Soon CICS (the major IBM teleprocessing monitor) and various databases would become important components in one’s background. People would change jobs to “get IBM” and again to get the expected hands-on experience coding CICS and various databases (especially IMS and then DB2).
Since early programming jobs tended to emphasize meticulous, almost “feminine” attention to detail and mental concentration, the field of systems analysis, the translation of business requirements (and, often enough, the specification and assembly of those requirements) into detailed procedural instructions for programmers (such as “structured English” and even pseudocode), became an informal discipline of its own. The job of systems analyst was a natural promotion for a programmer, with higher-level business responsibilities and less focus on “the trees.” Gradually, the concept of a full systems life cycle was nurtured.
Another important development, especially in the 1970s, was programmer productivity. In the earliest days of keypunched cards there was considerable emphasis on desk-checking and tedious, methodical dump analysis, since programmers often had only one “shot” a day or had to use scheduled time at night. A programmer who could not solve problems without excessive machine use could not keep a job. But in the 1970s employee terminals became more common, first at vendors other than IBM (like Univac). Some companies would have a “tube city” room, but by the 1980s most employees had their own dedicated mainframe terminals. Productivity tools for dump analysis and debugging developed and relieved much of the tedium.
A further refinement of mainframe culture in most commercial shops, by the late 1980s, was the separation of functions and formal security access. Generally, programmers would no longer be able to update production data without specific access.
All of this gradual improvement in information technology tended to lead to a new generalist job category, the “programmer-analyst.” The responsibilities for technical business requirements analysis, coding, testing, implementation, and support could be combined into one position. This saved employers money and often benefited the associate as well. A computer professional could earn a steady high income without formal movement into, or deliberate grooming for, management (often distasteful to the somewhat introverted character of many information technology people in those days).
Even so, such a person could become more valuable to the organization while still being himself or herself. Often he or she became the guru of some company-specific application in regular production and would be paid very well to stay (sometimes at the risk of not keeping up with the technical skills likely to be expected by other employers). Companies could get into positions of severe exposure if such key people left.
On the other hand, the computer professional was held to a very high standard of accuracy and dependability for his system, which would process huge volumes of data in production, often 24x7, so the idea of being on “night call” grew rapidly. With no union to represent them and no clear direction from certification bodies as to proper human resources policies, salaried programmers were often expected to put in uncompensated overtime. Availability for night support became perceived as a “moral issue” in some shops.
There were, of course, many other kinds of jobs. For example, data centers would be populated by operators who were paid hourly for shifts that ran at all times. Mainframe technical support tended to become a different kind of job, salaried but much more likely to be done off-hours. Technical support professionals (and so-called “systems programmers”) did not write their own programs; rather, they installed system software provided by major vendors and then maintained it by applying and testing patches and fixes.
The job market diversified in other areas. Consulting companies hired programmers to staff large projects at customer companies. Typical assignments ran from a few months to several years but sometimes programmers were compensated even when “on the bench,” a good time for training. Vendors would develop “asset persons,” firefighters who drove from one customer site to another.
In the late 1980s and early 1990s a major sea change began to develop in the market. Mergers (including hostile takeovers) and leveraged buyouts led to major data center consolidations, including application consolidations and then layoffs. Gradually less capable programmers would be weeded out of the market. However, new opportunities emerged that would change the whole culture of computing.
In the mainframe area companies tended to migrate from homemade or “in house” systems to purchased packages from large software vendors. Actually, packages (like CFO from IBM in the life insurance industry), often written in assembler, had been in use for years and offered quite sophisticated end-user options. Even so, shops had always spent a lot of time and money on in-house interfaces and add-ons. Newer packages, like Vantage in the life insurance business and various financial packages like MSA from Dun and Bradstreet, tended to try to “rule the world” and force customers into mastering whole new systems cultures.
These vendor packages began to affect the mainframe job market, as employers needed people with detailed expertise in them. The packages sometimes drove applications programmers into technical support areas and potentially compromised the separation-of-functions concept that businesses had developed as part of their “best practices” for security and audit.
More important were the development of the personal computer before 1980 (remember the Commodore and the Osborne, as well as the TRS-80?) and then the opening of the Internet, which (though originally conceived as a defense and academic facility) was turned loose to the public by the National Science Foundation in 1992, toward the end of the “first Bush administration.” Even before the Internet became public, the personal computer was seen as providing opportunities for scalable and cheaper applications such as retail point-of-sale.
The late 1980s saw the rapid increase in practical individual computing, with word-processing packages and then database management packages (when dBase III+ had its heyday). Smaller companies (such as a public policy think tank that I worked for) faced the choice between renting mainframe disk space and computer time, and bringing applications in-house onto personal computers that they could “control.” For a time, mainframe platforms like IBM VM tried to mimic PC-style modes of operation.
At the same time, companies tried to provide mainframe-style database technologies for the PC, as when R:Base (from Microrim) and then dBase IV (from Ashton-Tate) offered relational SQL processing. The minicomputer market (the VAX, the Silverlake/AS400, MAI Basic Four) had grown steadily into the mid-1990s, and emphasized less verbose “PC type” languages such as C and Basic, and the Unix operating system (as opposed to IBM mainframe DOS and MVS). Larger telecommunications companies (building on earlier defense-oriented experience) had already learned how to work with these operating systems and languages to build their own commercial networks when the Internet became available.
The rules of the game changed quickly as the Internet developed. Some of the changes were applicable to the mainframe world. For example, database technology gradually pulled away from specific design models (IMS and Focus for the hierarchical paradigm, IDMS and Adabas for the network paradigm) toward relational models: first DB2 for IBM, and then Sybase and Oracle on Unix-style (or Microsoft) networks. Object-oriented computing (“responsibility-driven design”) began to compete with procedural programming (“structured design”).
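To make that last contrast concrete, here is a minimal, hypothetical sketch in Java; the class and method names are my own inventions, used only for illustration. In the procedural style, a free-standing routine would walk a copybook-like record layout shared by many programs and apply the interest rules itself; in the responsibility-driven style, the object owns both its data and its rules.

    // Hypothetical example: data and the business rule live together in one object.
    public class PolicyAccount {
        private double balance;
        private double annualRate;

        public PolicyAccount(double balance, double annualRate) {
            this.balance = balance;
            this.annualRate = annualRate;
        }

        // Responsibility-driven: the object itself knows how to credit interest,
        // rather than a separate procedure reaching into a shared record.
        public void creditMonthlyInterest() {
            balance += balance * (annualRate / 12.0);
        }

        public double getBalance() {
            return balance;
        }

        public static void main(String[] args) {
            PolicyAccount account = new PolicyAccount(10000.00, 0.06);
            account.creditMonthlyInterest();
            System.out.println("New balance: " + account.getBalance());
        }
    }

In a classic structured COBOL shop, the same calculation would sit in a paragraph operating on a record defined in a shared copybook; the object-oriented version simply relocates that knowledge into the object that is responsible for it.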
During the 1990s the mainframe job market remained strong because of the attention paid to the Y2K problem. All along, there was a developing new emphasis on formatting interfaces for the end user, especially the external consumer, who would use the Internet to make purchases. This worldview required a new emphasis on scalability and portability of applications. So-called “sexy” languages like C++ and Java (and now C#, in connection with Microsoft’s .NET), as well as visually driven development packages like PowerBuilder and (most of all) Visual Basic, were much more adaptable to this scalability requirement than was mainframe-style programming in COBOL and early-90s releases of CICS. (COBOL, however, does now have a rich object-oriented facility.)
After the Y2K event was traversed with (to some hungry minds) relatively few problems, the demand for mainframe skills seemed to fall off a cliff. This seemed particularly true during the economic slowdown toward the end of 2000, well before the 9-11 tragedy. On the surface, this seemed to happen because client-server architecture is much more flexible for the new economy. But there are questions. Scalability and portability have to be weighed against the need, in the short term, for a large number of software licenses and people to support the various components.
In some conservative environments, the mainframe could still be economical. IBM has developed embedded Unix System Services within mainframe OS/390 and is in a position to offer very disciplined management of all client-server production environments with mainframe-style source and implementation control, as well as enormous database server capacity. The open architecture of modern computing has, of course, made it very vulnerable to virus, worm, and packet-sniffing attacks, a development that leads to a tremendous demand for geeks with in-depth security skills, perhaps at a pace that exceeds the ability of companies to do background checks. But even mainframe environments had their vulnerabilities, as when source and load module control was not carefully enforced.
All of this does relate to a cultural divide between older “baby boomer” professionals (like myself) and younger geeks. Okay, I over-simplify with these loaded words, but I did find the transition to client-server, at least in a support role, difficult. The knowledge base in a practical corporate environment tended to be fragmented, with different components in different languages patched together with various kinds of complicated interfaces. There was an unexpected emphasis on being able to solve problems that the programmer had never seen before, with business applications written in cryptic languages or poorly documented implementations done by consultants who were long since gone. The younger temperament favors quick learning curves and quick development, with source and scripting languages that seem more mechanical and less verbose than structured COBOL, command-level CICS, and various mainframe 4GLs. Computer magazines have sometimes predicted that older mainframe or procedural programmers won’t be able to grasp the mental connections required by object-oriented design.
Younger programmers tend to learn the newer languages on campuses as an engineering discipline. A major factor in the technical culture is that the aim of computing is changing, not only with the new emphasis on the external consumer but also with the interest in passing data in “peer-to-peer” fashion among customers or stakeholders without always monitoring all transactions from one database. The low-level C or assembler coding of “firmware,” for those devices that “Q” so proudly displayed in James Bond movies, seems to have increased relative to the market, making the emphasis on engineering more prominent.
Yet vendors are promising a very disciplined, almost mainframe-style environment for future enterprises. Sun proposes to handle almost any conceivable corporate computing problem with a complete implementation of its Java platforms, with a new emphasis on XML data, schemas, and style sheets to manipulate the content sent to the end user. Java can do practically anything, including managing threads and system performance, so there is a tendency for the intellectual disciplines of business logic and underlying system software support to merge into one seamless whole.
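As one small, hedged illustration of the XML-plus-style-sheet idea (the file names here are hypothetical, and the sketch uses only the standard javax.xml.transform API in recent Java releases): the business content stays in one XML document, and a different style sheet can be paired with it for each delivery channel without touching the business logic.

    import javax.xml.transform.Result;
    import javax.xml.transform.Source;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class RenderQuote {
        public static void main(String[] args) throws TransformerException {
            // Hypothetical files: the same quote.xml could be rendered with a
            // different style sheet for a browser, a wireless device, or print.
            Source data = new StreamSource("quote.xml");           // business content as XML
            Source styleSheet = new StreamSource("quote-web.xsl"); // presentation rules
            Result page = new StreamResult("quote.html");          // what the end user sees

            // Apply the style sheet to the data to produce the delivered content.
            Transformer transformer =
                TransformerFactory.newInstance().newTransformer(styleSheet);
            transformer.transform(data, page);
        }
    }

The point of the sketch is only the division of labor the vendors are selling: content described once in XML, presentation handled by a replaceable style sheet.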
Once again, the distinction between business logic and systems maintenance can become blurred (a big security concern), but Enterprise Java Beans are offered as a new way to implement such a separation. The “Java final solution” carries portability to its ultimate incarnation. On the other hand, Microsoft (having displaced IBM as the ruler of the world and inviting plenty of litigation in the process) offers .NET, with the ability to handle almost any language, as the ultimate solution for parsing information sent to the end user.
Part of understanding the future job market calls for a review of just what we use our information for. I took advantage of the technology boom of the mid-1990s to self-publish a huge book at low cost. But others saw technology as a way to mass-market simple products and services (the opposite of developing the rich content of books and movies). Part of the problem with the “Startup.com” phenomenon and the subsequent pre-9-11 bust was the superficiality of what was being offered to an impatient consumer.
There is a lot of emphasis, especially with .NET, on point-to-point messaging capability, as if one’s top priority were knowing the just-in-time inventory levels of a supplier five minutes ago, or having reliable wireless and mobile email communications when setting up a clandestine luncheon rendezvous (for me, at least, in Boston’s Legal Seafoods). Yet IBM, Sun, and Microsoft all propose flexible visions of computing that can support rich and educational multimedia content development as well as major business infrastructure upgrades. The most glaring examples might be the need for a new air traffic control system and for new systems in intelligence and law enforcement to connect the dots.
Where does this leave the job seeker, particularly the experienced professional, in today’s market? Between a rock and a hard place? Maybe, because the market is sputtering in so many directions as it tries to get started again. Already there is demand for mainframe programmers, but generally for very narrowly defined skill sets on relatively short-term W-2 contracts. There has been some controversy in the literature (as with the Meta Group) as to whether companies will soon need older mainframe programmers as the college campuses fail to replace those who leave and retire. In April 2002 Bob Weinstein published a “Tech Watch” syndicated column in which he predicted that companies would soon have to search for large numbers of mainframe programmers in their 50s and 60s to maintain their backbone legacy systems.[1] Whether outsourcing these systems off-shore will eliminate this need remains to be seen, especially in light of world instability. Outsourcers may not have the business knowledge and communications skills to maintain intricate legacy systems as they run in production, so some demand for older professionals who grew up in the business could continue.[2]
Sometimes sudden interest from employers may appear in specific areas, as in the summer of 2002 when carriers for Medicaid programs and other health care providers suddenly began recruiting mainframe people (at least for short-term contracts) to comply with the Health Insurance Portability and Accountability Act.[3] At the same time, the overwhelming majority of open positions on job web sites seem to be with newer “open systems” technologies.
With a profession changing gears so suddenly and unpredictably, one of the most critical issues seems to be how it should define its notion of “professionalism.” Information technology has historically been one of the least “regulated” fields for professionals. Medicine, law, financial planning, accounting, actuarial science, and engineering all have processes of certification and regulation (shared by industry certification processes and state regulation) that are far more controlled. The growing popularity of advanced degrees in information technology may prove useful in providing some organized basis for disciplined knowledge, as is the case with MBAs. Even graduate students in I.T. seem little informed about the heavily controlled processes in mainframe shops.
There have been various attempts to bring certification to the business. The Institute for Certification of Computing Professionals offers general certification that is based mostly on passing conceptual exams in areas such as business systems, systems design, software engineering, and security; the offering in specific languages is somewhat limited but growing. A company called Brainbench offers adaptive online testing and certification in many very specific programming languages and disciplines. Software vendors have for a few years offered their own “boot camps” and certifications, mostly in newer technologies.
Many of these appear to be directed at younger workers, often those without college degrees—again, going along with the idea of information technology as a “trade” as well as a “profession.” Some computer stores actually employ high-school students as desktop repair technicians and enable the students to earn A+ certifications before graduating from high school!
At the same time, in a market where the supply of candidates presently exceeds demand, employers have demanded very specific, job-ready skills from applicants. They require particular combinations of specific skills and recent experience, often with considerable practical depth and hands-on practice beyond a certification or campus course. To some candidates the behavior of employers and recruiters probably seems erratic. But as the economy gradually stabilizes into a slow-growth pattern, employers will probably be more concerned with candidates’ goals and with some kind of direction and consistency in their past careers.
All of this brings up the testy problem of career motivation. I.T. people have, for the price of eternal vigilance regarding the production business systems that they support (often including regular batch cycles), long enjoyed the relative comfort of steady growth in job demand without too much hucksterism. The trend toward client-server requires the professional to be much quicker to pick up new, non-linear skills, often on his own time and at his own expense, whereas the large mainframe applications of the past often used relatively small parts of a professional’s potential knowledge base.
I.T. people have often disdained being too closely identified publicly with the business customers or interests that they serve (and some I.T. professionals have a particular disdain for sales or business people whom they see as not doing “real jobs”[4]). There is an ethical tradeoff here. We can envision a paradigm where large businesses are more willing to develop associates’ careers in exchange for a public loyalty to the aims of the business (and a willingness to invest heavily in business as well as technical expertise). Professionals who are uncomfortable with a commitment to one kind of business (especially to the point of moving into management and advocating for the company publicly) would be expected to emphasize technical curiosity for its own sake. A few years ago a particularly geeky coworker chided me for my “astonishing lack of curiosity” about all the little technical niches on my desktop; he saw mechanical curiosity as the justification for his position and an excuse to remain apolitical about everything!
This paradigm means that I.T. professionals would regularly take certification tests in a variety of areas to make sure that they are committed to keeping up with less frequently used skills (like unaided dump analysis, for that matter, on the mainframe). The explosive growth in software, along with the change in design paradigms, means that today I.T. professionals will have to remain very deliberate and diligent in keeping themselves trained with specific expertise, regardless of the immediate demands of the jobs that they have now. “Resume stuffing” with lists of areas in which one has light or superficial experience would be discouraged, and professionals would have to weigh carefully the balance between versatility and a commitment to specific expertise in skills that may not always remain in high demand. Employers should make keeping up with technology a major performance evaluation component and requirement. Geek curiosity presents a certain paradox with regard to professionalism, as it is predicated upon readiness to solve new problems created by others. Is “geek professionalism” an oxymoron? Perhaps not, if it is defined by its own mechanical inquisitiveness.
Computer User has provided some sobering accounts of how demand is changing. Jim Thompson in “Hot Careers in a Cool Market”[5] wrote that skills having to do with infrastructure, administration, peer-to-peer contacts, security (of course), and even multimedia were in much more demand than “content related” application programming skills. Robert McGarvey predicts that more jobs will migrate to the administration area in his discussion of certification.[6] At least one company, Advanced Internet Technologies in Fayetteville, N.C., has tried to model itself after the military,[7] as had (to an extent, especially with its prudish dress codes) Electronic Data Systems (EDS) back in the 1960s and 1970s.
The Information Technology Association of America presented (in May 2002) a report, “Bouncing Back: Jobs, Skills, and the Continuing Demand for IT Workers,” that discussed the paradox of overall reductions in positions together with unfilled positions in specialized areas. The shakeout in the I.T. job market may indeed force most computer professionals to focus and narrow their goals. No longer will it be reasonable to sit in a comfortable place (even given dedication to the systems one has worked on) and collect a good paycheck without a strategy for advancement and for addressing the needs of specific kinds of customers. Technical people will have to focus on the details of their craft and accept opportunity costs (going “Greyhound” and leaving controversies outside their expertise areas to others) until they are prepared to move into focused business areas for which they feel comfortable with a public commitment.
Ten years ago, in the previous recession, there was a certain disdain for middle management, as companies flattened their organizations with wider spans of control. The conventional wisdom was to market a job-ready, hands-on capability in a practical way, and some people thought that managers should prove that they could do the jobs of their reports. I have encountered the “Peter Principle” idea that managers or even business analysts weren’t “smart” enough to remain “real programmers,” just as I have encountered the opposing view that some programmers just don’t want to “grow up” and advance! That technical focus is still valid (as many positions seem to look for exact skill matches), but today there seems to be more emphasis on multi-tasking, with the ability to assume leadership and formal or project management roles (with soft skills) when needed.
Whatever an I.T. professional’s choice, Internet culture will force him or her to become public about his or her professional goals and business associations, and even in a recessionary environment some companies prefer to hire only professionals prepared for a long-term, somewhat publicly visible commitment rather than short-term W-2 contractors (who receive no benefits except regular social security tax rates and liability insurance). There is a distinction between contractor and consultant. Like it or not, you will become what you do.
I spent thirty-one years in I.T. after leaving the Army and went thirty years without a layoff, with very dependable income. But one Thursday morning in December 2001 a Novell server told me that my account was disabled. Ten minutes later a director was offering a handshake as he awarded a long severance package. I have remained a freelancer since that moment. As of September 2002, I have yet to log on to a work computer.
© Copyright 2002 by Bill Boushka
[1] Bob Weinstein, “Mainframes are still around, and so is the demand for programmers,” Tech Watch, King Features Syndicate, Minneapolis Star Tribune, April 28, 2002, page J1.
[2] Julia King, “Mainframe skills, pay at a premium,” Computerworld, March 4, 2002.
[3] Here is the URL for it: http://www.hipaaplus.com/abouthippa.htm
[4] Barbara Ehrenreich, Nickel and Dimed: On (Not) Getting By in America (New York: Henry Holt, 2001), provides a chilling look at minimum-wage work in “real jobs,” manual labor that the rest of us depend upon shamelessly.
[5] Jim Thompson, “Hot Careers in a Cool Market,” Computer User, January 2002.
[6] Robert McGarvey, “On Site from Afar: Distance Training Is Changing the Way IT Managers Keep Employee Skills Fresh,” Computer User, July 2002.
[7] Bob Weinstein, “Enlist in Web firm AIT’s boot camp and be all that you can be,” Tech Watch, Star Tribune, Aug. 4, 2002.