Training: Sliding Into the Brave New Automated World

Sept. 29, 2016

In our last issue we talked about something called “universal basic income.” The idea, promulgated by some sociologists and economists, is that government provides its citizens with a fixed monthly income to help compensate for jobs lost to technology.

We reported on an Oxford University study, which estimates that in the next two decades as many as 47% of the jobs in this country will be candidates for automation. As is typical when academics take on an issue, these findings are disputed by a group of researchers at the Centre for European Economic Research, who suggest that things aren't quite so bad. After all, many jobs involve "bundles" of tasks, many of which machines can't handle, and this bundling will make these jobs more difficult to automate.

Historians and economists who focus on the relationship between technology and society take comfort in the fact that we've been through this before, when machines first made their appearance more than 100 years ago, in the period we look back on as the Industrial Revolution. We adapted, they say. We survived. And very well, thank you.

Some observers attempt to quell our fears of monolithic technology by pointing out that what our big brains have developed has provided us the wherewithal to produce goods more effectively, provide services more efficiently, and generally live better. They remind us how we've benefitted from the expanded use of computers in the workplace as well as in our private lives. (So why is it we can't develop something more high-tech than a chip-reading machine that honks to remind us to remove our credit card?)

Machines stealing jobs is a storyline that distresses different segments of our population variably. To experienced folk who knew a world before PCs, the march of computers is a mixed blessing. We were raised with face-to-face communication, with live voices at the other end of the line, with ambiguity, subtlety, and nuance. A friend once remarked that our generation should pat ourselves on the back for how well we're handling our disembodied digital world. On the other hand, for Generation X and the Millennials who were born with smartphones in their cribs, this is just how the world is, the way things work. They have little patience with, or interest in, arguments that our contemporary infatuation with technology threatens our highly developed interpersonal skills and may go so far as to alter fundamental aspects of our human nature.


In sorting all this out, it’s useful to differentiate between technology, computer-based automation, and the end point of all this dithering—artificial intelligence (AI).

Broadly defined, technology refers to applying science to the invention of useful tools to solve problems, accomplish tasks, and generally make human effort more efficient. Webster's is more precise and experience-based: "a manner of accomplishing a task…using technical processes, methods, or knowledge."


Thus, a computerized credit card reader, where we run our piece of plastic through a slot or into a chip reader, replaces the old, hand-powered apparatus: lay the card in a tray, fill out a multi-copy form with carbon paper inserted between each copy, lay the form over the card, and swipe an arm across it. One of those carbon copies went to the customer, one was sent to the credit card company, and the merchant kept one. Quaint by today's standards.


Artificial intelligence is one logical long-term destination of computer automation. Effectively a subfield of computer science, it harbors the goal that machines will be able to do things typically accomplished by people, particularly those things that require what we proudly claim as human intelligence. AI's developers assure us that it's not necessarily that computers will mimic how we humans think; in fact, they're likely to go their own way. Suggesting what? That human obsolescence won't be far behind?

In the last issue we reported that the jobs most susceptible to automation are the jobs that are the most routine. Reality, of course, is never that black and white. Delving further into this brave new world, we are encouraged to believe that computers increase our individual capacity to perform discrete (and essential) skills, with the net effect of making each one of us more expert at what we do (and thus, hopefully, less disposable). The bad news is that, given the way technology has quick-stepped through our lives, we should be prepared to be constantly updating those precious skills, suggesting, perhaps, that the nature of expertise will morph from a static state to a constant game of catch-up.

The experts assure us that technology has more often changed the nature of jobs than eliminated them outright. According to one study, in the last 30 years employment in this country grew significantly faster in occupations where computers were integral to getting the job done. Computers speed up one part of a job while providing the opportunity for workers to do the other parts more effectively and productively. The recommended takeaway is that computers "relocate" rather than "displace" jobs. But again there's the thought that workers must be primed to be constantly learning new skills, a challenge that one observer describes dashingly as "learning how to relearn." This future we're looking at is one where jobs will be "redefined" rather than "destroyed," where whole new industries will develop. In come iPhones, out go film and photo development labs and their accouterments.

Even if none of this tips you over the edge and for you the picture looks rosy (or however you describe a clockwork automated world), the tricky part is predicting where technology will intervene to eliminate and create, and at what pace. Viewed this way, things almost seem to be operating backwards: put a computer to work reworking, redefining, or developing new jobs, then build operating and management procedures around them.

The good news is that such trends are projected to be two decades off and are likely to come in fits and starts instead of along a linear continuum. And what do we do in the meantime? How do we manage a workforce that includes people who remember those old credit card machines with people who interface with the world via mobile apps? How do we transfer the experience of mature workers to future-oriented employees with an inbuilt belief that any problem can be solved with a click of a mouse or a few taps on a mobile keyboard?

How do we meld our divergent assets? Do we dump the older folks and wave goodbye to their cache of corporate knowledge? How about managers whose worldview, and perhaps skill levels, lag behind the younger people who work for them? How do we show these young people the value of trial and error, and that computers are best when they work with us, not replace us? How do managers sharpen their skills so they can work backward from what's technologically possible to developing a product or service that they can market and sell?

Where do we learn this new skill of "relearning," and how do we establish systems, opportunities, and resources in our organizations to aid our motley crew of employees in doing the same? The challenge of heavy equipment operators adjusting to machine control will seem like a walk in the park (much as pilots once balked at computerized flight controls), compared to the challenges with automation that lie ahead. What will the machines of the future look like, and what kind of ancillary jobs will be created around operator-less backhoes and excavators? What will the job look like that assists an artificial intelligence machine in designing a grading plan?

How do we define the steps we need to take to adapt to and take advantage of the largesse that’s being offered us? How do we learn to think about possibilities that may seem beyond our contemporary capacity to imagine? How do we manage our employees during this transition, develop personnel principles and procedures that work now, will be malleable as conditions change and can become building blocks for the automated world that’s predicted?

The observers and theoreticians and historians, technologists and digital geeks express a certain dispassion about how all this will work out. They assure us that humans are resilient and adaptable. Seen from their lofty perch or behind their computer screens, the picture may indeed look rosy. But what about the managers down there in the trenches who have to roll up their sleeves and cope with the day-to-day manifestation of these changes? Have we learned from integrating the technology we have already adopted—computer-aided management of estimating and takeoff, site design, onsite management, asset management? One thought might be to keep track of the adaptive strategies that have served us well in the past and mold them into a framework that will help us cope with the future. Another might be to take a solid look at the vulnerabilities in what we do and how we do it, so we're one step ahead when the technological juggernaut rolls into town.

Twenty years from now, will the construction industry still be slow on the uptake, as it has been, say, in attracting and integrating young, tech-savvy workers or in picking up the pace with innovative training programs? Will we learn lessons from our past and be prepared to move forward? Or will all of this be unnecessary as we stand aside and watch computer-assisted change management do the job for us?