TEACHING IN THE AGE OF TECH REVOLUTIONS

By Charlice Hurst | Spring 2019

How Business Schools Can Prepare Students for Jobs that Don’t Yet Exist

For years, I’ve asked my undergraduate business students whether they track developments in their intended occupations. Few do. I now also ask how many have examined how artificial intelligence (AI) will affect their careers. Even fewer do.

I don’t want to discourage students aspiring to stable, well-paid careers, which are reasons I often hear them give for pursuing business degrees. But the likelihood of their employment ever being “stable” is slim. They — in fact, any of us not near retirement — are in for a ride. We’re the ones who will feel the first waves of impact from the “Fourth Industrial Revolution,” so termed because of the extensive disruption already being wrought by a raft of emerging technologies, including AI, that promise to change our lives to a degree that stretches the imagination.

There is no easy answer to the question of how business schools should adapt. Even experts are confounded by the range of possible outcomes of the proliferation of AI. Predictions abound and vary wildly, largely because of uncertainty around influential factors such as the pace of AI adoption and the regulatory environment. A 2017 McKinsey Global Institute report, “Jobs Lost, Jobs Gained: Workforce Transitions in a Time of Automation,” projects that, depending on the scenario, anywhere from 75 million to 375 million people globally will need to switch occupations by 2030, a range that is neither reassuring nor precise.

Despite the inability to pinpoint the magnitude of change, there seems to be broad agreement around key trends relevant to business schools. In a sense, there is nothing new. We already teach the skills the soothsayers are predicting will be critical: technical knowledge, critical thinking and socioemotional competencies.

So what is different about the challenge we currently face as business educators? Put simply, three things: the relative importance of the skills we teach, the urgency driving the need to update our offerings and requirements to keep up with market demands, and the need to equip students to grapple with the highly consequential and delicate ethical concerns that are pervasive across the AI landscape.

FILLING THE SOFT SKILLS GAP

Despite the general anxiety about AI rendering all of us obsolete, college-educated professionals are far less at risk of job loss than of job upheaval. Firms are more concerned with using AI to improve product offerings, business processes, decision-making and innovation than with reducing labor costs, according to industry reports such as Deloitte Insights’ “State of AI in the Enterprise, 2nd Edition.” In fact, there is expected to be a net gain in jobs at the higher end of the wage scale, although they won’t simply be more of the same. Some of the occupations most popular among business students are on the decline, some of the less popular ones will grow in value, new job types will emerge and all will require us to adapt over and over as technology evolves.

There will be major changes in what firms seek from employees. In a McKinsey survey of 3,031 executives in the United States and Canada, more than 80 percent said that their employees with college degrees or above will need moderate to extensive retraining over the coming decade. However, employers express weak confidence in their ability to accomplish that. In another survey, by the World Economic Forum, chief human resources officers at 313 companies cited labor force deficiencies as one of the top five obstacles to their firms’ adoption of new technologies.

The skills expected to be most valued in the coming era may surprise you. Clearly, there will be heavy demand for people who build and manage AI systems, including data scientists and programmers. It’s also easy to see why employees outside technology-intensive jobs will need basic programming and digital literacy skills to make better use of the AI tools that will become integral to their work.

Less intuitive is the expected increase in the value of “soft” skills, particularly critical thinking, creativity and interpersonal competence. While the share of task hours requiring technological expertise will rise to 17 percent in the U.S. and Europe over the coming decade, “human-intensive” work will comprise 22 percent of task hours, according to McKinsey’s report “Skill Shift: Automation and the Future of the Workforce.”

As automation releases professionals from the mundane aspects of their work, they’re able to devote more time to strategy, clients and innovation. Naturally, if these activities come to comprise more of their jobs, workers need to get better at them. Interpersonal skills and creativity are also crucial during periods of radical change, which often induces discomfort and resistance. In fact, one of the job categories expected to be in greater demand is organizational development, which is largely concerned with supporting change.

What will this mean for business education?

Business schools need to reexamine curricula in light of shifting skills demands. Most, like Mendoza, are already strengthening their offerings in analytics and information technology. For students not pursuing technical roles, such learning is less about performing technical tasks than about learning to scrutinize the quality of output from AI systems, synthesize it with other sources of information and apply it to business problems.

Herein lies the mounting value of courses that cultivate capabilities such as critical thinking, creativity, persuasion and change management. Fortunately, such offerings are already plentiful in the curriculum. The challenge may lie in modifying how we teach those skills for application within this shifting context.

Schools may also need to require more soft skills coursework, just as they do with technical courses. Most business students gravitate toward finance, accounting and marketing rather than management consulting and human resources. Because they tend to lean toward electives within their majors, the only exposure many of them get to instruction on the human side of organizations is the core course in organizational behavior (OB), which offers only a brief, broad survey of dozens of management concepts.

We must also prepare students for the changes likely to occur in their desired industries and roles. Based on an analysis by the World Economic Forum, some of the job types most prized by business students, including accountancy and financial analysis, are among those with the highest share of automatable tasks. On the other hand, all of the professional roles expected to remain stable or emerge over the coming years are technology-intensive (e.g., data analysts and digital transformation specialists), human-intensive (e.g., management analysts and people and culture specialists) or both (e.g., human-machine interaction designers).

Even in fields ripe with opportunity, professionals can expect the requirements of their jobs to evolve constantly. And because organizations are flattening hierarchies in order to become more agile, internal and external job changes will more often be lateral, with employees moving from one job role to another, adapting existing skills and picking up new ones.

Given this, the one skill we may need most to develop in students is proactive learning. This calls for a different approach to learning than most students have experienced. It may also mean a different approach to teaching for many professors, relying more on participative and self-directed instruction. Greater use of business cases and simulations will also improve students’ facility and comfort with complex decision-making.

Finally, it’s vital that we keep in mind the unique needs of mid-career and executive students. HR executives count top managers’ lack of understanding of the technology, and the limits it places on effective AI adoption, among their top concerns, according to McKinsey’s report “Skill Shift: Automation and the Future of the Workforce.”

Managers and executives are less likely to need to apply technical skills directly in their work. But they have less time, and they increasingly will seek training that addresses specific challenges as they arise. Mid-career professionals will also face job, occupational and industry changes of their own. Business schools might respond by offering intensive career management support and more flexible, non-degree short courses to build specific skills.

NAVIGATING THE ETHICAL DIMENSIONS OF AI

Only a minority of executives at firms that are early adopters of AI rate ethical risks as their top concern, although 32 percent rank them among their top three. However, the risks given higher priority, such as cybersecurity and critical systems failures, have implicit ethical dimensions and the potential to affect public trust, according to the Deloitte Insights report. Given the tenor of negative media coverage of AI, it’s not surprising that the proliferation of false information and bias are executives’ top two ethical concerns.

AI does hold the promise of more accurate decision-making, freeing us from the limitations of human information processing. But if the data fed to a system is biased, the system will reach similarly flawed conclusions. Furthermore, the people developing these systems represent a rather narrow slice of society. The perspectives of average Americans are largely absent from the AI development and deployment process, which is at least partly responsible for broad public suspicion. This is not only a risk to business, but a major barrier to society reaping the tremendous potential benefits of AI.

A natural response by business schools would be to devote more attention to ethics instruction. Ethical concerns should be integrated into courses in step with increased technical content. We may also need to examine how we teach design thinking, which is widely embraced by the tech industry as a method of innovation and has recently become a common and popular business school elective. Developing “minimum viable products” (MVPs), releasing them rapidly and iterating based on market feedback are key aspects of design thinking. There is no need to seek perfection the first time around. Failure is seen as an opportunity to improve.

Comfort with failure is a necessity in an environment where one must constantly update skills and take on new roles. We do, therefore, want students to learn to fail well. In the context of product development, however, the drive to release prototypes quickly leaves little room for contemplation of risk, especially as AI is increasingly incorporated into the design thinking process for quickly testing and personalizing offerings. That approach may be fine for building a new video game, but not for products that can affect whether, for instance, women are disadvantaged in recruitment processes, people are misidentified as criminal suspects or those with disabilities continue receiving public supports.

Wendy Angst, an associate teaching professor in the Management & Organization Department who teaches design thinking, believes that AI has a role to play, but that care must be taken to ensure that it does not introduce bias. “Design thinking is grounded in empathy for users,” she said. “AI is a powerful tool to analyze information about users and to provide personalized MVPs, but it can also introduce bias in how we interpret information and deliver solutions, enabling us to quickly group users into persona categories based on their digital preferences. AI enables us to move quickly, but this speed can cause us to lose sight of nuances that could ultimately have negative implications.”

Hemant Taneja, a managing director at venture capital firm General Catalyst, recently declared in Harvard Business Review that designers need to shed the ideology of “fail fast, fail often” and seek instead to build “minimum virtuous products.” Accomplishing this requires not abandoning design thinking, but taking seriously its core intent: to build products that are human-centered. We also need to expand the definition of end users: they are buyers, of course, but also anyone who stands to be affected by the product. In the case of technologies such as facial recognition used in law enforcement, that’s everyone.

Finally, students would benefit from grounding in the complex historical and social dynamics of the contexts where AI is being implemented. Many of the recent gaffes by companies making or implementing AI tools seem to reflect a lack of knowledge in these areas.

For instance, according to a Reuters article last year, Amazon scrapped a résumé-screening tool to which it had devoted substantial resources. The developers trained the system to identify attributes of résumés consistent with those of previously successful candidates. These attributes, such as the use of words like “executed,” are more common in men’s résumés, so potentially qualified women were being weeded out simply because of gender differences in language use.
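
To make the mechanism concrete, here is a minimal, hypothetical sketch in Python using scikit-learn. It is not Amazon’s actual system; the résumés, labels and scores are invented. It simply illustrates how a classifier trained on historically skewed hiring outcomes can learn to reward word choices that correlate with gender rather than with ability.

# Hypothetical illustration only (invented data, not Amazon's system):
# a text classifier trained on skewed historical hiring outcomes learns
# to reward word choices that act as proxies for gender.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy "historical" resumes and hiring outcomes. The past decisions favored
# resumes using words like "executed," which the Reuters report notes are
# more common in men's resumes.
past_resumes = [
    "executed product launch and led engineering team",
    "executed cost reduction across the supply chain",
    "captain of women's chess club, managed event budget",
    "coordinated women's outreach program and volunteer staff",
]
hired = [1, 1, 0, 0]  # biased historical labels, not a measure of ability

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(past_resumes)
model = LogisticRegression().fit(X, hired)

# Two new, comparably qualified candidates describing similar work in
# different words.
candidates = [
    "executed marketing strategy for a new region",
    "led women's marketing initiative for a new region",
]
scores = model.predict_proba(vectorizer.transform(candidates))[:, 1]
print(scores)  # the first resume scores higher purely because of word choice

Nothing in the code is written to discriminate; the bias enters entirely through the historical labels, which is why scrutinizing training data is at least as important as scrutinizing the algorithm.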

There is a wealth of research that, had the developers been aware of it, might have alerted them to this possibility. The error was surely costly: it consumed resources, attracted unfavorable press and likely hindered the company’s ambitious hiring goals.

As Mendoza dean Martijn Cremers points out, “The proliferation of AI raises the issue that as our technological capabilities increase, we need to likewise increase our awareness of our social and moral responsibilities.” Given that Notre Dame’s mission foregrounds ethics, moral courage and social responsibility, Mendoza is uniquely positioned to be a leader on this front.

The intensifying needs of business for skilled employees present a tremendous opportunity for higher education as a whole. A lack of trained workers will hinder AI adoption and economic growth, resulting in greater labor displacement at all levels of the income spectrum, rising inequality and societal instability. Yet, we also stand to exacerbate these problems by allowing our exuberance about the technology to outstrip our sense of accountability.

There’s too little room here to cover all of AI’s pitfalls, or to extol all of its possibilities for positive transformation. I’m hopeful that effective adaptation by core institutions, including business schools, will enable the full realization of the latter.

ARISTOTELIAN APPROACH
Aristotle believed education must be grounded in ethics and a worldview that contributes to the flourishing of individuals and the human community. Is this balanced view the key to education in the age of AI?

 


CHARLICE HURST is an assistant professor in Management & Organization. Her primary research focus is interpersonal dynamics in the workplace and how interpersonal relationships influence well-being and performance. She studies the roles of personality and gender in shaping interpersonal relationships between co-workers and between leaders and followers.
