Managers are turning to new technology to improve decision making and innovation. But they often screw up a crucial part of the process: getting employees on board with the changes.
The problem, at its most basic level, is that managers usually don’t anticipate that systems will lead to fears and suspicions among employees who feel disrespected or displaced. And they usually don’t recognize that the systems often mean employees have to take on new work that disrupts their routines.
Understanding these pitfalls is crucial. Our research has found that helping employees to accept new technologies is just as important as making sure the systems work in the first place.
Do it right, and you have employees who embrace the new technologies and use them to reduce costs and improve product and service quality. Do it wrong, and you have employees who are frustrated, resentful, angry—and likely to resist implementation.
Where do technology implementation processes go awry? Here are four ways that managers trip themselves up—and how they can avoid those mistakes.
Using junior employees as trainers
On the surface, it seems to make perfect sense: Younger employees who grew up immersed in the digital world will surely be quick to learn new technology. And they’ll be a lot more flexible about incorporating it into their routines. So, why not use them to train the rest of the staff?
Because this strategy can make longtime employees feel slighted—and make it tougher to train them.
For example, my co-authors and I studied the introduction of new electronic-medical-record technology across multiple medical clinics. At most of the sites, leaders chose young trainers. But the leaders didn’t realize that this move challenged the status of employees with long tenure and expertise in the old way of doing things. These employees questioned the ability of the trainers, complained about training procedures and disputed whether the new technology-related tasks were worthwhile—such as classifying calls from patients using a template in electronic medical records.
What did work? Rotating the role of trainer—so that sometimes longer-tenured employees did the teaching. In those cases, trainees embraced rather than resisted the new technology.
Choosing trainers who are initially less skilled might not seem to be the most efficient way to do things. But putting rookies in charge is a likely recipe for failure.
Adding a new layer of workers to handle technology
Sometimes leaders decide to sidestep the problems of training. Instead of getting everybody up to speed, they add new employees to take on the burden of dealing with tech—such as data scientists—or designate certain employees as “superusers” to take on those new jobs.
This may spare current employees from having to learn new technical skills. But it is likely to raise new problems.
For example, a team of researchers studied a telecommunications company that added data scientists to crunch data from multiple sources and automate the process of identifying sales leads. That sparked a battle with salespeople, who felt that the new methods undermined their longstanding strategy of building personal relationships with customers and using their gut to identify new opportunities.
None of this means that adding new roles is doomed to fail. But managers should be careful not to get caught up in the idea of AI as a magic bullet that can displace the old way of doing things. In addition, they shouldn’t signal that they value data scientists, computer scientists or other people in new roles more than traditional employees. And, crucially, they should be sure to hire new people who have the emotional skills to reassure current workers and patiently explain the new systems.
Focusing on prominent users
Obviously, if a company wants to get the whole organization on board with new technology, it must get powerful stakeholders, such as high-level employees, to accept it.
But often, even when those stakeholders do embrace the technology, it leads to conflicts with lower-level workers. Why?
Emerging technologies promise to automate a lot of practices and processes. But in the real world, they can’t do that job perfectly—leaving a lot of extra work that lower-level workers must cover.
As an example: I studied managers in a medical center who were introducing new technology to alert doctors when patients needed vaccinations, diabetes tests and Pap smears. Powerful doctors embraced the technology. However, it didn’t work as well as the doctors wanted—the advice the machines gave often conflicted with their own instincts. So, medical assistants had to check patients’ medical records to confirm that the machines hadn’t missed anything.
That led to conflict. Many assistants told their managers that they didn’t have time to do the new tasks, and several of the best assistants left for less-stressful jobs elsewhere.
Managers solved the problem by putting together a working group to implement solutions that balanced the needs of doctors with realistic expectations for medical assistants. For instance, the machines were modified to provide doctors with information from the patient’s electronic medical records directly, so that the machine’s decisions were more transparent.
Assuming developers can create tools in a vacuum
Managers often assume that the process of adding new tech goes in one direction: Developers create a tool, and users adapt to it. But to make things go smoothly, users must have a back-and-forth dialogue with developers about how the system should work.
Consider a health network we studied that built a tool to predict bed availability in ICUs and other units.
Developers needed to pull data from multiple groups in the hospitals. Yet different groups collected data on beds to answer different questions, such as how long patients stay or how transfers are tracked. End users needed to be willing to work with developers to get on the same page about their data-collection methods.
The upshot is that managers should anticipate the need for collaboration between users and developers. As part of that, they should be aware of the laborious nature of that collaboration, and frame the work in a positive way—as a learning opportunity for users, developers and the technology itself.
More broadly, managers need to realize that introducing emerging technologies such as artificial intelligence, data analytics and robotics isn’t straightforward. Managers who hope to successfully implement these technologies need to focus on issues of employee status and roles, and the amount of new work that will need to be done.
Dr. Kellogg is the David J. McGrath Jr. professor of management and innovation at the MIT Sloan School of Management. She can be reached at reports@wsj.com.
"four" - Google News
November 29, 2021
https://ift.tt/3xuPzd4
Four Mistakes Leaders Often Make When Introducing New Technology - The Wall Street Journal
"four" - Google News
https://ift.tt/2ZSDCx7
https://ift.tt/3fdGID3
No comments:
Post a Comment