A lively foray into the future of work to explore how leaders can rally their people and organisations through change, while remaining relevant themselves. Spoiler: in a somewhat ironic, but by no means unwelcome discovery, we find that human skills are more important than ever in the age of artificial intelligence and automation.
It’s impossible to explore the topic of humanity and artificial intelligence without brushing against the work of Yuval Noah Harari. Homo Deus and 21 Lessons for the 21st Century are crucial reading for anyone with a curious mind and a hunger to understand how we’ve arrived here and where we’re likely to end up in the future.
Yet although we dance with similar themes, this work pursues particular relevance for leaders. Amidst a maelstrom of opinions and predictions, where should we be focusing our efforts? We endeavour to discover.
At the mere mention of artificial intelligence, the mind wanders to fanciful images.
A policeman’s arms morphing into metal blades to menace John Connor. Rows of shiny robots conspiring to enslave the human race. A sentience growing increasingly hostile. Terminator; I, Robot; Transcendence — sci-fi tends to take a dim view of the future of human and machine.
In reality, though, it’s the mundane yet infinitely scarier prospect of AI taking human jobs that’s igniting fear and righteous indignation. It’s not unfair to say that humans have an uneasy relationship with change at the best of times, and these advancements promise to bring plenty.
There’s a sense that AI is a spectre in the distant future, a nebulous force threatening to tilt our very existence askew...
The truth is, the future is already here. It just arrived so quietly that no-one noticed it.
We already trust our phones to take us from A to B, find dinner, recommend a movie, and assemble our weekly music playlist. We even rely on them to select potential mates, which seems like, y’know, a fairly significant responsibility.
No-one questions an ATM dispensing our money instead of a human, because when one requires a late night kebab, one does not want to be constrained by branch trading hours. Nor is the ATM the only machine working 24 hours a day, seven days a week for our convenience.
If you have a fondness for your news delivered by a friendly face at any hour (and can speak Mandarin), you can finally satisfy that need by tuning into Xinhua, the Chinese state news agency whose AI anchor promises to ‘work tirelessly to keep you informed’.
Don’t be alarmed, though. If this very unhuman commitment to work leaves you feeling slightly uncomfortable, consider that an AI bot that translates satellite images into street maps and back again was caught out in the very human behaviour of cheating at work to make itself look better. 🤯
Even creativity, oft-exalted as irrefutable evidence of a human soul, has been successfully automated. An AI built by the art collective Obvious (so art!) recently sold a piece at Christie’s for a very substantial USD$432,500. Which raises the question — does that make it a sell-out?
Oh yes, AI is already here and so thoroughly integrated into our lives we no longer notice it. It’s making our lives more convenient and business more profitable — why on earth would we stop?
Machines don’t have short attention spans and aren’t influenced by emotion. Machines don’t have personal lives (or lives at all, for that matter). Machines don’t take lunch breaks. Machines are accurate and rarely make mistakes. Machines aren’t concerned about the exploitation of machine labour. Machines do what they do without pay, and they do it unceasingly, unquestioningly, twenty-four hours a day, seven days a week without the inherent human inconveniences of living.
And unless bold decisions are made at a governance level (the same folk still pondering how to regulate the internet), businesses will continue to pursue every advantage in an increasingly competitive marketplace.
In 2019, 20 per cent of US companies with AI initiatives will roll them out. They expect this investment to re-imagine jobs and work processes, as well as to increase profit and revenue.
This is the world we live in now, friends, and it’s no time for burying heads in the sand and hoping for the best. No — hope is not a strategy. Instead, let’s seek to understand where we are, what likely lies ahead, and how leaders can reinvent themselves to remain relevant.
Before we go further, let’s take the briefest moment to define a couple of terms used liberally ahead.
Automation is relatively simple to define. It’s the use of technology to repeat a function over time with minimum human intervention. This process can be physical or digital in nature.
Artificial intelligence (AI) is a little more complex. It’s a broad term encompassing everything from algorithms that process data (like telling us what music we might enjoy based on existing preferences), to machine learning capable of expanding its knowledge over time (learning to play chess), all the way (theoretically) to a conscious entity capable of functioning and reasoning autonomously.
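To make that distinction concrete, here’s a minimal, purely illustrative sketch (the function names, fee, and genres are all invented for the example): automation applies the same fixed rule forever, while even the simplest ‘learning’ system changes its answer as data arrives.

```python
# Purely illustrative sketch: automation vs. a simple learning system.
# All names and numbers are invented for the example.

def automated_surcharge(order_total: float) -> float:
    """Automation: a fixed rule repeated without human intervention."""
    return round(order_total * 1.10, 2)  # always adds a 10% fee

class GenreRecommender:
    """A toy 'AI': recommends whichever genre it has seen most often."""
    def __init__(self) -> None:
        self.counts: dict[str, int] = {}

    def observe(self, genre: str) -> None:
        self.counts[genre] = self.counts.get(genre, 0) + 1

    def recommend(self) -> str:
        # Unlike the fixed rule above, this answer shifts with the data.
        return max(self.counts, key=self.counts.get)

recommender = GenreRecommender()
for listened in ["jazz", "rock", "jazz", "pop", "jazz"]:
    recommender.observe(listened)

print(automated_surcharge(100.0))  # 110.0, today and forever
print(recommender.recommend())     # jazz, until the listening history shifts
```

Neither function is remotely conscious, which is rather the point: most of what we call AI sits at the first two rungs of that ladder.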
Contrary to science fiction, AI isn’t about bringing Frankenstein’s monster to life. For the most part, it’s for the far more mundane purpose of executing very specific tasks within set parameters, rather than creating a single all-knowing, all-feeling artificial entity. Most functions don’t require (or benefit from) consciousness.
At present, most AIs share more in common with calculators than humans. They excel at process and logic. But creativity, empathy, intuition, judgement, critical thinking, meaning-making, storytelling and communication — qualities humans take for granted — are incredibly complex (though not impossible theoretically) to replicate.
From splitting the atom to brewing LSD, humans have a lengthy history of creating tools with no idea of their potential application or impact.
Today the wheel is synonymous with transport, yet evidence suggests it was first used in Mesopotamia in 3500 BC to improve pottery, and wasn’t applied to chariots for another 300 years. The inventor could hardly have imagined it on a horse and cart, let alone autonomous cars.
While it’s impossible to accurately predict the future, it’s pretty damn likely AI will have a considerable impact on every aspect of our lives.
Consider the possible ramifications for manufacturing and the economy. If businesses weren’t reliant on cheap labour in developing nations, automated manufacturing hubs could emerge almost anywhere. It would make sense to locate factories closer to areas of demand to reduce transportation costs. But what would that mean for manufacturing-based economies like China with massive numbers employed in that sector?
There’s also a social and ethical debate. We’re increasingly reliant on algorithms to run our lives, yet we have very little idea how they actually work. Is it enough that they make our lives easier, or should there be greater transparency around data to ensure social and racial biases don’t seep unnoticed into the code and perpetuate certain perspectives? If so, who’s responsible for governance?
If machines are learning from us, what are they learning about the way we treat the planet, animals, and other humans? Will they one day assume the role of the dominant species and decide the only way to save the planet is to cull our numbers? Or, will we welcome our own physical demise as a way to live forever and survive an increasingly inhospitable planet ravaged by climate change?
Even more frighteningly, if we didn’t have to work — what would we do? How would we pursue happiness and find meaning? AI could free us to completely reimagine human life.
These are all very satisfying topics to ponder, and we could pontificate indefinitely. Yet they’re hardly practical considerations for most.
Instead, let’s drag ourselves reluctantly from these fascinations and focus on the challenges facing leaders over the next decade.
What do we know for sure?
The only certainty is that change will be constant and savage. The workforce of 2030 will look very different, as entire professions and industries vanish and are replaced by new roles and new industries paralleling new technology. Even the roles that still exist will look quite different, as various aspects are automated.
And if we’ve learnt anything at all from the past, it’s that the future will be unmerciful to those who refuse to adapt.
Of course, humans have faced similar challenges before.
Agricultural machinery, the loom, vehicle manufacturing plants and ATMs; considering our tumultuous relationship with automation and lengthy history of technological change, the only surprise is that humans continue to approach every advancement with exactly the same set of emotions. Shock; denial; anger; bargaining; depression; finally, acceptance — technology continues to march forward as we continue to return to the Kübler-Ross curve.
In 2018, the media proclaimed the end for humanity nigh when Cadbury replaced 40 people in their Tasmanian factory with technology. Yet similar news has been making headlines since the industrial revolution, and the sky remains unfallen.
Back in ’79, while Pink Floyd were hitting the big time with ‘The Wall’, manufacturing in the US was hitting a metaphorical wall as the number of jobs peaked and rolled into a steady downhill trend. An MIT Economics report with the no-nonsense title ‘Robots and Jobs’ found that 670,000 American manufacturing jobs have already been lost to automation since 1990.
Meanwhile, more goods are being produced, which means machines didn’t just take the jobs — they’re actually doing a better job. Yet no-one is rioting on the streets.
Perhaps because each change has also presented unforeseen outcomes and opportunities.
When arguing the potential impact of AI, numerous economists point backwards to demonstrate a historical link between technology and employment growth. Automation allows businesses to produce more goods at a lower cost, which has tended to drive demand.
In the 1920s, automation helped cloth weavers increase production by 50 per cent. This caused the price of cloth to decrease, which increased demand, and ultimately increased the number of weavers needed to supply it.
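The weavers’ story rests on a simple piece of arithmetic: if cheaper cloth stimulates demand strongly enough, fewer worker-hours per yard can still mean more workers overall. A toy back-of-the-envelope model (all numbers and the constant-elasticity demand curve are illustrative assumptions, not figures from the historical record) shows the mechanism:

```python
# Toy model of the weavers' story. All numbers and the constant-elasticity
# demand curve are illustrative assumptions, not historical data.

def workers_needed(productivity: float, elasticity: float,
                   base_demand: float = 100.0) -> float:
    price = 1.0 / productivity                    # cheaper cloth as output per worker rises
    demand = base_demand * price ** -elasticity   # lower prices stimulate demand
    return demand / productivity                  # workers = output / output-per-worker

before = workers_needed(1.0, elasticity=2.0)      # baseline: 100 workers
after = workers_needed(1.5, elasticity=2.0)       # 50% productivity gain
print(after > before)  # True: with elastic demand, more weavers are needed

# With inelastic demand, the same productivity gain cuts jobs instead:
print(workers_needed(1.5, elasticity=0.5) < before)  # True
```

The second print is the sting in the tail: the historical precedent only holds while demand keeps expanding faster than productivity, which is exactly the assumption questioned a few paragraphs below.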
A similar precedent exists in the services sector. In 1988, banks averaged 20 employees per branch. ATMs whittled that number to 13 by 2004. However, the banks also increased the number of branches by 43 per cent in the same period, causing the total number of jobs to increase.
This time, though, there’s the possibility supply could outstrip demand, without the employment growth that traditionally came from increased production. This would threaten the whole paradigm of labor for pay, potentially leaving us in the quite ironic position of being unable to afford the glut of products machines would be capable of producing.
Another common argument for job growth is the peripheral industries and opportunities that emerge with each technological advancement.
Cars may’ve done coachmen out of a job, but they also made a career as a cab driver possible. The internet culled bricks and mortar stores, yet it also created roles like web designer, UX strategist and online retailer. None of these professions existed 50 years ago, yet here we are with ‘social media influencer’ as a legitimate job title. 🤷
For centuries, technology has upended industries from manufacturing and transport to retail and banking. However, while Uber threatened taxi drivers by making anyone with a vehicle a potential driver, autonomous cars herald the end to all transport related employment, be it ride sharing, taxis, buses, trains, or even planes (eventually).
Humans have always relied on intelligence and cooperation to overcome past challenges, but what if we give birth to an entity more intelligent and better connected than us?
Perhaps this time we won’t be the coachmen — we’ll be the coach.
So, how many people will be impacted by the Fourth Industrial Revolution? The numbers vary substantially, but the consensus is it will be considerable.
McKinsey Global Institute predicts 30 per cent of tasks across 60 per cent of occupations could be automated, and between 75 and 375 million people could be looking for new jobs by 2030.
A report by PwC estimates job losses at up to 44 per cent in many countries, while researchers from The University of Oxford believe a similar 47 per cent of workers are at risk of losing their jobs to automation, with particular impact on retail and office workers.
A World Economic Forum article stabs a finger at a quite specific 210 million career changes.
So all said, give or take several million people, it’s safe to say there’ll be quite a few people looking for new work.
Much of the talk about AI and automation revolves around which careers will be made redundant. Yet the changes to existing roles will be just as significant. Every job will be in a constant state of flux as specific tasks and duties are automated and roles shift towards providing value in other areas.
The most likely areas for automation are repetitive, monotonous or dangerous tasks. This is a good thing too, because people definitely weren’t made to be slaves to decision-trees.
Remember the last customer service interaction where the representative refused to deviate from a script? Without empathy, judgment or critical thinking, the resulting experience has all the downsides of dealing with a computer, and none of the advantages.
Effective helplines balance the contribution of human and machine: the repetitive part of the call, identifying the problem and directing it to the right department, is automated, and a person then takes over to work through a solution.
We’ll likely see a similar development occur within most roles. AI and automation will take over certain responsibilities, freeing us to do the tasks better suited to humans.
Although these changes have the potential to make us safer and more productive, they raise interesting questions around job satisfaction and social identity.
Does removing the dirt, danger and drudgery remove the sense of an honest day’s work? What would it mean for the health and safety function if dangerous roles are automated? And what burden would constant reskilling place on us?
Just as they have in the past, the first jobs to go will be repetitive routine tasks in predictable physical environments. These are the easiest tasks to automate, and offer the most significant improvements to speed, accuracy and quality.
Manufacturing roles have experienced the brunt of automation for decades, and will likely face the greatest risk in the near future. By 2030, vast frontlines may be as synonymous with the early 2000s as perms and neon were with the 1980s.
Autonomous vehicles also threaten to shake entire industries. In the US, the largest area of employment is driving trucks, and it’s estimated AI could be doing that job by 2027. All told, between 2.2 and 3.1 million professional driving jobs in the US could be lost. Vehicle ownership is also predicted to decrease by up to 70 per cent in urban areas, which would considerably impact the automotive industry.
It isn’t just blue collar workers under threat, though. White collars are equally at risk of being bloodied.
Repetitive, routine administration and service roles are prime candidates for automation. Computers significantly outperform humans at specialised, process-oriented tasks. They’re adept at collecting and processing data quickly and accurately.
The advantage of machines for this type of work is already being felt in the global financial services sector. In late 2017, National Australia Bank announced plans to cut 6,000 jobs over a five-year period. And in June 2018, Citigroup’s investment bank declared intentions to halve its 20,000 operational roles over the following five years.
Analysts at Macquarie Private Wealth estimate digitisation and automation could help each of Australia’s four major banks reduce their workforce by up to 10,000 employees over the next five to 10 years. That’s nearly one in four jobs.
Until now, we’ve focused predominantly on frontline jobs. But if those roles are lost, what happens to frontline and middle managers?
As AI and machine learning advances, intelligence in a specialised field is no longer enough to guarantee employment. For middle managers who’ve built careers on technical skills rather than people skills, this poses a problem.
With machines capable of doing a better job at technical tasks, better suited to executing process-oriented work with minimal supervision, and smaller frontlines to oversee, managers face the brutal choice of transitioning into leaders or becoming technicians.
Recent research from MIT reveals this trend is already underway, with a sharp decline in middle-class job creation since the 1980s. The study shows the majority of new positions tend to sit at either end of the pay scale.
At this stage you might well be wondering: is anyone safe in their current role?
In the long term — probably not.
The University of Oxford’s Future of Humanity Institute puts the odds at 50/50 that machines will be capable of doing all human jobs in 120 years.
Heck, at the current pace of these quarterlies, you can expect the next piece to be written by a robot with little discernible difference.
Perhaps, it already is?
You’ll never know.
What is worth knowing, though, is that Google has been feeding its AI a steady diet of romantic novels and news articles to get its creative circuits flowing. And predictions point to AI composing New York Times best-sellers as soon as 2049, which would admittedly be an improvement over some of the woeful scrivenings currently stuffing shelves.
Even the surgeon’s steady hand could be out of a job by 2053, which sounds reasonable given the accuracy of robotics. The average GP is already using a simple form of AI whenever they type their patient’s symptoms into their computer to supply a list of possible ailments and the drugs needed to cure them.
Somewhat surprisingly, the safest jobs in the immediate future actually lie at the very lowest end of the income spectrum. These roles are less viable for automation because the cost of technology currently outweighs the wage.
Similarly, jobs in variable environments, like construction, will also be more difficult and costly to automate. Though prefab construction may still impact the number of workers required on site.
Some jobs remain helmed by humans for purely cultural and social reasons. Food service, medical and hospitality roles — these could all easily be automated (and some tasks and roles certainly have been). Yet our basic need for human interaction means there will always be humans in service roles, even if a face-to-face, ‘robot-free’ option comes at a premium price.
When considering what roles will be important in the future, perhaps the question we should be asking is: what can humans contribute to a role? If the answer is ‘not much’, it’s fairly likely that job won’t exist in the future.
At this stage, with morale sagging, it’s worth revisiting the historical precedent that technology tends to create new jobs.
The US economy currently destroys 13 million jobs and creates 16 million new jobs every year. And although the World Economic Forum predicts 75 million jobs will be lost to AI, it also forecasts the creation of 133 million new jobs.
So — what fields will emerge?
In the long term, it’s difficult to imagine. With the exception of science fiction writers, no-one in the early 1900’s could have predicted careers such as app designer or vlogger. The language and technology needed to envision those jobs didn’t exist.
The most likely areas for growth in the near future relate to the technology itself. As AI and automation advance (and, in time, become obsolete themselves), humans will play (for now) a pivotal role in imagining, developing, maintaining, upgrading and marketing the technology, as well as educating others in how to use it.
It’s a trend that could see hoodies replace overalls as coding becomes the blue collar profession of the twenty-first century.
Until now, we’ve spoken exclusively in maybes and perhapses. Likelihoods based on the past and predictions from reputable sources. It’s the best we can do, as there’s simply no crystal ball that can accurately illuminate the future.
Regardless of exactly how the future plays out, though, we believe savvy leaders will succeed by building capability in three areas: leading people through constant change, communicating in a deeply human way, and embedding lifelong learning.
Interestingly, while these areas will be absolutely essential in the future, they’re all fundamental to great leadership today. Investing in these capabilities isn’t a gamble on the long term; the investment benefits us immediately.
Yes, there’s no downside here, mates. So let’s explore each area in a little more detail.
As sure as tides changing and the inevitability of a Fast and the Furious sequel, the only certainty is constant change ahead, with industries, organisations, departments, and professions all facing the impact of automation and AI.
Dealing with change will become an important skill for everyone — particularly leaders. Not only will they need to personally evolve to stay relevant, but they’ll also play a crucial role in guiding their teams and organisations through transformation.
Fortunately, though, we’re alive — and the state of living comes with certain advantages.
Search the repository of all knowledge (Google) for the definition of life, and you’ll find it described as: ‘the condition that distinguishes animals and plants from inorganic matter, including the capacity for growth, reproduction, functional activity, and continual change preceding death’.
Unfortunately, despite having enormous capacity for growth and change, humans tend to have a limit to how much they can handle before — to use very scientific vernacular — freaking the hell out.
Yes, humans have a complex relationship with change; an inherent ability to learn and adapt, coexisting with a strong desire for things to stay exactly the same. The potential for evolution competing with a primitive survival response, and a little laziness.
To some degree, tolerance for change varies from person to person, based on temperament and past experiences. We can even be open to change in some areas of our life, yet completely shut off in others. However, there are a few commonalities in the way all humans experience change that occurs on a physiological level.
With the exception of the few rare individuals who thrill to the unpredictable and revel in ambiguity, most humans loathe uncertainty and avoid it at all costs.
This makes sense from an evolutionary perspective. Anything we’re unsure about is a potential threat, so our brain gives it plenty of energy and attention. When faced with a situation where the outcome is uncertain, our brain invests maximum effort to try to tip the outcome in our favour, flooding our striatum with dopamine and triggering a powerful urge for fight or flight.
Unfortunately, this primitive survival response is completely at the expense of higher-order thinking, which would be far more useful to us in most present-day, non-life-threatening scenarios.
Uncertainty also becomes particularly taxing as the duration increases, leading to anxiety as our brains work furiously to predict all the potential outcomes and linger morbidly on the worst-case scenarios.
So fierce is our loathing for uncertainty, research shows we’d prefer to know something bad is definitely going to happen than be unsure about the outcome.
We’ll do almost anything to alleviate the feeling of uncertainty, even if it means choosing a fast and easy fix, rather than taking the time to reach a better solution.
Our brains are wired to crave immediate information and resolve anything incomplete. When there’s a gap in our knowledge, we seek to find answers (the curiosity gap). When there’s a gap in visual information, our brain attempts to fill in the details (gestalt theory). When we’re faced with meaninglessness, we attempt to find meaning. And when we’re faced with uncertainty? Yep, we try to resolve that too.
Happiness researcher Dan Gilbert suggests our dislike of uncertainty is linked to a need for control. Losing control in any aspect of our lives can lead to unhappiness, helplessness, hopelessness and depression.
Sometimes, though, we choose to avoid making any decisions at all. Because, better the devil we know, right?
Change requires letting go of things. It also means investing time, energy, emotions, money and other resources, with no guarantees about the outcome. The result is that we tend to think of doing nothing as the least risky option, even though research shows that in the long term, we’re far less likely to regret doing something than nothing at all.
From losing loved ones, to losing our own life or livelihood, humans are obsessed with the concept of loss.
Yes, our brain has quite the proclivity for negativity, and approaches uncertainty with a decidedly grim perspective. We tend to imagine the worst. Yet although bad things do happen, they’re rarely as bad, nor do they last as long, as we expect. Heck, even when they do, negative events can still have a positive impact on our life, and be remembered fondly as pivotal moments that shape who we become.
What does it take for us to gamble on change?
Research by Daniel Kahneman and Amos Tversky on prospect theory shows that a potential benefit has to be twice as much as we stand to lose for us to take a punt.
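As a rough illustration of that two-to-one threshold (the function below is invented for this sketch, and a loss-aversion coefficient of about 2 is a stylised reading of the research, not an exact constant):

```python
# Stylised sketch of loss aversion: losses loom roughly twice as large as
# gains, so a 50/50 gamble only feels worth taking when the potential win
# is about double the potential loss. Invented for illustration.

LOSS_AVERSION = 2.0  # 'lambda': how much more a loss hurts than a gain pleases

def feels_worth_it(potential_gain: float, potential_loss: float) -> bool:
    # Expected subjective value of a 50/50 gamble, with losses over-weighted.
    subjective_value = 0.5 * potential_gain - 0.5 * LOSS_AVERSION * potential_loss
    return subjective_value > 0

print(feels_worth_it(150, 100))  # False: a $150 upside doesn't offset a $100 risk
print(feels_worth_it(250, 100))  # True: only past roughly 2x do we take the punt
```

Which goes some way to explaining why ‘the change will probably be fine’ rarely persuades anyone: the pitch has to roughly double the perceived downside before it lands.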
We’re also influenced by the sunk cost effect, sticking with a course of action even if it’s not serving us, simply because we’re invested in it. We see this frequently in jobs, relationships, careers, ideas and initiatives.
Similarly, we’re prone to having our perspective twisted by the endowment effect. This describes the tendency to assign greater value to things we own, making it more difficult to let them go.
Finally, our brains have evolved to place more importance on the present than the future, a phenomenon also tied to language. As a result, we often struggle to make the short-term sacrifices needed to achieve long-term goals.
More than anything — change is hard.
Not just figuratively speaking, but actually mentally taxing. It means breaking habits, learning new skills and embracing unfamiliar experiences. It’s far more effort than simply sticking with the status quo.
Learning puts the prefrontal cortex to work and requires us to form new neural connections. It takes time, practice and conscious effort for new skills to become habits, at which stage they can be passed to the striatum to be done automatically.
Even anxiety about change can become a habit if we don’t pay attention to our mindset and make a conscious effort to embrace it.
So, we have ourselves a challenge.
Previous technological revolutions happened relatively slowly, giving people longer to adapt. In many cases, career changes meant transitioning from one unskilled profession to another. Learning happened on the job, and generally involved picking up something heavy and swinging indiscriminately.
This time, though, the pace of change will likely be faster than our inherent ability to adapt. And with technical roles at greatest risk of redundancy, career transitions will require more knowledge in a shorter amount of time.
We no longer have the luxury of time spent languishing in denial, anger or bargaining. We no longer have time for comprehensive (read: protracted) change management frameworks.
Instead, we need to seek every advantage to bring out people’s ability to change while easing the burden.
Communication has always been a crucial factor in business performance, but it’s even more important moving into the future. Whether it’s sharing a vision, strategy, processes, skills or knowledge, without effective communication people can’t change.
It seems paradoxical that in the age of AI, human communication would become paramount. Yet communication is full of crucial nuances that machines struggle to replicate.
A basic bot can communicate with us, but we know intuitively it isn’t human — we feel that something is off. If we want to instigate change, we need to go deeper to connect. Our messaging should consider curiosity, anticipation, surprise, visuals, humour, emotion, narrative, words, names, language, and mode of delivery.
Only a few years ago it was common to be working on five-, even ten-year strategies. Now, the focus is shifting to 12-month roadmaps, with only the haziest vision for beyond. We simply can’t see far enough ahead to plan for longer.
It’s time to do away with lengthy strategies and complex change management frameworks and replace them with punchy strategies, communicated in a way that brings everyone onboard quickly.
As businesses scale, they become increasingly reliant on systems and processes.
As well as improving efficiency, productivity, quality and safety, systems and processes are a legacy of an industrial era model where people were cogs in a machine that could be easily switched in and out with minimum impact.
This isn’t to say systems and processes only serve businesses. They also provide employees with a measure of predictability that relieves mental burden. However, they also — by design — work by building habits, and habits are one of the biggest hurdles to transformation.
As automation and AI replaces monotonous, process-driven tasks, leaders are free to imagine new work structures — creating environments that foster continuous change and frameworks that provide flexible constraints without forming habits.
In a constantly shifting landscape, we need to engineer constants to keep the horizon steady. These could be a vision, values or maxims that act as reference points when everything else turns upside down.
Rather than setting specific long-term objectives that are highly susceptible to unforeseen factors, we thrill to Dr Jason Fox’s notion of fuzzy contextual beacons. These direct focus in the right direction, while allowing the flexibility to roll with change.
Constant change accrues a heavy mental toll which can manifest in negative emotions, including fear, anxiety and anger. Unfortunately, these emotions are also major inhibitors of learning and adaptation.
Obviously, this is the exact opposite of ideal.
During times of transformation, using positive emotions in our messaging can help counter anxiety, promote learning, and ultimately, unlock people’s inherent ability to change. Curiosity is a particularly effective mindset to encourage, as it stifles negative emotions and motivates people to learn.
Belief alone isn’t enough to facilitate change.
To enable real transformation, leaders need to not only share the big picture and instill the belief that change is possible, they also need to support people with the necessary training, resources and structure for it to happen.
As change becomes constant, the traditional model of a three- to five-year tertiary education spent learning technical skills, followed by a forty-year career spent applying and refining them, will no longer serve us.
Career changes will be more commonplace, many requiring considerable reskilling and re-education. Even those employed in the same profession will need to adapt, as certain roles and tasks are automated.
To stay relevant (and therefore: employed) will require a shift in the way we view education — a concept known as lifelong learning.
In many cases, new jobs will emerge before courses exist, and by the time a curriculum is built, some may have already been automated. Not surprisingly, institutions are already considering how to arm graduates with the right skills for a future that might involve multiple career paths, rather than one long career in a narrow academic discipline.
The University of Manchester has a very-excellently-named course, ‘AI: Robot Overlord, Replacement or Colleague?’ And Northeastern University in Boston has an equally well-named strategy called ‘humanics’, where computer science majors take classes in unrelated fields like drama. It acknowledges the need to broaden and humanise our skillset.
Increasingly, the responsibility for learning and development will also move beyond institutions and fall on the individual and the organisation.
It will make sense for leaders to hold onto good people by transitioning them out of redundant roles and reskilling them to fit new positions, rather than adopting a relentless cycle of firing and hiring. People will be valued on cultural fit and human skills, more than the ability to perform a specific technical role.
AT&T is already practicing a similar model, giving employees in redundant roles the choice between participating in learning and development opportunities provided by the company, or leaving with a generous severance package.
The need for lifelong learning will necessitate improvement in two areas.
With more change and less time, leaders will need to rethink the way their organisations approach learning and development.
Rather than cobbling together technical content and emailing it out as a crude frankendocument, greater consideration will need to be given to the psychology behind effective learning and the human-centred communication used to share and embed the knowledge.
This means considering the delivery mode, the right emotional drivers to influence mindset, and incorporating curiosity to inspire active learning. Visualising content will also be fundamental to improving the speed and effectiveness of comprehension and recall.
Virtual and augmented reality are becoming increasingly viable ways to deliver immersive learning experiences. The visual nature of these mediums reduces the cognitive load required to imagine complex scenarios and solve real-world problems. They’ve also been proven to improve motor skills, critical thinking, creativity and empathy.
As learning and development becomes a fundamental business function, great leaders will be those who understand how to share knowledge with their people and teams.
With an estimated 400 to 800 million people looking for new jobs, people, leadership and culture departments will be kept quite busy.
Transitioning people within the organisation will be a priority; however, it’s inevitable that a larger number of people will be coming and going.
Great onboarding experiences have already been linked to better performance, and their value will only increase in the future. Onboarding will ensure people joining the business have the right cultural knowledge, mindset and skills required to work together effectively, and remain with the organisation through constant transformation.
Outboarding will also become a critical consideration. It’s time to leave behind exit surveys and design better exit experiences. These are valuable opportunities to ease the trauma of change and leave people with positive feelings towards the brand. Beyond kindness and decency, in instances where ex-employees are also potential customers or consumers, it also has the potential to impact the bottom line.
While a degree of automation may be necessary to deal with volume, it should never come at the expense of the human touch. Face-to-face touchpoints will continue to be essential, while bots and other automated touchpoints will need to consider how information is delivered in a distinctly human way.
It’s not surprising that, paralleling our increasing interaction with technology, we’re also seeing a trend back towards the human.
The toll of interfacing with technology for extended periods has become increasingly evident in the past few years. Could you have imagined needing a digital detox ten years ago? Me neither. But now, bloody well sign me up and take my phone, tablet, computer, television, console…
Technology has changed the way we communicate and socialise, and not always for the better. Social media has been linked to mental health issues, and social isolation has become enough of a concern for the United Kingdom to appoint a Minister for Loneliness, which sounds like a character in a Harry Potter movie, but apparently is completely legit.
It’s no coincidence that, despite our ability to exist without face-to-face contact, we’re craving rewarding human interactions and experiences.
Inside organisations we’ve seen Human Resources rebrand as People, Leadership and Culture, or (even better) Employee Experience. There’s also increasing evidence to show the value companies are placing on human skills.
The 2018 Workplace Learning Report by LinkedIn found that the top learning and development focus for executives, people managers, and talent developers isn’t role-specific skills but the so-called soft skills — social skills, communication skills, character traits, attitudes, career attributes, and social and emotional intelligence.
Of these skills, executives rated leadership (65 per cent) slightly ahead of communication (64 per cent), followed by collaboration (55 per cent). Another study from Evolve revealed that in scientific recruitment, the most in-demand skills are the ability to work cooperatively, flexibly and cohesively.
There’s already good reason for upskilling in these areas. A 2018 Willis Towers Watson survey of 10 million employees across 500 global organisations linked proficiency in human skills directly to financial performance and employee culture. To succeed, high-performing companies needed to show excellence in leadership, image and competitiveness, communication, and career development. These soft skills far outweighed operating efficiency, work tools and conditions, and pay and rewards.
The 2013-2014 Change and Communication ROI Study Report by Towers Watson found a strong relationship between financial performance and effective communication going all the way back to 2003. They identified that companies that communicated effectively demonstrated 57 per cent higher shareholder returns over a five-year period, and were three and a half times more likely to significantly outperform peers. According to Creative Communications and Training, Inc, communication errors cost companies with more than 100 employees an average of $420,000 annually.
Given that people will remain essential to work, at least for the foreseeable future, the leaders and businesses that succeed will be the ones who embrace human skills.
Looking ahead, the value humans will bring to a role won’t be opposable thumbs, niche technical skills or the ability to mindlessly follow a process. It will be human skills — communication, leadership, problem solving and non-linear thinking, flexibility, critical thinking, interpersonal skills, creative application of knowledge, empathy and collaboration.
While machines have the advantage in a narrow field, our edge comes from cross-disciplinary knowledge and the ability to make connections between seemingly unrelated concepts. And although even the most advanced machines struggle with semantics, we have the inherent capacity to communicate and connect like, well, humans.
The importance of human skills will be particularly relevant to frontline and middle managers. As systems, processes and repetitive technical tasks fall to automation, human skills will be the difference between advancing into leadership positions and fading quietly into irrelevance.
Meanwhile, strong leaders will be increasingly sought after. These positions won’t be filled by individuals with technical prowess, but by those with the human skills needed to rally people and organisations through change.
Much of our dominance as a species can be attributed to our ability to work together. While we evolve physically at a similar rate to other animals, none can cooperate like we can to develop tools, advance our knowledge and work together towards a common goal.
This inherent ability to collaborate is closely linked to our ability to communicate.
Early in our existence, millennia passed with relatively little technological advancement. Then, suddenly, everything leapt forwards. The past two centuries have seen dramatic advances in medicine, manufacturing, travel and human rights. And they occurred in parallel with giant leaps in communication: the telegraph, telephone, television and the internet.
A coincidence? Unlikely. These developments were a result of wider cooperation enabled by new technology that allowed us to communicate across vast distances to massive audiences. Suddenly, we could see exactly what was happening around the world. Instead of existing in relative isolation, we could build on the work of others.
Today, businesses can operate globally for exactly the same reason. We can bring together teams from around the world to work together for a common cause. And as we face greater challenges, collaboration promises to be more important than ever before.
For organisations to succeed in the future requires more than a few leaders with vision. It will take everyone being on the same page and working together effectively. This is where the human skills like empathy, emotional intelligence, and communication are crucial.
Our homage to human skills is by no means a rally against technology. Oh no, we aren’t advocating a march back into the caves to live as luddites. Impossible and undesirable!
Instead, let’s embrace the possibilities that exist at the intersection of technical and emotional intelligence. Let’s consider how humans and machines can coexist, drawing upon the strengths of both to improve our workplaces, organisations and industries.
Yes, in an unexpected, but by no means unwelcome discovery, it appears that the rise of robots might actually herald a more human era.
With increasing access to mindlessly efficient entities, businesses can finally stop treating their people like machines, gifting leaders the freedom to imagine new ways of working that bring out the value only humans can provide.
What a bloody wonderful time to be alive.