Hello,
You know, I spend an awful lot of time talking about AI these days.
I may even have learned a thing or two in the process:
A lot of people who previously had no need to know about AI suddenly feel compelled to understand it. And with good reason: we have all seen the reports that AI is here to take our jobs. Some reports say the robots won't even stop there. (For my part, given that humanity has destroyed 69% of the planet's wildlife since the 1960s, I'd say we're still our own biggest threat. And hey, we're due our own threat after all we've done.)
People in marketing, journalism, customer support, and management consulting attend my sessions. These are typically not "technical" roles, in the sense of needing to create and execute computer programs. Nonetheless, they now need to understand how to use AI tools in their roles. It's not as straightforward as the transition to email was, for example. Knowing how, when, where, and why to use AI requires a new way of thinking about work.
In a meeting this week, a colleague in skills development put this neatly. People who are in the process now need to work on the process.
I have interviewed a number of business leaders recently who are concerned about this new way of operating. The idea is that all employees should know the strengths and weaknesses of AI, as well as their own strengths and weaknesses as individuals, to develop more effective ways of working.
Few of these leaders believe they can simply drop this new technology into the company and watch it take off. They recognise that their employees were trained to work in an altogether different way.
Many of the people who attend my workshops bemoan their leaders' lack of AI knowledge, too. They fear they will be replaced by AI, as their bosses understand neither what the technology does, nor what their employees do all day.
The result is that we are left with a huge skills gap, but also a lack of confidence from all sides.
I speak to "technical" colleagues and they don't see just how significant this is. To them, AI skills can only be technical. How else would one truly "use" AI?
Throughout 2023, my workshops have improved progressively as I have dropped the technical jargon and instead tried to communicate what AI means in terms people can grasp. The intended impact is not that people have an overly simplified concept of what AI can do.
Rather, the intended impact is that they start to think in more "algorithmic" terms. I just don't say it like that. For instance, I try to show how working processes can be reordered to include AI. We work together to break tasks down into their components, then assign these sub-tasks to different owners. These are fundamental concepts we can all understand, but only if put in language and scenarios we can relate to.
As a result, we're starting to understand what "AI skills for non-technical roles" should mean.
2018: "Roseanne" returned, we had a royal wedding, and I didn't close the digital skills gap
Today, I was looking through a presentation I put together in 2018 to support a report I wrote about the "digital skills gap". It's full of well-intentioned, data-driven insights that I'd stand by today. Which, in its own way, rather proves the redundancy of the report. Five years on, that skills gap has only grown. My report didn't close it, against all odds.
Back in 2018, we found that just 35% of companies had a plan for upskilling employees for the future of work. Not a good plan, just a plan of any kind. 68% of employees said they received fewer than five days of digital skills training each year.
The two most popular reasons for this lack of focus on skills? No budget and no time.
There is a sense of urgency about generative AI right now, yet for many in L&D departments worldwide it is really just the tipping point. Businesses have needed this rethink for many years now already.
We should use the opportunity to galvanise support for this overdue reskilling, while the interest is still there. What I learned by looking back to 2018 is that these plans really do mean very little unless they are put to work.
Categorising AI skills
After months of waving my hands around to try and convey what these AI skills really are, I have arrived at the categories below.
There are "master" skills that everyone needs to develop. These include strategic thinking, problem-solving, and collaboration.
There are then "technical" skills that many roles require, albeit to differing degrees. These include data literacy and customer insights, along with about ten others we have identified so far.
And at the top of the pyramid, we have the functional skills. These are the ones that relate directly to a job discipline, such as content creation or search marketing. We're starting with marketing, but the same idea applies across industries.
My challenge with so many existing skills frameworks is that they start at the functional level. People learn how to make incremental improvements in their niche part of the professional world, using specific technologies. Said technologies soon become redundant, along with the skills.
As another colleague put it recently: "We need to teach people to drive, not to drive a specific make and model of car."
Individuals can then build their own path by stacking these blocks. The bottom two layers open a range of possibilities and apply across different disciplines. This should mean that it is quicker and easier for people to move across departments at a later stage; the lack of that mobility is a massive barrier for a lot of large companies today.
So far, so theoretical. I have had plenty of healthy, "vibrant" discussions about this framework of late, but there is a broad consensus that we need a mindset shift as well as new "hard" skills.
The challenge lies in bringing this to life. As you'll know, we have built digital marketing simulations at Novela and certainly learned a thing or two about interactive learning.
We are now working with (and seeking more) early partners to build out the first simulations for AI skills. We will build the simulations to suit the needs of these early partners, so it's worth getting in touch if it sounds like what you need…
A simple example would be the one below, containing a pun I love but cannot lay claim to (it's from The Simpsons).
As you can see, the user is dropped into a business challenge that they must navigate with help from their AI control panel. The challenge teaches a master skill and a technical skill, for which the user will receive points towards their development plan.
And that's not all - these major simulations (2-3 hours to complete) are supplemented by skill drills (10-minute activities), micro-lessons, and peer learning.
I'd love to hear how your company is approaching this, and do get in touch to discuss our plans further. I've been trying to "crack" this problem for a number of years now!
What does "AI skills for non-technical roles" mean in your organisation?
We'll go live in early 2024 with our first AI skills simulations 🤖🤖
Nonsense statistic of the week
"45% of CFOs have rejected marketing proposals because they doubted the ROI."
I saw this on a sponsored post on LinkedIn (where else?) and I can't imagine that the real figure is anywhere south of 100%.
So, the Department of Justice is focusing on the real moneymaker behind Google: ads. It alleges that Google's dominance lets it raise prices for advertisers with few repercussions - a claim backed up by Google ads executive Jerry Dischler on the stand.
When it needed to hit pesky things like quarterly targets, Google would just up the prices in its auctions - sometimes by 5% or even 10%. That means it made advertisers pay significantly more for the clicks they received through Google Ads.
As an internal Google email put it:
"We're shaking the cushions"
Subsequent emails have shown that Google used rather sketchy means to increase the number of impressions it could serve through search advertising. This matters because if there are more searches, there is more inventory for Google to sell.
I'm not dedicating a huge amount of space to this because it's completely unsurprising. I was talking about this a long time ago, not because Google is uniquely evil, but rather because this is what people do when under pressure to hit a target and in control of the levers that would allow them to hit that target.
It is always, always, just a little man behind the curtain.
I genuinely think there are millions of people who believe what they are told by large corporations. They must do; that belief keeps the whole show running. There are millions of people in marketing alone who are in awe of Google, believing everything they do must be the result of some interplanetary complexity. Or, perhaps even more naïvely, "for the good of users".
🐦 Now this is more like it
Bird Buddy, my favourite company, has opened up its Explore app to non-customers. Bird Buddy is a smart birdfeeder that uses AI to recognise birds as they land and then sends the owner a video of the encounter. It's a rare idea that uses tech to bring people closer to nature, and I wish there were a lot more like it.
Since I cannot afford one of the birdfeeders, I can now watch birds land on feeders around the world. (Owners must opt in to this feature.) It's like TikTok for the enlightened. Facebeak.
I'll leave it there - see you next time!