Does investing in employees’ marketing skills pay off? Or is it just a waste?
Businesses spent nearly $94 billion on corporate training in 2017, a 33% increase over 2016. Per employee, expenditures ranged from $399 at large companies to $1,886 at smaller organizations, according to the same report from Training magazine.
Within marketing departments, an estimated 4.2% of the total marketing budget now goes to training programs, up from 2.7% in 2014. The Association of National Advertisers' CMO Talent Challenge Playbook highlights success stories from marketing training investments:
- After Unilever rolled out a training program to 5,000 marketers in 2016, it reported a "35% uplift in knowledge" and "96% uplift in confidence."
- Graduates of IBM's E.School now produce marketing content that "performs better on average than content previously produced."
- Fifteen months after implementing their "New Modern Marketing Curriculum," Johnson & Johnson's team showed a "statistically significant increase" in prioritized skill areas.
What about your team? Are you spending too much? Too little? Are you training the right skill sets? Is training a good investment? A wasteful expense? Do you even know?
The State of Marketing Training
Some 23% of marketers surveyed by HubSpot identified "training our team" as a top challenge. "Hiring top talent" was immediately below, at 22%.
Companies trying to hire their way out of a skills gap face a competitive marketplace. LinkedIn's May 2018 Workforce Report revealed a 230,000-person shortage in the United States for marketing skills, with demand highest in major cities.
That tight talent market has pushed training to the forefront, especially training to develop new marketing capabilities. The biannual CMO Survey anticipated a 6.5% increase in investment for "how to do marketing." No other increase in marketing knowledge development, such as the transfer of internal knowledge or the honing of market research skills, topped 3.9%.
As a segment of the training market, however, marketing lags behind the industry's stalwarts: sales and leadership training.
Still, marketing's share of the corporate training budget is significant: $17 million annually for large companies (10,000+ employees), per Training magazine. Mid-size companies (1,000–9,999) spend an average of $1.5 million per year; small companies (100–999) invest around $375,000.
Those figures are broadly consistent with a 2016 Brandon Hall Group Benchmarking Study, which surveyed training spending for equivalent tiers: $13 million (10,000+ employees), $3.7 million (1,000–9,999), and $290,000 (100–999).
In recent years, most of that money has tried to close a single marketing skills gap: digital.
A Skills Gap Marketers Don't Know They Have
The Digital Marketing Institute's 2016 report "Missing the Mark: The Digital Marketing Skills Gap in the USA, UK & Ireland" lays bare marketers' shortcomings. Only 8% of those tested achieved entry-level digital marketing skills, and the perception of skill exceeded performance: 51% of U.S. marketers perceived a skill level that only 38% demonstrated via testing.
A 2018 analysis of client data by General Assembly, which has benchmarked more than 25,000 marketers with its "Digital Marketing Level 1" skills assessment, found no correlation between seniority and expertise (among those below the vice-presidential level) and cited "data and measurement" as the biggest skills gap.
"It's not uncommon for us to hear, 'We don't know what we don't know,'" noted Alison Kashin, an Education Product Manager at General Assembly who focuses on digital marketing training. Kashin elaborated:
Most corporate marketers have outsourced digital execution to agencies, and clients now realize they're too far removed to be effective. It's hard to give direction, ask the right questions, or make confident decisions if you don't know how something works.
Marketers' Self-Inflicted Wound
The yawning skills gap is, in part, self-inflicted. As the Digital Marketing Institute's report notes, "The general consensus among employees is that the pace of technological and digital change within their organizations is too slow, and that factors such as a fear of loss of control, especially among employees aged 35–49 years, is hindering its adoption."
The push to close the skills gap also has the potential to create tension with agency partners, who at times transfer knowledge that reduces the need for their services. As Rhea Drysdale, CEO of Outspoken Media, explained:
Companies want to train their team so they can handle more internally, and that makes sense. They see our work as a means to an end. More often than not, that end is team growth.
"This exact scenario happened last year with an enterprise-level professional services company," Drysdale continued. "Our advocate went from managing one person to a dedicated team that included a data person, an SEO, an editor, and developers. We're still working with them but as a consultant on project scopes."
Digital marketing isn't the only skills gap disrupting the industry, in-house and agency alike.
Further Fronts in Marketing Training
Niches like account-based marketing (ABM) have seen rapid growth in recent years as well.
"The top question we get around education and training development is account-based marketing," stated Rob Leavitt, Senior Vice President of the Information Technology Services Marketing Association (ITSMA). "There is a hunger and demand for ABM, and it's far beyond us."
Leavitt believes ABM training has been a reaction to the digital wave, which can confuse interested individuals with interested accounts:
If I download four whitepapers to understand something relevant to my client, I look really interested, but I'm not a relevant account for you. So how do we take what we've learned in digital and overlay an account-based strategy and approach?
At times, the skills gap comes full circle. Just as experienced marketers may hesitate to invest themselves in digital, newer marketers, Leavitt cautioned, risk undervaluing traditional skill sets: "More experienced people feel more comfortable with soft skills: collaboration, leadership, teamwork, etc."
For every marketer, there's need. For every facet of marketing, there's training. But can training close the skills gap?
Does Marketing Training Work?
Few executives know.
In Learning Analytics: Measurement Innovations to Shape Employee Development, authors John Mattox, Jean Martin, and Mark Van Buren reveal that, when it comes to training, some 96% of CEOs want to measure one aspect more than any other: impact.
How often is it being measured? Just 8% of the time. Another 74% of CEOs want to connect money spent on training to money earned: the return on investment (ROI). It's measured just 4% of the time.
Measuring the business impact of training is possible. But individual knowledge gains donât guarantee company-wide improvements.
Recognizing the Limits of Training
On August 8, 1963, a band of 15 robbers stole £2.6 million in cash from a mail train traveling between Glasgow and London. Media outlets dubbed the heist "The Great Train Robbery."
In 2016, Harvard Business School (HBS) Professor Michael Beer and TruePoint researchers Magnus Finnstrom and Derek Schrader reappropriated the moniker to allege a similarly monumental fraud: "The Great Training Robbery."
Despite the ominous title, the authors are less critical of training programs per se than the "fallacy of programmatic change," which mistakenly focuses on individual behavior change as a way to shape institutions. Their findings suggest the inverse is true:
The organizational and management system (the pattern of roles, responsibilities, and relationships shaped by the organization's design and leadership that motivates and sustains attitude and behavior) is far more powerful in shaping individual behavior.
Evaluating the Corporate Climate
Additional work by another HBS professor, Amy Edmondson, distills the prerequisites for effective training programs down to a single metaphor: the need for a corporate climate to provide "fertile soil," a psychologically safe environment in which subordinates can voice opinions freely. Only fertile soil, in turn, can allow the "seeds" of individual training to germinate.
Beer et al.'s work found that just 10% of training programs surveyed had the fertile soil necessary to derive value from training. Too often, they lament, the rush to invest in individual training protects obdurate executives (or spares the HR representatives who would need to confront them) rather than addressing core organizational or leadership issues.
Those findings align with Kashin's experience working with clients at General Assembly:
There are layers of team structures, technology, planning processes, etc., that need to be re-examined to be successful in digital. Most corporate programs have an element of change management. The most success occurs when we support a larger change-management effort that has been set in motion with strong internal leadership.
Measuring the Success of Marketing Training
Even with strong organizational support, how do you know if a marketing training program works?
"It usually looks like 'program success,'" according to ITSMA's Leavitt. "Clients look at basic satisfaction with the education training: Did it seem like a good use of time? Have we been able to develop the program and succeed? Are we hitting our targets?"
For Kashin, numbers are only part of the picture: "At the core of every one of our success stories are individuals who were motivated to learn and change, and highlighting their stories is one of our most powerful and rewarding ways of showing value."
"The reality of a lot of these programs," Leavitt summarized, is that "education training is hard to measure. A lot of it is qualitative, informal. We know it when we see it. We've not cracked the code."
Jack Phillips believes he has. Phillips, an expert on determining the value of training programs with a doctorate in Human Resource Management, is chairman of the ROI Institute:
We don't like "estimates," but our choices are to do nothing or claim it all. Neither one is any good. Quantitative data is more believable. Executives understand it quite clearly. Our challenge is to make and defend credible estimates if quantitative data isn't available.
That combined measurement, exhausting quantitative data sources while communicating qualitative ones persuasively, begins with identifying KPIs.
Identifying KPIs for Marketing Training
"Sales and marketing tend to have the same metrics," explained Phillips. "Increase existing customers, acquire new customers, increase client quality, etc."
(A scan of marketing-specific KPIs highlighted by training firms also reveals a list of familiar metrics: number of qualified leads generated, cost per qualified lead, marketing staff turnover rate, and marketing staff productivity.)
When attempting to identify KPIs, a common mistake is failing to translate a problem into its underlying metric. For example, "poor copywriting" may be a marketing problem, but improvement can't be measured unless marketing executives identify an underlying business metric, like conversion rate, that can show the effects of successful training.
According to Phillips, identifying KPIs is far easier than parsing the influence of factors that may affect them: What if an improvement to an ad campaign drives more qualified visitors to a landing page? Or a recent website redesign increases site speed?
Isolating the impact of marketing training, Phillips asserted, is the key to unlocking assessment methods that can demonstrate ROI. Still, the math can quickly become complex. So can the cost of measurement. On average, companies spend just 4% of the total training budget on measurement; most spend less than 1%.
Many models, Phillips' included, outline progressive levels of measurement to help companies scale accountability based on resources.
The Phillips Measurement Model
Phillips uses a five-level model (an optional sixth level assesses intangible values: job satisfaction, organizational commitment, teamwork, etc.):
- Reaction: Did participants like it?
- Learning: Did they learn from it?
- Application: Did they apply their new knowledge on the job?
- Impact: Did the training have a business impact?
- ROI: What was the value of that impact, and was it a good investment?
While authoritative, Phillips' model is not the only one. Models by Donald Kirkpatrick and Josh Bersin are also widely used. (General Assembly uses a version of the Kirkpatrick model.) The Kirkpatrick model allows for immediate post-training measurement, while the Bersin model folds values such as efficiency and utility into Phillips' approach.
Levels 1–3: Generating a Baseline Measurement
The initial levels of measurement include assessments such as post-training surveys to measure trainee satisfaction as well as tests or instructor evaluations to measure knowledge transfer.
Phillips believes the first two levels are sufficient for a baseline measurement of knowledge transfer. Additional levels of measurement connect training outcomes more closely with business metrics and monetary returns, but those insights come at a cost.
Kashin concurred: "Measuring behavior change and business impact is something we always encourage, but it requires a fair bit of investment on the client side."
Measuring behavior change ("Application" in the Phillips model) also requires a time lapse (Phillips suggests three months) but can be as simple as a retest of training knowledge or a follow-up survey about trainees' perception of its enduring value.
Levels 4–5: Bridging the Gap between Training Costs and ROI
To complete a five-level measurement with "Impact" and "ROI," companies must identify a business outcome (e.g. web leads), assign it an accurate monetary value (e.g. dollar value of a web lead), and isolate the impact of training from other factors.
Phillips offers quantitative and qualitative options to isolate the impact of training:
- A/B testing. Find two similar groups of marketers within your organization (e.g. teams in roughly equivalent markets located in different cities or countries). Offer training to one and not the other. Measure the difference in performance between the two based on a key metric (e.g. the change in sales-qualified leads in the three months before and after training).
- Trend analysis. Use past performance to project expected progress of a given metric. Measure the actual outcome after training. The impact of training is the difference in performance between the two lines.
- Modeling/forecasting. If training aligns with changes to other variables (e.g. advertising campaigns), subtract the known impact of that variable from the total change; the remaining change is the impact of marketing training.
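The trend-analysis option can be sketched in a few lines of code. The fitting function and all figures below are hypothetical illustrations, not drawn from Phillips' materials:

```python
# Trend-analysis sketch: project the pre-training trend forward and
# treat performance above that projection as the training's impact.
# All lead counts are hypothetical.

def linear_trend(values):
    """Fit y = a + b*x by ordinary least squares over x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Monthly sales-qualified leads: six months before training ...
pre_training = [100, 104, 109, 113, 118, 122]
# ... and three months after.
post_training = [135, 142, 150]

a, b = linear_trend(pre_training)

# Project the pre-training trend into the post-training months.
projected = [a + b * x for x in range(len(pre_training),
                                      len(pre_training) + len(post_training))]

# The residual above the projection is attributed to training.
impact = [round(actual - expected, 1)
          for actual, expected in zip(post_training, projected)]
print(impact)  # → [8.4, 10.9, 14.5] extra leads per month
```

The same projection-and-subtraction logic underlies the modeling/forecasting option; the only difference is that the known effects of other variables are removed from the residual first.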
To identify outside variables that affect progress toward business metrics, Phillips leans on experts within the organization, asking questions like:
- Would this trend have continued if we had done nothing?
- Is it market growth?
- Did anything else happen in the environment that affected market share?
- Were there other marketing promotions?
- Did prices change?
- Were there added incentives for related groups, like sales staff?
- Did competitors shift strategies or enter/exit the marketplace?
- Estimates. Conduct surveys of clients or marketing staff. Ask clients to identify the channels or efforts that made them aware of a product or influenced their purchase. Survey marketing staff about the degree to which various marketing efforts (training included) influenced results.
If, say, a digital marketing training program, an online advertising campaign, and a website redesign all launched in the past three months, ask marketing staff to weight the effect of each, multiplied by their confidence in each estimate.
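That weighting arithmetic can be sketched as follows; the factors, shares, confidence levels, and dollar figure are hypothetical:

```python
# Estimate-based isolation: staff assign each factor a share of an observed
# improvement, then each share is discounted by their confidence in it.
# All numbers are hypothetical.

observed_gain = 20_000  # e.g. dollar value of the quarter's extra qualified leads

# factor -> (estimated share of the gain, confidence in that estimate)
factors = {
    "marketing training": (0.40, 0.80),
    "ad campaign":        (0.35, 0.90),
    "website redesign":   (0.25, 0.70),
}

# Confidence-adjusted share for each factor.
adjusted = {name: share * confidence
            for name, (share, confidence) in factors.items()}

# Only the training's adjusted share is claimed as training impact.
training_impact = observed_gain * adjusted["marketing training"]
print(f"Training's adjusted share: {adjusted['marketing training']:.0%}")  # 32%
print(f"Attributed impact: ${training_impact:,.0f}")  # $6,400
```

Discounting by confidence deliberately understates the claim, which is what makes the estimate defensible in front of executives.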
As Phillips argued:
When you combine these estimates from a group of people, they become powerful. Our effort is always to go to the most credible process first. If we can't use a mathematical approach, we'll use estimates, and we'll defend them.
Time and again, Phillips has seen the "confidence" adjustment account for human error effectively. (Phillips cited Jack Treynor's jelly bean experiment as corroborating evidence.)
- Expert opinions. When collective estimates by customers or staff aren't available, use internal experts or key managers to make individual estimates.
"The key is to ask the right person and collect it the right way," Phillips explained. Finding the "right" person or conducting a survey the "right" way is open to interpretation. But it is, Phillips insisted, no less necessary:
You have to do it. You can't just say, "We'll take full credit for it," and life is good. Executives will require you to sort it.
Translating the business impact into ROI requires two additional steps:
- Converting the business impact to a monetary value (e.g. the dollar value of a whitepaper download)
- Determining how many dollars are returned above and beyond the initial investment in marketing training
Importantly, an ROI calculation differs from a benefit-cost ratio in format (a percentage versus a ratio) and formula (ROI subtracts program costs from benefits before dividing by costs).
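The two calculations side by side, using hypothetical program figures:

```python
# ROI vs. benefit-cost ratio (BCR). Both figures below are hypothetical.
program_costs = 50_000      # total cost of the training program
program_benefits = 120_000  # monetary value of the measured business impact

# BCR divides total benefits by total costs (expressed as a ratio).
bcr = program_benefits / program_costs

# ROI subtracts costs first, then expresses net benefits as a percentage.
roi_pct = (program_benefits - program_costs) / program_costs * 100

print(f"BCR: {bcr:.1f}:1")     # BCR: 2.4:1
print(f"ROI: {roi_pct:.0f}%")  # ROI: 140%
```

Note how the same program looks different under each lens: a 2.4:1 benefit-cost ratio corresponds to a 140% ROI, because ROI counts only the dollars returned beyond the original investment.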
Even without a complete ROI calculation, assessing the "business impact" of training, when supplemented with a list of intangible benefits, can be a powerful defense at multiple levels within an organization, C-suite included.
Need alone (digital marketing skills today, account-based marketing skills tomorrow) may continue to fill training budgets and grow training programs. But measurement challenges will neither fix nor excuse skills gaps in marketing departments.
"To some extent," Leavitt concluded, ruminating again on the question of ROI, "when your clients come back for more, they're happy with what they got the first time." It's a purely qualitative measurement.
Still, robust, quantitative ROI models, though more persuasive in the C-suite, lean on qualitative components, too. All measurements can be defended; all surpass a failure to measure anything at all.
No assessment, however, can answer broader questions: Is training currently the best use of marketing resources? Does the commitment to change extend to the highest levels of the organization?
In short, is it the right season? Is the soil fertile? If yes, then plant the seed. And grow.
The post Making Marketing Training Work: Closing Skills Gaps, Proving Value appeared first on CXL.
Source: Conversion XL