
Retaining employee expertise

 

It’s an organizational fact of life that talented and experienced people will move on, whether through headcount reductions, promotions or leaving for a better job. The knowledge that departs is often vital to a firm’s capabilities and a key source of best practices. This flight often weakens the company, raising costs and risk. To preempt this outcome, leaders should develop talent plans for key employees that capture organizational history, best practices and customer insights.

Many kinds of knowledge are at risk of disappearing with key employee turnover. This expertise might cover key business relationships, customer insights, or an implicit understanding of how the organization really operates or how a product was designed. Information risk is especially acute in times of business distress or when circumstances (e.g., a merger) overwhelm deliberate thinking. Beyond risk mitigation, documenting and sharing expertise in a timely fashion is a great way of driving continuous improvement and minimizing costs.

Two of our previous client engagements illustrate the hazards of not retaining wisdom:

Product firm outsources key activities

This company decided to outsource an important business process, making the operational team redundant. Despite a long implementation period, little attention was paid to retaining the team’s institutional knowledge. This was a fateful omission. The firm no longer had the expertise to effectively manage the outsourcer, resulting in higher costs and reduced operational performance. Furthermore, the lack of know-how prevented the company from exploiting innovations that could have improved consumer satisfaction.

IT company is acquired

A rapidly growing software firm was purchased by a large IT services company. As buyers are apt to do, they quickly brought in their own teams and processes, and rationalized many of the functions including sales and product development. This process was handled clumsily. Incoming managers spent too little time understanding the informal work practices employees used to get their jobs done. Moreover, they did little to preempt expertise gaps through knowledge transfer or by retaining key people as consultants. These omissions created significant problems around client retention, customer service and software upgrades.

In both cases, outcomes would have been better if these companies had codified and managed their expertise, transferred knowledge in a timely fashion and archived important historical information in accessible places. The reality for many organizations, unfortunately, is the opposite. The proficiency of a small team or even a single person can be a challenge to re-accumulate when needed. Vital know-how (especially implicit knowledge that is never written down) is often spread over many people or buried in IT silos. In fact, losing implicit knowledge may represent the biggest danger because managers may not even know it existed in the first place.

How can firms avoid these pitfalls?

First, recognize this issue is not about better severance packages or employee engagement.  Key people will leave; you just have to manage the risk, and work on better documenting and sharing their expertise.  Catherine McIntyre, SVP Strategy and Development at LoyaltyOne, believes capturing institutional knowledge is crucial.   “My experience at LoyaltyOne and P&G shows it’s essential to do and it definitely pays out in many ways. Like most leadership tasks, it takes planning and showing we truly supported the work by participating in some meaningful way.”

This can be achieved by exploring three key questions:

Which employees pose risks and opportunities?

Who are your experts in the key roles? Usually, they will be long-serving employees who manage customer relationships or design products. These people may not always be high in the organization or be the ones with the most seniority.

What do we need to learn from them?

What are the best practices, indispensable skills or work habits needed for important tasks? You will often need to go deep to understand the “art” of the job. In key accounts, for example, who are the decision makers, gatekeepers and influencers? Or, what “pitch” seems to work best?

How will we capture this knowledge in a sensible way?

High performance companies bake information sharing into every employee’s job description and performance plans. We also recommend regular team and department debriefs, as well as 1:1 mentoring with those workers most likely to graduate to key roles.  McIntyre takes a comprehensive approach to talent management. “I’ve used a variety of approaches, appropriate for the type of knowledge to be captured and the existing culture. This included special functional training as groups of people were promoted into new levels, job shadowing for those being groomed for next roles, and case study competitions to encourage documenting the most current knowledge.” Companies looking to be more strategic in their approach may want to develop apprentice programs, encourage more inter-department job mobility and look to create internships with key suppliers, especially outsourcers.

For more information on our services and work, please visit the Quanta Consulting Inc. web site.

 

Big Data boosts advertising

Much has been written about the transformational role of Big Data in improving business performance, and the usefulness of data analysis has spread to almost all aspects of business. Most recently, ad-development managers have been able to use Big Data to measure and improve the performance of their traditional and digital advertising programs and tie them more closely to corporate goals. A thought leadership piece by Wes Nichols published in the March 2013 issue of the Harvard Business Review highlights a new framework for designing and implementing cutting-edge advertising analytics.

In the dynamic world of digital and traditional advertising, channel proliferation and social media, any improvement in measuring and refining performance will have an immediate impact on the bottom line and the brand. Traditionally, advertisers have struggled to realistically measure the performance of their creative and media plans. They have been forced to link sales data with a small number of variables such as media reach and frequency, using a handful of rudimentary analytical tools like media-mix modeling, surveys, click measurement and focus groups.

This popular approach has some significant drawbacks. First, it evaluates each medium (e.g., TV, print, digital) independently, and not collectively as consumers in the real world experience them. Second, it is very difficult to measure the impact of one advertising variable (increased banner ads, for example) on another variable like awareness. Finally, these tools do not easily connect advertising activity back to changes in consumer behaviour like purchase.

Recently, a new set of specialized Big Data methodologies has emerged that allows managers to improve both the effectiveness and efficiency of their advertising plans. Powerful techniques and technologies can now mine terabytes of data in real time across hundreds of different marketing and business variables in search of key correlations. The insights gleaned can then be used to dynamically adjust media spend and creative execution for optimal performance.

In his Harvard Business Review article, Mr. Nichols outlines a three-step approach to leveraging next-generation advertising analytics:

Attribution: Gathering data and attributing revenue and strategic contribution to each tactic. In many companies, this exercise could involve hundreds of variables, ranging from marketing initiatives to economic factors and competitive actions.

Optimization: Using predictive analytics to measure the potential outcomes of different business scenarios based on the interrelationship between tactics and changing market variables. For example, what will happen to sales revenue if you boost online advertising in Ontario, cut it in Quebec and increase prices in the Maritimes?

Allocation: Re-allocating marketing and advertising spend based on the learnings gleaned from the Optimization phase. Ideally, the most successful programs would gain additional funding while others would see less support.
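To make the mechanics concrete, here is a minimal sketch of the three steps in Python. It is our illustration only: the data, column names and single linear regression are hypothetical stand-ins, whereas the approach Mr. Nichols describes runs across hundreds of variables with far richer models.

    # Illustrative only: data, column names and the single linear model are
    # hypothetical stand-ins for a much richer, many-variable analysis.
    import numpy as np
    import pandas as pd

    # Hypothetical weekly data: spend per channel and observed sales.
    data = pd.DataFrame({
        "tv":      [120, 100, 130, 110, 125, 115],
        "digital": [ 40,  55,  50,  60,  45,  65],
        "print":   [ 30,  25,  35,  20,  30,  25],
        "sales":   [560, 540, 600, 565, 580, 590],
    })
    channels = ["tv", "digital", "print"]

    # 1. Attribution: regress sales on channel spend; each coefficient is a
    #    crude estimate of the sales lift per dollar spent on that tactic.
    X = np.column_stack([data[channels].values, np.ones(len(data))])
    coefs, *_ = np.linalg.lstsq(X, data["sales"].values, rcond=None)
    lift = dict(zip(channels, coefs[:-1]))

    # 2. Optimization: score a what-if scenario under the fitted model,
    #    e.g. shifting $10 of weekly spend from print to digital.
    delta = 10 * (lift["digital"] - lift["print"])
    print(f"Predicted sales change, $10 print -> digital: {delta:+.1f}")

    # 3. Allocation: re-weight next period's budget toward the tactics with
    #    the highest estimated lift (negative estimates clipped to zero).
    budget = 200.0
    weights = np.clip([lift[c] for c in channels], 0, None)
    plan = {c: round(budget * w / weights.sum(), 1)
            for c, w in zip(channels, weights)}
    print("Reallocated budget:", plan)

Even at this toy scale, the design point holds: attribution yields per-tactic estimates, optimization tests scenarios against them, and allocation feeds the learnings into the next period’s budget.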

We have witnessed a number of companies use an approach similar to Mr. Nichols’ to generate a 20-40% improvement in marketing effectiveness and efficiency.

A case in point is Electronic Arts, one of the world’s leading gaming software companies. They were looking to boost marketing performance by going beyond simple measurement tools and managerial judgment. The company decided to use the attribution, optimization and allocation process on the marketing plan of a new game, Battlefield 3. Hundreds of variables were analyzed including sales results, online chatter, pricing data, advertising reviews and distribution information. The predictive analytics uncovered some important insights. For example, a favoured tactic (in-theatre advertising) was under-performing. Second, digital marketing performed better than previously thought. And finally, the media launch plan was sub-optimal. These learnings helped the firm revamp the introductory marketing plan of Battlefield 3, making this launch the most successful in the company’s history.

Despite the potential of Analytics 2.0, firms need to approach it systematically and with common sense, as implementation can be a challenge. We have seen analytics projects flounder due to poor data quality and reporting, weak compliance (e.g., data hugging), inadequate management support and insufficient IT capabilities. Moreover, good judgment and creativity are still vital in the creative and media planning process. Glen Hunt, creator of many memorable ads including Molson Canadian’s “I am Canadian”, says: “Big Data represents a big opportunity, but it does not negate the importance of ‘blink’ test intuition and experience. After all, ‘not everything that counts can be counted, not everything that can be counted counts.’ Or so says, Einstein.”

Big Data has the potential to revolutionize advertising measurement and evaluation, truly delivering higher marketing performance at lower cost. Companies looking to build and leverage these new capabilities would be wise to make them strategic priorities, choose the right business or product beachhead to kick off, and earmark the necessary mandate, resources and investment.

For more information on our services and work, please visit the Quanta Consulting Inc. web site.

Influence and social media marketing

Social media marketing is a large and growing part of every company’s marketing budget and plan. Conventional wisdom says leveraging “influential” people like friends or celebrities can trigger many others to do new things like purchase a product or join a community.  However, new research published in the Harvard Business Review challenges this view and suggests a significant amount of social media investment and focus is being improperly allocated.

Over the years, millions of dollars of social media investment have been directed at finding and leveraging influential consumers who will virally persuade others to try a new product or service. On the surface, this notion makes a lot of sense. Marketers can improve effectiveness and efficiency by targeting only those customers who will use their product and trigger others to do the same. Much of the rationale behind this premise dates back to Malcolm Gladwell’s book, The Tipping Point, which explored why some ideas take off and others don’t. Does this “cause and effect” hold up to statistical scrutiny?

No, or at least not yet.  In a social media universe, it still is very difficult to separate influence from other factors in a purchase decision.  “Real influence depends on personalized and engaged relationships,” says communications pundit John Barker of Truenote. “However social media often dilutes digital relationships to the point where ‘influencer’ impact becomes increasingly abstract.”

To get closer to a definitive answer, NYU management professor Sinan Aral conducted a number of experiments to understand who and what is most influential, and who is most predisposed to their influence.

Results

We know from psychology that human behaviour clusters among friends over time. What we don’t know is whether this is due to peer influence or another factor, such as similar interests. In one experiment, Aral studied the adoption of a mobile service product within the 27 million-member Yahoo! instant messenger network. The research used the latest analytical models to separate social influence (i.e., how a friend’s usage or recommendation impacts another’s decision to use the product) from another factor, homophily. A sociological phenomenon, homophily is the preference of individuals to associate with, have the same habits as, and like the same things as other people (the proverbial ‘birds of a feather flock together’ concept), even if these people have no direct connection to each other.

Interestingly, Aral found that traditional measurement models overestimated the impact of a friend’s social influence on purchase decisions by a factor of seven. Furthermore, these models overstated the role of social influence early in a product’s life cycle (just when a trend would be expected to begin). In fact, his research shows half of the perceived influence could be attributed to homophily effects alone. Early adopters tend to be so much alike that social influence plays a lesser role. To see this phenomenon in action, check out the people standing in line at an Apple Store before a new product launch.
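A toy simulation shows how homophily can masquerade as influence. The sketch below is our illustration, not Aral’s model (his work applied sophisticated matched-sample estimation to real network data), and all of its numbers are hypothetical: adoption is driven entirely by a shared trait, yet a naive comparison still ‘detects’ strong peer influence.

    # A toy simulation (all numbers hypothetical): adoption here is driven
    # ONLY by a shared trait, yet a naive estimator still "finds" peer
    # influence because friends tend to share the trait (homophily).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    trait = rng.random(n) < 0.3  # 30% of users have the relevant taste
    # Homophily: a user's friend usually shares the user's trait.
    friend_trait = np.where(rng.random(n) < 0.8, trait, rng.random(n) < 0.3)

    # Adoption depends only on one's own trait -- no peer influence at all.
    adopted = rng.random(n) < np.where(trait, 0.40, 0.05)
    friend_adopted = rng.random(n) < np.where(friend_trait, 0.40, 0.05)

    # Naive estimate: how much likelier are you to adopt if your friend did?
    naive = adopted[friend_adopted].mean() / adopted[~friend_adopted].mean()
    print(f"Naive 'influence' ratio: {naive:.2f}x")  # well above 1x

    # Crude matched-sample estimate: compare like with like (same trait).
    for t in (True, False):
        m = trait == t
        ratio = (adopted[m & friend_adopted].mean()
                 / adopted[m & ~friend_adopted].mean())
        print(f"Within trait={t}: ratio {ratio:.2f}x")  # ~1.0x, no influence

Conditioning on the trait in the final loop, a crude stand-in for the matching Aral’s models perform, makes the spurious influence disappear.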

Another experiment looked at the role of social influence versus a common digital marketing program in the downloading of a Facebook app. Aral found that while personal invitations from a friend had a higher per-message response rate (6%, versus 2% for an automated notification), the automated messages still generated much better overall results, boosting adoption by 246% versus 98% for personal invitations, largely because far more automated messages went out.
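The arithmetic behind this result is worth making explicit: a lower per-message rate applied to a much larger volume yields more adopters. In the sketch below, the response rates come from the experiment but the message volumes are hypothetical.

    # Per-message response rates are from the experiment; the message
    # volumes are hypothetical, chosen only to illustrate the scale effect.
    personal_invites = 10_000  # hand-written invitations: low volume
    auto_messages = 200_000    # automated notifications: high volume

    adopters_personal = 0.06 * personal_invites  # 600 adopters
    adopters_auto = 0.02 * auto_messages         # 4,000 adopters
    print(adopters_personal, adopters_auto)      # volume beats rate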

To sum up, it appears that outdated analytics and shaky strategic assumptions are leading marketers to rely too much on social influence-based tactics at the expense of more traditional yet successful homophily-driven (i.e., segment-based) programs.

Implications

This research has significant implications for a firm’s marketing strategy and planning, especially around new product launches.

  • Peer-to-peer tactics (e.g., referral incentives) designed to leverage social influence will be less effective and more costly than previously thought. This is not to say that P2P programs should be abandoned; rather, they would be more effective if introduced later in a product’s roll out.
  • Well-designed digital and traditional advertising and promotion tactics that target discrete segments (based on homophily characteristics) will be more effective and efficient, at least initially.
  • Marketers will benefit from using the latest Big Data models so they can design plans with the optimal mix of influence and traditional-based tactics.

Exploiting Big Data may hold the key to super-charging the role of social influence. According to Mr. Barker, “Big Data can now provide customized social influence at scale. Building a virtuous circle of ‘mass personalization’ that is both deep and broad could be the ‘tipping point’ for digital marketing. Think influence on steroids.”

For more information on our services and work, please visit the Quanta Consulting Inc. web site.

Sync your supply chain and business strategy

It is self-evident that a company’s supply chain should be aligned with its core business strategy and value proposition. For example, a retailer following an everyday low-price positioning should have a supply chain built to minimize cost and maximize inventory turns, potentially at the expense of other capabilities such as innovation or sustainability. Yet, our research suggests many organizations retain supply chains that are out of sync with their core business goals, leading to lower financial and market share results. Fixing this problem is part analytics, part strategic planning and part organizational redesign.

Syncing your strategy and supply chain is a ticket to superior performance. There are many examples of market-leading firms with strategic congruency, including Dell, Walmart, Nordstrom, Cisco and McDonald’s. These firms diligently manage their supply chains to support their core positioning and deliver superior value, not to mention creating industry barriers to entry. For example, Walmart has achieved outstanding operational performance by developing sophisticated inventory management, logistics and procurement systems. These capabilities have played a key role in Walmart delivering on its everyday-low-price brand promise while achieving industry-leading margins and profitability. In another case, Dell vaulted to the leadership of the PC industry in the 1990s by offering low-cost and customized products through a build-to-order manufacturing model backed by extensive procurement and inventory-management competencies.

Most companies, however, are not strategically coherent. This can occur for a variety of reasons. For example, firms competing in multiple product categories face a myriad of competing demands from different product teams and functional departments, leading to a convoluted supply chain design and a bloated product portfolio. In other cases, weak centralized management control combined with an outsourced, global supply chain will often result in misalignment. Finally, some firms do not possess a consistent strategic position in their marketplaces. Instead, supply chain decisions ebb and flow depending on short-term market conditions rather than long-term considerations like sustaining a differentiated market position.

One of our consulting projects illustrates the causes and dangers of supply chain incongruence. We were engaged by a customer-driven industrial goods company to help fix its customer satisfaction problem. The company was losing revenue, facing higher cost-to-serve expenses and experiencing historically low customer satisfaction scores. Its distributors were being short-shipped on high-velocity items and incurring extra costs through persistent errors in order filling. After a thorough analysis, we discovered the problem was not localized to the logistics group (as management had assumed) but had to do with the design of the supply chain. Over time, major parts of its operations had drifted away from the firm’s core positioning around maximizing customer satisfaction. Specifically, production planning was under-resourced, and the product management and procurement teams had quietly (and independently) shifted their focus towards launching multiple products and aggressive cost reduction respectively. Our solution got their supply chain back on track by enhancing product life cycle management policies, improving order fulfillment capabilities and moving production to a more flexible manufacturing model.

Senior leaders need to develop the right supply chain and capabilities for their business strategy and keep them aligned. To do this, we recommend a simple three-step approach:

Clarify

Although many companies pursue a hodge-podge of strategies, they tend to focus on a couple of parameters (singly or in combination) like cost leadership, premium positioning, or service excellence.  However, many managers may not know what levers drive their success and what to leave for their competitors.  By undertaking a thorough strategic planning process, leaders will understand their ‘winning’ positioning, where they merely need to meet competition and what they can ignore because of poor strategic fit.

Prioritize

Misalignments often occur when short-term management decisions undercut the optimal supply chain model. This is understandable given the dynamic nature of some markets and quarterly financial imperatives. One example could be the launch of a cost-savings initiative for a premium car brand: the purchasing department may choose the lowest-cost, but least reliable and innovative, parts suppliers. Managers need guiding policies and discipline to ensure their supply chain decisions and capability investments efficiently reinforce their core business strategy and value proposition.

Measure

As the saying goes, you can’t manage what you don’t measure. I will add the truism that you need to measure the right things, too. Unfortunately, numerous organizations rely on incomplete metrics that do not measure the link between corporate strategy and supply chain design. Leaders must identify and focus on key performance indicators (KPIs) that reinforce strategic coherence. For example, one KPI, ‘shipments on-time and complete’, is a good proxy in customer-driven product companies for supply chain performance areas including production, customer service and logistics.
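As a concrete illustration, here is a minimal sketch of how such a KPI might be computed. The field names and pass/fail logic are our assumptions; in practice, the definition (sometimes called OTIF: on-time, in-full) is agreed with customers and the data comes from order-management and logistics systems.

    # A minimal sketch of a 'shipments on-time and complete' KPI.
    # Field names are our assumptions, not a standard schema.
    from dataclasses import dataclass

    @dataclass
    class Shipment:
        promised_day: int  # committed delivery day
        actual_day: int    # actual delivery day
        qty_ordered: int
        qty_shipped: int

    def otif(shipments: list) -> float:
        """Share of shipments that arrived on time AND in full."""
        if not shipments:
            return 0.0
        hits = sum(1 for s in shipments
                   if s.actual_day <= s.promised_day
                   and s.qty_shipped >= s.qty_ordered)
        return hits / len(shipments)

    orders = [
        Shipment(5, 5, 100, 100),  # on time and complete -> hit
        Shipment(5, 7, 100, 100),  # late                 -> miss
        Shipment(5, 5, 100, 90),   # short-shipped        -> miss
    ]
    print(f"OTIF: {otif(orders):.0%}")  # 33%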

For more information on our services and work, please visit the Quanta Consulting Inc. web site.