Archive for December, 2012

Retooling product development

The success of companies like RIM, Sony, P&G, Apple and GE depends heavily on their ability to develop and launch compelling products on time and on budget. Alas, few organizations are consistently successful. Plenty of market and academic research shows that upwards of 90% of all new products fail to hit corporate objectives within 3 years, resulting in missed market opportunities, wasted capital, and damaged executive careers. These breakdowns have many culprits, as we have written previously. According to a recently published article in the Harvard Business Review, the primary cause of new product failure lies in flaws within the conventional product development approach. Addressing these shortcomings can improve a firm’s ability to successfully design and launch new, innovative products, thereby improving time to market, enhancing product appeal and reducing business risk.

In our experience, a new product’s evolution follows a common trajectory: project managers are tasked to deliver mission-critical yet ambiguously defined new products on time and on budget. In turn, managers cajole their teams to write detailed plans, remain frugal around program spending, and minimize schedule variations and downtime. Changing market and client feedback, as well as shifting internal priorities, complicate matters. Managers never seem to have enough resources to get the job done. All the while, their superiors continue to insist they adhere to predictable schedules, budgets and deliverables. This rigid approach, which may work well in turning around under-performing manufacturing lines, can actually hurt the product development effort.

The article’s authors, Stefan Thomke and Donald Reinertsen, identified a number of fallacies in the standard product development process that produce delays, compromise quality and raise costs.

Sticking to “great” plans

Many companies put inordinate faith in their new product plans. However, we have never seen a project plan remain unchanged throughout the design and commercialization process. Slavish adherence to a first-generation plan – no matter how well crafted – will often lead to poor outcomes, since product requirements often change, new customer insights are discovered and operational processes take time to solidify. This is not to say that upfront planning is wrong; managers just need to ensure their kickoff project plans and goals are flexible enough to adjust to changing financial assumptions, new customer and channel feedback and actual operational lessons.

The higher the resource utilization, the better

Conventional wisdom says that the busier the product development group, the more efficient they become.  The reality is often the opposite.  After a certain point, more work inevitably leads to more setbacks and lower quality work.  These diseconomies of scale trace to the intrinsic variability and opaqueness of development work. Delays can result from multiple projects queuing, the difficulty of managing stressed knowledge workers, and the challenge of working with semi-completed actions and milestones.

Another unexpected problem with running a fully utilized product development group is the tendency for executives to start too many projects, reflecting shifts in corporate priorities or personal agendas. Furthermore, product managers and developers abhor idle time and look to further their careers with new initiatives. They tend to launch more projects than they can realistically complete given existing resources and reasonable timelines. All of these organizational issues dilute resourcing across projects and trigger delays for the truly high-priority projects.

Get it right the first time

As in life, it is challenging to get things right from the get-go when there is a high degree of uncertainty. In virtually every company we have worked with, there is inordinate pressure on the team to get the execution right the first time. The problem with this imperative is that it often leads to sub-optimal results (especially when linear project management approaches like Six Sigma are crudely employed). Specifically, the team will often default to the least risky – not the ideal – solution; true project costs tend to be pushed out beyond the initial project horizon; and employees have little incentive to pursue innovative alternatives. This last point can be particularly dangerous because employees will cling to bad ideas longer than they should.

Organizations can take a variety of steps to overcome these failings, including:

  1. Treat the project plan as a work-in-progress that should evolve to reflect new information and conditions;
  2. Commence projects only when there is a full organizational commitment and sufficient resourcing to ensure success;
  3. Utilize parallel, not linear, project planning schemes to avoid delays and illuminate problems earlier in the implementation process;
  4. Employ quick feedback mechanisms instead of first pass success to rapidly inculcate new customer and operational learnings;
  5. Tolerate some resource slack to preempt project delays and foster creative problem solving.

For more information on our services and work, please visit the Quanta Consulting Inc. web site.

Cutting the cost of IT

In most organizations today, IT is firmly planted near the top of the strategic agenda. Businesses continue to require new software and hardware to interact with customers, manage supply chains, and process transactions. However, the days of CIOs getting a blank check for the latest IT application are long gone. Infrastructure and operations (I&O) cost reduction is now an important priority. Even after multiple rounds of cost cutting over the past few years, many CEOs and CFOs continue to look hungrily at IT budgets, which can now approach 15-20% of total spending in many companies. Fortunately, opportunities abound. A proactive and systematic cost reduction initiative could reduce IT expenditures by 10% in the short term, and by 25% over the following 3 years.

According to Gartner Research, I&O costs make up 60% of the typical enterprise IT budget. These costs encompass all the activities that deliver IT to the organization, including facilities, hardware, software, services, labour and network costs. Up to 80% of these costs fall into 3 omnibus areas: data center operations, network fees and supporting the lines of business. Shaving these expenditures is a major opportunity in most firms. In a 2011 survey of IT executives, Gartner found that only a minority of companies were more than halfway down their IT cost savings path.

There is no magic bullet for reducing IT expenditures while ensuring ‘always on’ computing remains responsive to dynamic business needs. Our work with savvy CIOs has identified many cost reduction best practices, including:

Consolidate IT

Significant savings of 15-20% can be garnered by consolidating IT through server rationalization, moving to standardized software platforms, negotiating better IT provider terms and optimizing the data center. For example, many IT managers, out of habit or risk aversion, put all their computing needs in the most robust and secure data centers. This need not be the case. Lower-tier requirements (e.g., development and testing environments) and applications (e.g., training, HR) can be placed in lower-tier facilities with minimal business impact. Furthermore, lower-tier facilities can still be used for hosting production environments and critical applications if they use virtualized failover – where redundant capacity kicks in automatically – and the loss of session data is acceptable (as it is for internal e-mail platforms, for example).

First virtualize, then buy

Most IT infrastructures operate at less than 15% capacity on average due to uneven demand, decentralized purchasing and “siloed” resourcing. Driving up utilization through grid or virtualized computing is a cheaper and easier option than buying expensive hardware and software and building new data centers to handle the new assets. “Dedicated infrastructure will usually be an order of magnitude lower in utilization than an intelligently shared infrastructure,” said Gary Tyreman, CEO of Univa Corporation. “Using grid computing to share infrastructure across multiple applications is more efficient, saves money and simplifies capacity planning and governance.” We have seen many companies use server virtualization and grid computing to boost IT utilization rates in excess of 75% while reducing energy, facilities and operating costs.

Target power and cooling efficiencies

Power and cooling are significant cost centres and barriers to higher IT utilization. Many companies can cut operating costs by 5-20% by deploying energy-efficient power and HVAC equipment and making simple infrastructure upgrades. Furthermore, augmenting cooling can also boost scalability. In many cases, older data centers have dated air-conditioning systems that limit the amount of server, storage, and network equipment that can be placed in these sites. Capacity can often be inexpensively and quickly improved by upgrading infrastructure cooling efficiency, using free cooling and installing energy management systems.

Troubleshoot better

Adding hardware, software and facilities isn’t always the most direct or effective way of making applications more available. The vast majority of IT downtime is the result of architecture, application or system design flaws, not hardware or software problems. Instead of looking first to upgrade the infrastructure, smart firms are adopting integrated problem management capabilities that get to the root cause of problems, significantly reducing infrastructure costs and maximizing application up-time. Additionally, major cost savings can be gained by pushing IT support down from expensive tiers to lower, less expensive tiers that can satisfactorily resolve users’ issues. Right-sizing IT support should include the deployment of low-cost, self-service portals to handle issues like password resets and ‘how-to’ queries.

These days, the cost of IT is too big to be ignored. CIOs can quickly increase IT’s return on assets and operational performance without increasing business risk by thoroughly understanding their cost base (and how it compares to their peers’); diligently pursuing ‘low-hanging’ cost reduction opportunities; and deploying new architectural and virtualization schemes that deliver more IT for less money.

5 sources of growth in 2013

North America is mired in a low-growth funk driven by cautious consumer spending and frugal capital expenditures. For 2013, many CEOs are bracing for zero or even negative revenue growth. Even high-growth companies are adjusting to this ‘new normal’ by continuing to restrain R&D, sales & marketing and M&A activity. Is this reaction a tad premature? Have firms exhausted all avenues for growth? Since 2008, we have helped a variety of dynamic companies drive top-line growth by an average of 27% by identifying market ‘white space’ and monetizing under-utilized assets. Managers should explore these 5 areas to propel their 2013 business:

1.    Find under-serviced or ignored niches around your core offering

The fluid nature of many categories and consumer segments hides a number of market anomalies that can be exploited by nimble firms. For example, most markets can support different strategic positions, including low-cost, specialized and premium offerings. Some categories, however, are missing one of these players, offering opportunities for new and differentiated entrants.

Other companies will discover adjacent “white space” – an ignored market or a compelling, unmet consumer need – where they can extend their strong brand franchises. P&G has done this successfully by launching Crest White Strips, extending their Oral Care line-up from toothpaste and brushes into the whitening business, and by launching an array of Swiffer products to clean various surfaces in addition to their other cleaning lines. “Cutting costs is important, but you cannot shrink your way to growth. You’ve got to reinvest the savings in distinctive value-added products and services that customers are happy to pay for,” said Tim Penner, retired President of P&G Canada.

2.    Increase revenue from current customers

Most firms inadvertently leave money on the table, often with their best customers. This occurs for a number of reasons, including over-zealous discounting; poor visibility into the customer’s potential value; low customer awareness of the vendor’s full offering; or ineffective cross-selling programs. The fact is, opportunities exist in every customer relationship and company. We designed a revenue maximization program for a software company that plugged billing leaks and better aligned pricing to value delivered, painlessly producing an 18% revenue lift. In another case, we helped a U.S. industrial goods manufacturer double their cross-selling rates by mining their customer data with advanced analytics and developing targeted sales and marketing initiatives.

3.    Turn platforms into new revenue generators

Following significant capacity, infrastructure and IT investments over the last decade, many firms now have robust but under-utilized operational platforms that can be leveraged into new revenue opportunities. Amazon has successfully pursued this strategy. Early on, they recognized the potential of their B2C e-commerce platform and launched a host of new B2B services, including cloud computing, online storage and merchant e-commerce services.

4.    Maximize all distribution opportunities

Many marketing strategies have not kept pace with the buying habits of customers, who increasingly direct their purchases through a plethora of direct and indirect online and offline channels. Filling these distribution gaps is an ideal way to build volume and outflank competitors. For example, we helped a consumer products company drive an 18% increase in shipments by gaining ‘bricks and clicks’ shelf space in non-traditional retail and B2B channels. Furthermore, firms can no longer ignore the revenue, margin and customer experience benefits of going direct to the consumer.

5.    Monetize intellectual property and process by-products

In some firms, healthy investments in R&D and strategic partnerships have spawned a significant amount of intellectual property. Much of this IP may now be lying dormant due to lower commercialization investment or a shift in corporate strategy. Organizations should look to monetize inactive IP through outright sale or by licensing it to non-competitive third parties. In addition, many companies, like Cook Composites and Polymers, have discovered that there is gold in the waste by-products of their manufacturing processes. Turning waste into new products can create new streams of high-margin revenue and improve sustainability performance.

In tough times, prudent companies will seek to maximize revenue by better leveraging their existing customer base, resources and capabilities. To realize this potential, managers will need to re-examine their entire business, including enhancing their understanding of the market ecosystem, mining their consumer data and finding unique and compelling ways to serve customers.

Bad publicity can drive sales

Receiving bad publicity has always been a double-edged sword for companies. The old maxim that there is no such thing as bad publicity is tempered by a variety of academic studies demonstrating that damaging news hampers sales. New research published in the Harvard Business Review takes a more nuanced position on this age-old question and suggests that for some products in certain industries, negative news can trigger a sales lift.

Professors Jonah Berger (Wharton), Alan Sorensen and Scott Rasmussen (both from the Stanford Graduate School of Business) analyzed the sales patterns of 250 fiction books reviewed in the New York Times between 2001 and 2003. The researchers compared sales patterns before and after each book received a review from a critic.

As expected, good reviews increased sales of all books, by 32% to 52%. Books by established authors that received negative reviews saw their sales, not surprisingly, fall 15% on average. Interestingly, unknown authors who received bad reviews saw their sales spike 45% on average. The sales increases occurred even when the review was cutting. In one case, a book with an adverse critique like “the characters do not have personalities so much as particular niches in the stratosphere” saw its sales increase by over 400%. The elapsed time following a bad review was also shown to have an important impact. Bad reviews initially hurt all books, but the negative sales effect diminished more quickly for unknown authors.

These findings have many real-world counterparts. According to the authors, a $60-a-bottle Tuscan wine experienced a 5% sales lift after a popular on-line reviewer likened its smell to “stinky socks.” The Shake Weight, a vibrating dumbbell widely panned in the media (“The most ludicrous fitness gadget of all time?” asked one newspaper), racked up $50M in sales. Why did these poor reviews lead to a sales boost?

With a relatively unknown product, the value of the increased awareness generated by bad publicity significantly outweighs the ill effects of the negative evaluation. Moreover, the impact of the negative review quickly tapers off, leaving higher product awareness and little memory of the bad assessment.

Our experience working with crisis PR teams suggests that marketers regularly over-estimate the negative impact of bad publicity on their target audience. Given the frantic life of the typical individual and the level of media “noise” in today’s society, most people do not have the attention span or inclination to pay close attention to the details or context of most negative publicity. All they tend to remember is the product name – which, the authors have shown, has a positive business effect.

These findings have important implications for personal brands (say, for an entertainer or athlete) and for companies, especially those whose products are content- or experience-based, like books, movies, video games, theme parks, music and theatre.

  1. Don’t be so quick to squash negative publicity for a new, obscure or undifferentiated product;
  2. When faced with bad publicity, pursue damage control for well-established brands such as consumer electronics, cars, video games, software or apparel that depend heavily on pre-launch marketing programs and expert reviews;
  3. Consider undertaking controversial public relations tactics to increase the awareness of new products or brands that are slated to be re-launched.
