Archive for the ‘IT’ Tag

Overcoming the pain of Technical Debt

Many businesses are hamstrung by expensive and inflexible information technology. To wit: The average firm’s spend on IT has swelled to the equivalent of between 4 percent and 6 percent of revenue, thanks in part to neglect, poorly executed integrations and the breakneck speed of technological change.

While the exact toll of lost productivity and hampered innovation for any given firm is difficult to quantify, it’s safe to say that the true cost of IT is greater than what appears on a company’s ledger. Research firm Gartner estimates the total cost of poor systems architecture, design and development will reach US$1 trillion in 2015. Put another way, that’s an average of US$1 million per organization, according to analytics firm Cast Software, and US$3.61 per line of code.

This hidden expense is referred to as “technical debt.” Reining in technical debt is an ongoing challenge for IT leaders because the cost of lost opportunities is tricky to peg while the cost of modernizing legacy systems is immediately tangible and often significant. But understanding technical debt is vital for organizations angling to improve performance through new technologies, improved agility and tighter cost controls.

I first encountered the dangers of technical debt when I did consultant work for a medium-sized manufacturer. In our search for savings, we found that maintaining one legacy system was consuming nearly 85 percent of the firm’s IT maintenance budget while rendering the integration of new applications difficult and risky. Worse, support activities were diverting scarce resources away from growth-enabling automation initiatives.

In that instance the firm was able to successfully phase out the old system while phasing in a new, more effective and cost-efficient replacement. But the question remains: Why did the firm’s IT leaders run up so much technical debt in the first place?

“The challenge is twofold,” explains Mike Grossman, founder of IDI Systems, an automation development firm that regularly confronts technical debt in the course of infrastructure projects. “First, how can you economically and practically support current processes and business capabilities with existing — and potentially deteriorating — code, tools and processes? And second, how and when are you going to transition these old systems to support your new business objectives?”

Think of a legacy IT system as an old clunker. The driver understands that buying a new car is cheaper and easier in the long run, but either doesn’t have the down payment on hand or can’t spare a day without wheels. So instead of efficiently getting where they need to go, they’re stuck trying to keep an old car running by repairing old parts and adding new ones.

Where the metaphor falls flat, however, is in underscoring the value proposition of abandoning the old. The difference between a messy legacy IT system and a modern, fully integrated and efficient one is greater than the difference between an old car and a new one. While either vehicle will get you where you want to go, a world-class IT system can take your firm places that your current infrastructure would never allow. This is due to the opportunities for innovation that arise from a top-notch system.

That’s not to say that eliminating technical debt is as simple as hiring a team of developers to rebuild your infrastructure from the ground up. Before any such decision is made, consider the following steps:

  • Calculate your existing technical debt. To do this, compare the capabilities of your current software and hardware to industry-leading versions.
  • Determine your firm’s goals. Consider both the extent to which your current activities depend on your legacy system and what new functionality you will require for future, growth-generating activities.
  • Identify and align around the priority areas for remediation.
  • Find and deploy talent to replace or redesign legacy systems.
  • Measure and track progress at a senior level along the way.
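
As a starting point for the first step above, the industry-average per-line figure cited earlier can be turned into a rough, back-of-envelope debt estimate. The sketch below assumes a hypothetical 2-million-line code base and uses the Cast Software average of US$3.61 per line; treat the output as an order-of-magnitude figure, not a benchmark.

```python
# Illustrative back-of-envelope technical-debt estimate. The
# US$3.61-per-line figure is the Cast Software industry average
# cited above; the 2-million-line code base is hypothetical.

COST_PER_LINE = 3.61  # US$ per line of code (Cast Software average)

def estimated_technical_debt(lines_of_code: int,
                             cost_per_line: float = COST_PER_LINE) -> float:
    """First-pass remediation-cost estimate for a code base."""
    return lines_of_code * cost_per_line

debt = estimated_technical_debt(2_000_000)
print(f"Estimated technical debt: US${debt:,.0f}")
# -> Estimated technical debt: US$7,220,000
```

Comparing that number against the annual cost of maintaining the legacy system gives a crude payback horizon for a replacement project.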

And remember: Even after you’ve successfully upgraded your IT systems, the threat of running up technical debt remains. This is due both to the changing nature of technology and of business. While senior leaders ought not to obsess over technical debt, keeping a vigilant eye on the efficiency and capabilities of IT operations can be the difference between running in place and forging forward.

For more information on our work and service, please visit the Quanta Consulting Inc. web site.

Cutting the cost of IT

In most organizations today, IT is firmly planted near the top of the strategic agenda. Businesses continue to require new software and hardware to interact with customers, manage supply chains, and process transactions. However, the days of CIOs getting a blank check for the latest IT application are long gone. Infrastructure and operations (I&O) cost reduction is now an important priority. Even after multiple rounds of cost cutting over the past few years, many CEOs and CFOs continue to look hungrily at IT budgets that can now approach 15-20% of total spending in many companies. Fortunately, opportunities abound. A proactive and systematic cost reduction initiative can reduce IT expenditures by 10% in the short term and by 25% over the following three years.
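
To make that savings trajectory concrete, here is a quick illustrative calculation. The US$50M starting budget is hypothetical, and the 25% is read here as a cumulative three-year target.

```python
# Illustrative arithmetic for the savings trajectory cited above:
# 10% off I&O spend in the short term, 25% (read here as a
# cumulative target) over the following three years.
# The US$50M starting budget is hypothetical.

budget = 50_000_000  # annual I&O spend, US$

short_term_savings = budget * 0.10   # year-one reduction
three_year_savings = budget * 0.25   # cumulative three-year target

print(f"Short-term savings: US${short_term_savings:,.0f}")
print(f"Three-year target:  US${three_year_savings:,.0f}")
```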

According to Gartner Research, I&O costs make up 60% of the typical enterprise IT budget.  These costs encompass all the activities that deliver IT to the organization, including: facilities, hardware, software, services, labour and network costs.  Up to 80% of these costs fall into 3 omnibus areas:  data center operations, network fees and supporting the lines of business.    Shaving these expenditures is a major opportunity area in most firms.  In a 2011 survey of IT executives, Gartner found that only a minority of companies were more than halfway down their IT cost savings path.

There is no magic bullet for reducing IT expenditures while ensuring ‘always on’ computing remains responsive to dynamic business needs. Our work with savvy CIOs has identified many cost reduction best practices, some of which include:

Consolidate IT

Significant savings of 15-20% can be garnered by consolidating IT through server rationalization, moving to standardized software platforms, negotiating better IT provider terms and optimizing the data center. For example, many IT managers, out of habit or risk aversion, put all their computing needs in the most robust and secure data centers. This need not be the case. Lower-tier requirements (e.g., development, testing environments) and applications (e.g., training, HR) can be placed in lower-tier facilities with minimal business impact. Furthermore, lower-tier facilities can still be used for hosting production environments and critical applications if they use virtualized failover (where redundant capacity kicks in automatically) and the loss of session data is acceptable (as it is for internal e-mail platforms, for example).

First virtualize, then buy

Most IT infrastructures operate at less than 15% capacity on average due to uneven demand, decentralized purchasing and “siloed” resourcing. Driving up utilization through grid or virtualized computing is a cheaper and easier option than buying expensive hardware and software and building new data centers to handle the new assets. “Dedicated infrastructure will usually be an order of magnitude lower in utilization than an intelligently shared infrastructure,” said Gary Tyreman, CEO of Univa Corporation. “Using grid computing to share infrastructure across multiple applications is more efficient, saves money and simplifies capacity planning and governance.” We have seen many companies use server virtualization and grid computing to boost IT utilization rates in excess of 75% while reducing energy, facilities and operating costs.
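
The consolidation arithmetic behind those utilization figures is worth spelling out. The sketch below uses a hypothetical 500-server fleet; the 15% and 75% utilization rates come from the paragraph above.

```python
# Sketch of the consolidation arithmetic implied above: if the same
# workload runs at 75% average utilization instead of 15%, far fewer
# servers are needed. The 500-server fleet is hypothetical.
import math

servers_before = 500   # dedicated servers (hypothetical)
util_before = 0.15     # average utilization pre-virtualization
util_after = 0.75      # target utilization on shared infrastructure

workload = servers_before * util_before           # demand in "server units"
servers_after = math.ceil(workload / util_after)  # hosts needed after consolidation

print(f"Servers required after consolidation: {servers_after}")  # 100
reduction = 1 - servers_after / servers_before
print(f"Fleet reduction: {reduction:.0%}")                        # 80%
```

The energy and facilities savings mentioned above follow directly from retiring the surplus hosts.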

Target power and cooling efficiencies

Power and cooling are significant cost centres and barriers to higher IT utilization.  Many companies can cut 5-20% in operating costs by deploying energy-efficient power and HVAC equipment and making simple infrastructure upgrades. Furthermore, augmenting cooling can also boost scalability.  In many cases, older data centers have dated air-conditioning systems that limit the amount of server, storage, and network equipment that can be placed in these sites.  Capacity can often be inexpensively and quickly improved by upgrading infrastructure cooling efficiency, using free cooling and installing energy management systems.

Troubleshoot better

Adding hardware, software and facilities isn’t always the most direct or effective way of making applications more available. The vast majority of IT downtime is the result of architecture, application or system design flaws, not hardware or software problems. Instead of looking first to upgrade the infrastructure, smart firms are adopting integrated problem management capabilities that get to the root cause of problems, significantly reducing infrastructure costs and maximizing application up-time. Additionally, major cost savings can be gained by pushing IT support down from expensive tiers to lower, less expensive tiers that are able to satisfactorily resolve users’ issues. Right-sizing IT support should include the deployment of low-cost, self-service portals to handle issues like password resets and ‘how-to’ queries.

These days, the cost of IT is too big to be ignored. CIOs can quickly increase IT’s return on assets and operational performance without increasing business risk by thoroughly understanding their cost base (and how it compares to their peers’); diligently pursuing ‘low-hanging’ cost reduction opportunities; and deploying new architectural and virtualization schemes that deliver more IT for less money.

Do you have too much IT?

These are heady times for technophiles. New technologies like mobile computing, data analytics, social networking, and cloud computing have propelled IT back to the top of corporate agendas. However, in the rush to exploit new applications, many companies can easily overindulge in IT, with negative repercussions on cost, ROI and organizational performance.

In today’s competitive economy, IT exuberance is understandable. Managers want to use breakthrough technologies to serve customers better, improve performance and wring out more cost savings from operations. At the same time, nobody wants to go through the carnage of the early 2000s, when firms threw away $130B in IT spending between 2000 and 2002 (source: Morgan Stanley). Furthermore, CEOs can no longer ignore the high cost of IT in their search for bottom-line savings. In some firms, the IT budget is now approaching 12-15% of total corporate spending.

Managers are faced with a dilemma:  how do you take advantage of new technologies (if they are any good) without overspending and distracting the business?  Based on our research and client experience, we recommend the following maxims:

1.  IT must follow business strategy, not the other way around – Typically, many managers look to get the latest applications, functionality and hardware before they understand how these would fit into the corporate strategy and workflows, or because they succumb to common phenomena like ‘feature creep’ or ‘keeping up with the Joneses.’  As a result, much of the IT purchased does not end up being deployed or effectively utilized.  There are a variety of reasons for this, including uneven management attention, insufficient employee training and poorly articulated requirements.

When strategy and goals dictate what resources are needed and when, less IT is inevitably purchased and more of it is utilized.  To make this happen, firms should tweak their cultures in two ways.  First, business sponsors should take responsibility for better understanding existing IT assets and capabilities.  They should jointly propose with IT technical solutions that align to business needs and corporate strategy.  Second, the IT department must adopt an ‘inside-out’ approach to recommending technology.  To do this, it must be congruent with business goals, strategy and plans before seeking out the ideal IT solution.

2.  The organization is the focus – The role of IT is to support the organization, not the other way around.  It is common for impatient managers to throw IT resources at what appears to be a business problem, when in fact it is the workflows, structure and policies that are the issue.   Leaders need to first make sure the organization’s roles & responsibilities, decision rights and processes are optimized before considering new IT resources.

In addition, firms need to recognize that IT is an aid to judgment, not a replacement for it.  A case in point is data analytics.  The potential of new DA technologies to better segment customers or identify operational improvements is hard to resist.  However, managers need to tread carefully to ensure their organizations have the capabilities, skills and focus to fully leverage the power of DA and implement its insights.

3.  IT simplicity should be the goal – Not surprisingly, the typical IT department is a mishmash of hardware, applications, operating systems, vendors and skills.  This complexity breeds more complexity when managers start to add capabilities while continuing to support legacy systems.  No wonder IT spending can quickly, quietly and unexpectedly spiral out of control.

Standardizing the computing platform across a company or business unit is one answer.  Many companies like Cisco and Zara have gained significant productivity improvements and enterprise-wide IT savings by standardizing on a limited number of platforms, applications and vendors.  In fact, firms can generate savings through scale economies and experience effects even when the individual asset is not the least expensive or the most capable.

Another way of getting more IT for less money is to move your computing into the cloud.  While valid security and technical concerns remain, there are enough case studies and organizational best practices to justify moving many IT operations and applications, particularly non-core activities.

4. Reassert transparency and control – Mismanaged IT spending is a pervasive problem in large organizations, particularly where there are weak controls and spend opacity. We’ve seen companies with strict headcount ceilings simultaneously give free rein to junior IT managers to purchase hardware, software licenses and consulting services at their leisure.  A hospital we work with allows researchers to buy new hardware for every new project despite the hundreds of under-utilized servers and licenses lying around.  In our experience, rogue purchases can account for up to 25% of an IT budget.

To counter this, management needs to apply the same spending rules and discipline to IT as they do to other functional groups and expense categories.  Furthermore, centralized purchasing and finance departments should have more knowledge of and visibility into existing IT assets and vendors in order to encourage the sharing of assets across business units and departments.

Many companies will flourish not despite a minimalist approach to IT but to a large extent because of it.  A ‘less is more’ IT strategy can lead to lower spending, reduced business complexity and higher employee engagement. Achieving this is as much about strategic alignment and organizational optimization as it is about technology selection and resourcing.

Organizing for cloud computing

Many organizations we work with are diving head first into the latest IT game changer, cloud computing.  While a comprehensive technical and financial analysis is usually undertaken, few companies thoroughly consider the organizational implications of this strategic move. They do this at their own peril.  We have seen cloud computing implementations go astray when the wrong structures, processes and practices compromised the right technical solution.  Managers would be wise to consider whether their organizations are cloud-supportive before re-architecting their infrastructures.

In a traditional IT model, technicians, hardware and software are tied to specific geographies, departments and business units.  In most cases, this model fails to maximize operational flexibility and IT asset utilization.  A CC architecture, on the other hand, centralizes and virtualizes IT resources, making them available to all users when and as needed. The result is greater operational agility, lower costs and higher IT scalability.  This fundamental change in the way IT is treated has major implications for a firm’s organizational system and culture.  For example, who controls virtualized IT resources and priorities in an ‘on demand’ environment? How do companies execute projects when assets and capabilities are decoupled from a physical location? And what work practices are better suited to a more transactional and fluid CC environment?

If they are to maximize the benefits of CC, business leaders must rethink how their enterprises are organized and run. Based on our consulting experience, we know the following areas are a good place to start:

Focus on tasks, not structure

CC’s rapid IT provisioning enables companies to be more flexible and agile, for example, in deploying new applications faster or responding more quickly to market needs. However, many firms have rigid structures and processes that were developed in the era of static IT resourcing.  This traditional model is too limiting to effectively exploit the benefits of CC.  To be cloud-ready, managers should experiment with other organizational approaches that are more synergistic with the way CC works.  For example, an adaptive, SWAT-team structure and working style can more quickly respond to new priorities and deploy the resources and expertise needed to deliver on the business need.  The film industry is a good example of this kind of adaptive system: a wide variety of people and capabilities come together quickly at different points in the production process to execute on a creative concept and plan.  At completion, the people and resources go back to a central business unit or are dispersed onto other projects.

Form follows function

In a traditional IT model, resources are usually structurally (if not mentally) “siloed” and linked to specific functions via non-standard workflows (i.e., processes).  Putting IT resources in the cloud decouples them from the constraints of a physical location, allowing them to be managed more centrally and deployed virtually.  As such, CC can help bring about the formation of a true Shared Service Organization (SSO), a structure that delivers key business benefits. First, a capable SSO is essential to enabling the adaptive business system mentioned above, assuming good workflows are in place. However, Gary Tyreman, CEO of Univa, a leading supplier of cloud computing solutions, cautions that “to realize value, an organization must integrate its cloud-powered IT services into existing workflows.  Where those workflows are broken or non-existent, they need to be fixed and defined.”  Second, an SSO brings significant value, including lower administrative costs, increased management control and standardization, and the possibility of greater organizational learning.  Finally, having an SSO allows IT managers to focus more on pushing the business forward as opposed to hoarding resources and building fiefdoms.

Collaboration breaks down barriers

The common business environment – hierarchical roles, non-standard processes, and department-based metrics – encourages employee practices that are ill-suited to the dynamic nature of CC. To best leverage the cloud’s capabilities, employees need to change how they work.  To begin with, the leadership must foster increased collaboration and alignment within the firm as well as with external vendors.  Examples of the changes required include: better aligning IT teams and vendors to overall business objectives (versus more parochial departmental goals); encouraging end-to-end project collaboration (versus point-in-process support); and placing greater importance on team and individual skills enhancement (to drive best practice adoption).  To make these changes stick, leaders will first need to get two things right in their management system.  One, project accountability should live with the business sponsor. Two, responsibility and authority must reside with the SSO leadership.

According to Tyreman, “For most companies, moving to the cloud is more an organizational challenge than a technical problem.”  Fully tapping CC’s potential will require enterprises to recast their structures, processes and management systems where appropriate. Though this may not be easy, it need not be scary. Companies that are open-minded, practical, and flexible will create the right organizational environment to fully leverage the Cloud.

The perils of offshoring

For North American companies looking to stay competitive, outsourcing some or all of their back-office business operations to India has achieved the status of dogma. However, in the past couple of years poor outcomes, changing cost dynamics and continued cultural challenges have swung the value and performance advantage back to North American providers in many cases.

The times they are a-changin’

Firms migrated operations to India to save money, focus on their core competencies, and move away from a fixed cost structure.  Today, faith in offshoring must be tempered by reason.  In the last few years, India’s significant advantages have yielded to some harsh economic realities.  New cost dynamics and the reality of doing business halfway around the world with a very different culture have reduced the attraction of offshoring many operations, particularly those in knowledge-intensive industries.

India’s fading appeal

Four key developments, unlikely to dim in the medium term, are contributing to offshoring’s declining appeal:

Shrinking wage differentials

India’s primary advantage, low labour costs, has been steadily declining.  According to the U.S. Bureau of Labor Statistics, India’s average per-hour cost advantage in 2010 had shrunk to only 6-7 times U.S. rates versus 11 times the rate in 2001. This shrinking differential traces to a combination of Indian wage inflation and North American wage moderation.   If present trends continue, this gap could shrink to five times the U.S. rate by 2014.
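
That projection can be reproduced with a simple geometric extrapolation. In the sketch below, the 6.5x figure is an assumed midpoint of the 6-7x range, and carrying the 2001-2010 rate of decline forward is an assumption, not a forecast.

```python
# The wage-differential trend above as a simple geometric
# projection: 11x U.S. rates in 2001, ~6.5x (assumed midpoint of
# the 6-7x range) in 2010. Extrapolating the same annual rate of
# decline is an assumption, not a forecast.

ratio_2001 = 11.0
ratio_2010 = 6.5
years = 2010 - 2001

annual_decline = (ratio_2010 / ratio_2001) ** (1 / years)
ratio_2014 = ratio_2010 * annual_decline ** (2014 - 2010)

print(f"Projected 2014 differential: {ratio_2014:.1f}x")  # roughly 5x
```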

Pervasive cultural challenges

India remains a culturally challenging place to do business; a situation unlikely to change in the medium term.  The differences–language, cultural mores, business practices–generate high indirect costs by introducing complexity, miscommunication and risk.  Furthermore, persistently high labour turnover in all Indian firms complicates attempts to close this ‘cultural gap’.

Higher than expected administrative costs

When they began outsourcing, firms understood there would be transaction costs — travel, communication, compliance and relationship management.  What virtually every company has experienced, however, are administrative costs typically three times higher than their estimates, all tracing back to geographic and cultural challenges.  In some cases, these costs can make up close to 20% of the total project cost.
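
A quick illustrative calculation shows how a threefold overrun on administrative costs pushes them toward the 20% share mentioned above. All dollar figures are hypothetical.

```python
# Illustrative impact of the administrative-cost overrun described
# above: transaction costs estimated at one level, realized at
# roughly three times that. All dollar figures are hypothetical.

direct_cost = 1_000_000       # offshore labour and delivery, US$
admin_estimate = 80_000       # planned travel, compliance, oversight
admin_actual = admin_estimate * 3   # the ~3x overrun cited above

total_cost = direct_cost + admin_actual
admin_share = admin_actual / total_cost

print(f"Admin share of total project cost: {admin_share:.0%}")  # ~19%
```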

Increased business risk

Today, effective risk management (e.g., protecting intellectual property and sensitive data, business continuity) is a strategic prerequisite for many companies.  Lingering doubts remain that sensitive data and intellectual property sent over to India (or any other emerging economy) is as secure as it would be in North America.  Not surprisingly, some government regulations continue to prevent certain types of IP and sensitive data from leaving North America.  Furthermore, India remains in the center of one of the world’s most dangerous regions, with instability on all of her borders and inside to boot.

Case in point: IT services

IT services provide a good illustration of the challenges of offshoring. For the past 10 years, CIOs and professional services firms have enthusiastically offshored to India large swathes of their IT work in order to reap the advantages of lower wages and round-the-clock development.

In many cases, however, the reality has not lived up to the promise.  India no longer possesses the same IT cost advantage versus innovative Canadian firms.  Alex Rodov, managing partner of North America’s largest dedicated software testing firm, QA Consultants, contends that “Canadian IT labour rates on average are no more than 20% higher than India’s.  After you factor in the high administrative costs, lack of visibility and hassle of doing business around the world, then our delivered costs are roughly equivalent.”  Secondly, India’s workers continue to suffer from poor productivity.  Despite working in modern facilities, most Indian IT workers (including recent grads) lack basic technical skills and rudimentary English-language proficiency.  In fact, the Wall Street Journal has reported that 75% of India’s technical graduates are unemployable by its IT sector.

Finally, the integrated structure favoured by most Indian software enterprises — firms develop and test their own code — poses real quality and delivery risks. “This [model] often leads to poor outcomes.  Testing should never be done by the same firm and people writing the code,” says Rodov, “as they lack objectivity and independence.  Furthermore, when a project runs late or is over-budget, the same Indian firm will prioritize development, often cutting corners with vital testing operations.”

For many business operations the pendulum is beginning to swing back to North America. Many companies have done the math and now realize that some local providers can deliver better value and lower risk versus an offshore Indian solution. A forthcoming article will look at how innovative North American firms are beating the offshorers at their own game.

New IT, a diamond or a lemon?

When it comes to deploying new technologies in areas like social media, mobile enablement and cloud computing, CIOs face a bewildering array of hardware and software choices. Moreover, management dynamics can further complicate matters. For example, IT professionals often fall into the trap of chasing the latest technology fad, over-designing for the application or over-committing to their internal stakeholders. All of these issues will ratchet up complexity, making it hard to separate the good technology – for your company – from the bad stuff.

The cost of choosing the wrong combination of hardware, software and services can be high in terms of wasted investment, greater project risk and longer time to business value. How do managers determine whether a new technology is suitable for their requirements? Our firm helps IT departments make these important decisions through the use of a common-sense yet rigorous 6-step vetting process:

  1. Is there a commonly accepted nomenclature for the technology? Immature or early stage technologies often feature a disparate set of names and descriptions. A new technology is not sufficiently advanced if the industry can’t align around agreed upon terms and definitions.
  2. Have standards emerged? New technologies do not coalesce around standards quickly, as vendors jockey to gain market penetration for their products.  A lack of standards will pose challenges for prospective buyers who want to compare vendors on performance and features, as well as integrate the new technology into their existing IT infrastructure.
  3. Is there competition? The presence of multiple providers validates that a new technology is evolving into an established category.  Having more than one vendor allows managers to evaluate different solutions, set reference prices to minimize cost and avoid single vendor lock-in. 
  4. Is there clarity around functionality and attributes? A lack of clarity in marketing materials or specifications is evidence that an early stage technology is immature or has been over-sold.   Managers should not purchase any new technology unless they are very clear about its functionality, features and value. If you don’t truly understand what the technology is supposed to do, chances are your technical and business users won’t either.
  5. Are there customers?  Having existing (and paying) customers using the technology is crucial to providing your company with use cases as well as ensuring the vendor offers sufficient support and ongoing product development.  Be wary if vendors cannot provide a client list.  It is also important to understand whether an ecosystem – customers, consultants, third-party developers and community – has evolved to support development and implementation.
  6. What have other users experienced? CIOs should be concerned if the new technology has no verifiable and ROI-driven success (deployment and production) stories.  Failures matter as much as successes, as they will help you set realistic expectations and understand technical gaps.
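
For teams that want to operationalize this checklist, the six questions can be captured as a simple go/no-go screen. This is a minimal sketch: the question wording is paraphrased, and the pass threshold of five ‘yes’ answers is an arbitrary assumption, not a recommendation.

```python
# Minimal sketch of the six-question vetting checklist above as a
# go/no-go screen. The threshold of five "yes" answers is an
# arbitrary assumption.

VETTING_QUESTIONS = [
    "Is there a commonly accepted nomenclature?",
    "Have standards emerged?",
    "Is there competition among vendors?",
    "Is there clarity around functionality and attributes?",
    "Are there paying reference customers?",
    "Are other users' experiences (successes and failures) verifiable?",
]

def vet_technology(answers: list[bool], threshold: int = 5) -> bool:
    """Return True if the technology clears the (assumed) threshold."""
    if len(answers) != len(VETTING_QUESTIONS):
        raise ValueError("one answer per question required")
    return sum(answers) >= threshold

# Example: a technology that clears five of the six questions.
print(vet_technology([True, False, True, True, True, True]))  # True
```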

There are no guarantees that a new technology will not turn out to be a lemon.  However, insisting that vendors answer some simple questions can significantly reduce the performance, implementation and financial risks.

Winning with Data Analytics

It’s no secret that leading firms such as Walmart, American Express, Capital One, Amazon and CarMax use cutting-edge Data Analytics to outflank the competition, improve marketing and operational efficiencies and get closer to their customers’ needs.  Making sense of internally generated data – its collection, synthesis and reporting – and turning it into learnings and actionable strategy is what DA is all about.  All medium to large-sized companies generate reams of data on their customers’ habits, supply chain execution and financial performance.  Yet few of them derive as much value out of this vital asset as they could.  What separates the pace-setters from the laggards is the organizational environment underpinning the DA function, specifically the level of management commitment, cultural readiness, and analytics and IT expertise.

A recent global survey (the second one in as many years) of 4,500 business executives by MIT’s Sloan Management Review explored key barriers and success drivers around DA. The results dovetail closely with our consulting experience in a number of data-intensive Canadian organizations.  Below are some of the key findings and implications:

Analytics is growing in strategic importance

Increasingly, managers see analytics in strategic terms – outflanking competition, transforming customer relationships, sparking operational innovation – and not just a means of incrementally improving business performance.  According to the survey, 58% of respondents viewed DA as a source of competitive advantage, up from 37% in the previous survey.

Not surprisingly, the study found that “experienced” firms are extracting significantly more benefit from DA than “basic” users.  The most experienced DA companies (who utilize tools like data visualization, advanced modelling and sophisticated data mining) reported a 50% year-over-year improvement in competitiveness.  Conversely,  organizations that are employing basic functionality such as spreadsheet-based budgeting and forecasting cited a 5% year-over-year decline in competitive advantage. What are the lagging companies missing?

Leveraging analytics requires a trio of competencies

Many would suggest that deploying high-performance hardware and software solutions is the best way to enhance DA capabilities and deliver a strong return on investment.  Though resources and technology are important, the respondents – particularly the experienced users – reported that demonstrated competencies in three areas are more crucial:

  1. Managing information, in areas such as integrating data silos, making data usable and deploying collaboration tools;
  2. Maintaining analytics expertise, around using predictive analytics, supporting scenario development, automating algorithms, etc.;
  3. Fostering a data-oriented culture.

The findings and our research confirm that there is no “typical” roadmap dictating which competency is most important or should come first.  They are all prerequisites.

Like many good things, there is a risk of over-indulging in DA before a company can fully digest its capabilities. For example, the sheer amount of data can slow down decision making by creating “analysis paralysis” and can drive significant data management and hardware/software costs.  Leaders must set yearly DA priorities while ensuring their functional groups and divisions align to the data that directly impacts key metrics – versus what is merely “nice to have” information.

Data-oriented cultures have unique attributes

Analytics-focused companies go beyond clichés to incorporate specific cultural norms and practices that leverage analytics capabilities and learnings.  A significant majority of respondents reported that data-oriented cultures had the following key elements:

  • Analytics is viewed as a core enabler of business strategy and day-to-day activities;
  • Senior leaders and middle-manager champions regularly support DA across the organization;
  • Communicating data and insights vertically and horizontally is emphasized, especially to the front-line employees who need them on a daily basis.

The more experienced a firm is with DA, the greater its ability to overcome internal challenges around sharing information, sustaining focus and coping with poor processes.  Only 30% of experienced DA users considered organizational issues difficult to resolve, compared with 60% of basic users.

Resources still matter

When it comes to enabling sophisticated analytical modelling, data visualization and knowledge management there is no free lunch. Companies still need the right methodologies and a robust, enterprise-wide IT infrastructure to effectively collect, process, report and manage the data.  Furthermore, there must be sufficient analytics expertise and tools at both the manager and specialist levels to effectively manage the data through its life cycle as well as to leverage it strategically.

These findings have significant implications for all companies seeking to gain competitive advantage through analytics.  First, the more a firm leverages DA across and up and down the enterprise, the more it will reap in terms of greater efficiencies, improved customer focus and enhanced performance.  Second, each company must define the DA path that best suits its competitive position, business requirements and available resources.  Although this article identified guiding principles, there is no ‘best practice’ template. Finally, mutually reinforcing factors such as consistent leadership, cultural receptivity, silo-busting information management systems and analytics expertise are essential to exploiting the full potential of analytics.

For more information on our services and work, please visit the Quanta Consulting Inc. web site.

Are you ready for Cloud Computing?

Over the past few years, few technologies have been hyped as much as Cloud Computing (CC).  According to the pundits and early adopters, CC is transforming the face of corporate IT while delivering compelling business value.  Simply put, CC is a suite of enterprise-level technologies that enables organizations to draw their computing power and data from a separate and centrally managed pool of compute resources, including servers and software licenses. CC offers a compelling basket of benefits for firms of all sizes in all industries.  Companies can significantly reduce IT operating costs and increase server utilization.  Additionally, CC can enable a more agile and scalable computing infrastructure that better aligns IT to business requirements, including reducing new-product time to market.  Importantly, CC allows firms to focus on their core mission of delivering goods and servicing customers while outsourcing a big chunk of their IT (read: fixed costs and headaches) to experts.

Currently, many business functions are delivered through the cloud, from CRM (salesforce.com) to messaging and collaboration (Google Apps) and high-performance computing (Amazon Web Services). Not surprisingly, all the IT heavyweights, including IBM, HP and Microsoft, have committed billions of dollars to marketing a plethora of products and services.  No wonder Gartner, an IT research consultancy, named CC the second most important technology focus area for 2010.

Yet CC has received a couple of black eyes recently, arising from security breaches at Amazon and Sony that impacted millions of users.  And there remain important challenges to fully exploiting CC’s potential: not all first-generation initiatives have met expectations.

Given its young age, it is not surprising that CC carries a variety of definitions and connotations.  For the sake of clarity, I use the definition from the US Department of Commerce’s National Institute of Standards and Technology (NIST), which identifies five characteristics of cloud computing:

  • On-demand, self-service computing – allows business units to secure the resources they need without going through internal IT for servers and licenses;
  • Broad network access – enables applications to be deployed in the ways the business operates, such as mobile and multi-device;
  • Rapid resource elasticity – provides for quick resource scalability or downsizing depending on computing needs;
  • Compute resource pooling – enables computing resources to be pooled to serve multiple consumers;
  • Measured service – allows IT usage to be measured like a utility and charged back to users according to demand.
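The “measured service” characteristic can be made concrete with a toy chargeback calculation. The sketch below is a hypothetical illustration only – the meter names, rates and usage figures are invented, not drawn from any vendor’s pricing:

```python
# Toy illustration of cloud "measured service": usage is metered
# like a utility and charged back to each business unit.
# All rates and usage figures below are hypothetical.

RATES = {
    "cpu_hours": 0.12,    # $ per CPU-hour
    "storage_gb": 0.05,   # $ per GB-month stored
    "bandwidth_gb": 0.02, # $ per GB transferred
}

def chargeback(usage):
    """Return the monthly charge for one business unit's metered usage."""
    return sum(RATES[meter] * amount for meter, amount in usage.items())

# Example: one business unit's metered consumption for the month.
marketing = {"cpu_hours": 1200, "storage_gb": 500, "bandwidth_gb": 800}
print(f"Marketing chargeback: ${chargeback(marketing):,.2f}")
```

The point of the model is visibility: because every unit of consumption maps to a line item, business units see the cost of the resources they request rather than drawing from an opaque central IT budget.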

How do managers determine whether this technology is right for their business?  Our firm has developed a quick and dirty checklist to test a company’s cloud readiness: 

  1. Are your revenue-driving business applications hampered by inadequate computing power?
  2. Would significantly quicker resource availability enable you to reduce time to value with new products and key operational initiatives? 
  3. Are your operating units and managers always fighting for more IT resources?  
  4. Is your business environment characterized by unexpected surges in demand? 
  5. Is IT redundancy an important risk mitigation strategy? 
  6. Do new or short duration business projects have difficulty “making the cut” for IT priority? 
  7. Are server, software license and data center costs rapidly out-pacing profit growth? 
  8. Are you frustrated with the flexibility and responsiveness of your enterprise IT infrastructure?

If you answered yes to even four of these questions, your business is being seriously impacted by IT constraints and higher-than-necessary operating, hardware and software costs.  A compelling business case for CC exists, and a pilot program should be investigated as soon as possible.
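The checklist lends itself to a simple self-scoring exercise. The sketch below encodes the four-yes rule of thumb; the abbreviated question wording and the pass/fail verdicts are my own phrasing, not a formal assessment tool:

```python
# Toy cloud-readiness self-assessment based on the eight checklist
# questions above (wording abbreviated). The threshold of four "yes"
# answers follows the article's rule of thumb.

QUESTIONS = [
    "Revenue-driving apps hampered by inadequate computing power?",
    "Quicker resource availability would reduce time to value?",
    "Operating units always fighting for more IT resources?",
    "Unexpected surges in demand?",
    "IT redundancy an important risk mitigation strategy?",
    "Short-duration projects struggle to make the IT priority cut?",
    "Server/license/data-center costs out-pacing profit growth?",
    "Frustrated with infrastructure flexibility and responsiveness?",
]

def cloud_readiness(answers):
    """answers: one boolean per question. Returns (score, verdict)."""
    assert len(answers) == len(QUESTIONS)
    score = sum(answers)
    verdict = ("Compelling case -- investigate a pilot program"
               if score >= 4
               else "IT constraints appear manageable for now")
    return score, verdict

score, verdict = cloud_readiness(
    [True, True, False, True, False, True, False, True])
print(f"{score}/8 yes answers: {verdict}")
```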

For more information on our services and work, please visit the Quanta Consulting Inc. web site.

Using IT to drive Insurer growth

Despite an improving economic climate, Canadian insurers continue to face considerable challenges, including growing price pressure, rising distribution and delivery costs and increasing competition from banks.  At the same time, consumers – particularly younger, more educated and more affluent ones – are rapidly turning to online tools to purchase and service products.

Emerging technologies offer Insurers the potential to increase market penetration, grow share of wallet and reduce sales, marketing and support costs.  However, Canadian firms lag appreciably when it comes to using IT to drive revenues and bolster customer service.  There are understandable reasons for this conservatism.  As an industry built on trust developed through personal connections, Insurers are hesitant to adopt arm’s-length ways of developing relationships.  Moreover, with up to 20% of their operating budgets spent on technology and its supporting services, Insurers tend to frame IT as an expensive cost center prone to high-priced overruns, not as a strategic business driver.

Yet many global insurance companies are using new web and mobile technologies to drive revenues and outflank competition.  In particular, four areas offer rich rewards for those companies that can truly understand their customers’ needs, make IT a strategic imperative and execute with excellence.

1.  Establish new sales channels

New wireless platforms like the iPhone, iPad and BlackBerry enable Insurers to reach customers more effectively and efficiently.  In one case, South Africa’s Metropolitan Life is piloting a new short-term life insurance product, Cover2Go, which allows new customers to purchase a contract simply by sending an SMS.

2.  Leverage Social Networking

Popular sites such as Facebook and LinkedIn offer tremendous potential to build brand communities, enhance the customer experience and engage a national workforce. In one example, State Farm is using Facebook as a national training platform.  Approximately 17,000 agents have organized into groups of “friends” to discuss new products and exchange best practices around customer service and claims processing.

Some Insurers have invited customers into their product development process. Allstate has set up social-network forums to facilitate interactions among motorcycle customers and enthusiasts. The forums solicit customer feedback and use it to inspire new products and services.

3.  Develop custom products

Forward-thinking insurance firms are using powerful data analytics capabilities to identify unexploited customer segments and then target them with tailored, “mass customized” products.  One large Insurer I know is combining sophisticated “rules-based” algorithms with high performance computing to enable product designers to adapt policies both to customers’ preferences and to specific market regulations.

In another case, some auto insurers are using IT to develop dynamic coverage models based on driving patterns and behavior. One leading carrier in the United States, for example, uses GPS technology to monitor drivers and then apply discounts to policies as a reward for safe driving.
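A usage-based discount of this kind can be sketched as a simple scoring function. Everything below – the telematics metrics, thresholds and discount rates – is invented for illustration; real carriers use far richer actuarial models:

```python
# Hypothetical usage-based-insurance discount: map a few simple
# telematics metrics to a premium discount. All thresholds and
# rates are invented for illustration.

def safe_driving_discount(hard_brakes_per_100km, night_share,
                          avg_kmh_over_limit):
    """Return a premium discount between 0% and 25%."""
    discount = 0.25  # maximum discount for a flawless record
    # Each risky behaviour erodes the discount, up to a cap.
    discount -= min(hard_brakes_per_100km * 0.02, 0.10)
    discount -= min(night_share * 0.10, 0.05)   # share of km driven at night
    discount -= min(avg_kmh_over_limit * 0.01, 0.10)
    return max(discount, 0.0)

# A driver with light braking, modest night driving, mild speeding:
print(f"Discount: {safe_driving_discount(1.0, 0.2, 3.0):.0%}")
```

The design point is that the discount is monotone: safer measured behaviour can only increase it, which keeps the incentive aligned with the carrier’s claims risk.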

4.  Streamline customer service

Even with extensive IT, insurance remains a labour-intensive and complex business.  However, a growing number of firms are using off-the-shelf technology to streamline customer service. For example, AXA and Zurich Insurance have location-based iPhone applications that enable customers to use their phones to record damaged areas and then send the accident photos to reporting centers to expedite claims handling.

In the future…

Social networking has the potential to reorder the traditional insurance business model. With their ability to mobilize people quickly, sites like Facebook could rapidly spawn large affinity groups that negotiate preferential insurance rates, much as Groupon has been doing since 2008.

To use IT as a growth driver, executives need to look beyond their industry in order to identify promising business opportunities that are enabled by new technologies and business models.

For more information on our services and work please visit the Quanta Consulting Inc. web site.

The critical role of IT in driving sustainability

In previous columns, I have written about how companies such as Nike, Walmart and SAP are using sustainability strategies like Product Life Cycle Analysis, green product development and the reframing of environmental standards to deliver on their sustainability goals. Now, we turn our attention to the important but often overlooked role of Information Technologies (IT) in supporting green business strategies.  In the past, many companies have been reluctant to consider IT for a host of reasons, including the presence of significant legacy assets; the mission-critical nature of many IT systems; and the lack of a strong consumer impetus.

IT systems and their accompanying data centers are major sources of carbon emissions and toxic waste, as well as major consumers of energy. According to a study by A.T. Kearney, a consultancy, corporate IT departments create as much as 1 million tons of obsolete electronic equipment each year and produce 600 million tons of carbon dioxide (CO2) emissions worldwide per year. For perspective, these emissions are equivalent to the annual CO2 output of almost 320 million small cars. As well, some data centers are so big that they consume as much energy and water as a small city.
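The car-equivalence figure can be sanity-checked with simple arithmetic. The per-car emission rate below is merely what the study’s two numbers imply, not an independently sourced fact:

```python
# Sanity check of the A.T. Kearney equivalence: 600 million tons of
# CO2 per year is said to match the annual output of almost
# 320 million small cars.

it_emissions_tons = 600e6  # annual worldwide CO2 emissions from corporate IT
equivalent_cars = 320e6    # small cars cited in the study

tons_per_car = it_emissions_tons / equivalent_cars
print(f"Implied emissions per small car: {tons_per_car:.3f} tons CO2/year")
```

The implied rate of roughly 1.9 tons of CO2 per car per year is the right order of magnitude for a small, lightly driven vehicle, so the comparison is internally consistent.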

With Internet-based services growing at healthy double-digits per year, IT’s environmental impact will continue to increase rapidly unless management does something to rein it in.  If most organizations are going to meet their aggressive sustainability goals, they will have to take a hard look at their IT operations. 

Where should they start looking?

Powering down

Energy usage is a key area to tackle first. According to the Interactive Data Group, a typical IT department in 1996 spent 17 cents of every dollar to power and cool a new server. A decade later, the rate had jumped to 48 cents per dollar.  The firm predicts that number will grow to over 70 cents by 2012.
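The growth rates implied by these three data points can be checked with a quick back-of-envelope calculation (my own sketch, not part of the original research):

```python
# Back-of-envelope check of the power-and-cooling cost trend:
# 17 cents per dollar (1996) -> 48 cents (2006) -> ~70 cents (2012, forecast).

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

historical = cagr(0.17, 0.48, 10)  # 1996 -> 2006, observed
predicted = cagr(0.48, 0.70, 6)    # 2006 -> 2012, firm's forecast

print(f"1996-2006 growth: {historical:.1%} per year")
print(f"Implied 2006-2012 growth: {predicted:.1%} per year")
```

Interestingly, the historical rate works out to roughly 11% per year, while the 70-cent forecast implies growth slowing to about 6.5% per year – the prediction assumes the trend moderates rather than continues.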

When considering ways to reduce power consumption, an obvious place to look is the data center.  A number of steps can be taken here, including monitoring and improving HVAC efficiency; switching to more efficient blade server and virtualization architectures; and choosing cooler climates in which to build new data centers.

The front office is another fertile source of energy savings.  Every firm can benefit from quick wins such as installing power measurement and management software and introducing policies that require PC users to shift to a low-power or shut-off state when not using their machines.  When Bendigo Bank in Australia mandated that employees turn off unused desktop computers, monitors and printers that used to run constantly, the bank saved more than $300,000 a year in electricity.

Buy greener

Better purchasing governance is an important tool for reducing a firm’s environmental impact.  For example, managers could stipulate that new equipment purchases must carry the highest energy-efficiency ratings and come from companies that feature prominently in sustainability indexes and standards. Moreover, buyers might also look for products manufactured from recyclable materials that generate minimal hazardous waste and carbon emissions.  Finally, to reduce the purchase of unnecessary assets, policies should be enacted that prevent buyers from over-buying equipment just because someone wants the latest technology.  One way to ensure this is by extending the life cycle of IT equipment.

Improve reporting

Some companies are using IT to improve sustainability reporting across the entire value chain.  Dow Chemical’s IT group, for example, acts as a green watchdog, tracking emissions, performance and vendor activity.   Dow is using this data to calculate a net environmental balance across a product’s entire life cycle, helping it better understand how materials are consumed in manufacturing. These insights can identify environmental and cost savings throughout its operations as well as its vendor inputs.  Finally, improved tracking and reporting will enable companies to better meet customer sustainability programs like Walmart’s Sustainability Index as well as provide key environmental data to consumers.

Greening IT will be crucial to helping many organizations achieve their aggressive sustainability targets. Managers can ill afford to ignore this under-developed area.  

For more information on our services and work, please visit the Quanta Consulting Inc. web site.