Managing Change through Technology
by Puneet Chandra, CMO, IT Business, Wipro Ltd.
Thursday, May 16, 2013
Just a decade ago, desktops dominated; now tablets, e-readers and smartphones rule. CDs are gradually becoming extinct in favour of online downloads. Credit cards are being threatened by NFC-enabled mobile devices. The trend to share rather than own a car is growing. Even the cars themselves are changing – before you know it, your car may no longer have a gas tank. The Encyclopaedia Britannica, an established and once groundbreaking 244-year-old resource, went out of print last year. Kodak, the business that built the world’s first digital camera in 1975, filed for bankruptcy.
Everything, from consumer behaviour to product design, manufacturing methods and business models, is changing at break-neck pace. Not since the Industrial Revolution has the list of companies and business models made obsolete been longer. Moreover, there is no guarantee that today’s hot products and trends will stand the test of time. Every business is being forced to constantly evolve. Take Amazon, a quintessential technology-driven business, which has had to adapt from being an online bookstore to a distributor of digital books, a marketplace for third-party products, a manufacturer of low-cost e-readers, a web services provider and a same-day delivery operation. The rate of change we are seeing is not just fast, it is exponential.
For many, such unpredictability can be daunting. Take the European Union telecom market, which is both fragmented and fiercely competitive, with over 140 mobile operators. Many businesses are uncertain about what will happen next: will we see a series of mergers and acquisitions in a bid to acquire more customers? How will operators cope with meeting the requirements of national regulators? But a small incremental change – being creative and innovative in aligning business strategy with new regulatory demands – could result in a giant leap for a handful of operators.
So what is a business to do in these exponentially changing times? I believe that the answer lies buried in data. Today there is an abundance of data, and hidden within it are the coming trends. What will customers want tomorrow? Which new markets offer the highest yield? Where do the best sourcing options lie? Which channels are star performers? Which factories need to close, and which need to resize or change their product line today in order to meet evolving demand? Who are your best partners for delivering business success in the future? Do you, as a company, know your core?
A recent Economist Intelligence Unit report commissioned by Wipro looked into what data currently does – and has the potential to do – for businesses. The study revealed that of more than 300 C-suite executives surveyed, only a tiny fraction (3%) are not currently prioritising data collection, and 72% considered themselves effective at translating data into insights. The report clearly identified data and analytics as a crucial differentiator for businesses, and one the C-suite is taking seriously in today’s competitive climate. However, the key question is: how do you arrive at actions from insights?
Here, technology plays a pivotal role. Take the retail sector, where technology can drive in-store and online customer experience based on customer-level P&L. These insights improve a business’s ability to personalise and customise products, services and, ultimately, the overall experience. Offer engines that leverage real-time data can turn that data into sales. With the right system in place, a business can become increasingly agile, predicting change and rapidly re-tuning its supply chain accordingly. Add improved financial decision making, workforce allocation and productivity, and the benefits are undeniable.
Change is not going to halt in its tracks, so agility and the ability to react – often in real time – could make the difference between whether a company sinks or swims in this difficult economic climate. With the help of technology, companies can embrace change, recognising the opportunity for innovation and creativity rather than viewing it as an inhibitor of growth. Those who do so will emerge winners, because technology will arm them with ways to stay one step ahead of change by predicting it.
Service providers looking to “VPO” to enhance customer experience
by Jonathan Kaftzan, Managed services marketing manager at Amdocs
Tuesday, May 07, 2013
Has it happened to you? You order something in a restaurant, but your waiter muddles up the order and your starters never arrive – usually you say forget about it, just take it off the bill. It’s called order fallout, and when it happens to communications service providers it causes a major headache. According to industry estimates, errors in service orders cost more than £33 billion a year.
So one key question for service providers across Europe (and elsewhere) is this: are unnecessary order fallouts, loss of revenue, overlooked inefficiencies and high costs inevitable? Or can a new approach fix the flaws inherent in today’s systems and processes?
To answer that, we need to step back and take a hard look at what’s called “business process operations” (BPO) and why that model, which has worked so well for so long, needs to change.
The fact is that there’s a growing chasm between information technology (IT) and business operations that reaches across all business processes. The traditional BPO model is no longer effective and cannot bridge the gap between IT and the business when it comes to building end-to-end ownership of the business process.
The O2A (‘order to activation’) process is one of the most critical and complex processes service providers deal with, and one that directly affects revenue generation. Yet it suffers from a lack of clear ownership and process visibility, resulting in misaligned objectives, goals and priorities. The impact on the bottom line is dramatic: order fallout rates can be as high as 25 percent and service activation failure rates as high as 35 percent.
Looking ahead, there are three key industry drivers that are influencing how business process operations will be addressed in the near future:
• A super-connected world – According to GSMA, it is expected that by 2016, there will be 24 billion network-connected devices, far exceeding the number of people on Earth. This exponential growth means service providers will need to manage ever-increasing volumes across their O2A processes.
• Changing service provider focus – With service providers increasingly focused on small and medium businesses (SMBs), and with the growing complexity of the multi-play ordering process, there is a dire need for robust, optimised order-handling processes.
• Cost pressures balanced by the need for customer experience improvements – According to the Yankee Group, service providers are under constant pressure to improve customer service levels, while reducing order processing costs and order cycle times.
In short, what’s needed is a shift from a business process operations orientation to a new concept we’ve developed called value process operations (VPO). And the best way to describe it is simply to say that value process operations is an innovative managed services model designed to help service providers reduce their total cost of ownership (TCO) across specific business processes, regardless of systems landscape.
Combining backend operations and IT technology, the VPO approach leverages technology to generate even higher efficiencies and reduce operational expenditures, while improving the business’s key performance indicators (KPIs).
So, what are the benefits of this approach?
For service providers implementing a VPO-based solution, benefits include reduced order fallouts and cancellations; lower costs (a day-one reduction in operating costs of up to 35 percent); and shorter cycle times and accelerated revenue.
And for the end-user customer? Improved customer service, with up to a 10 percent reduction in activation complaint calls and up to 12 percent reduction in missed customer appointments.
Sounds like a win-win situation.
Source to Pay: five procurement remedies for small to medium enterprises
by Daniel Ball, Wax Digital
Tuesday, April 23, 2013
Analysts anticipate that Source to Pay software will see a rise in adoption amongst SMEs this year and beyond. To quickly recap, Source to Pay takes the legwork out of the entire sourcing and purchasing cycle. I see it addressing five critical purchasing pain points, and tackling these is essential to achieving success and ROI from bringing the software on board. In my first post on this topic I discussed the initial pain point in the cycle – selecting your preferred suppliers. In this second post I will outline the next two pain points and their remedies.
Once an organisation has sourced its preferred suppliers, its next challenge is ensuring it enters into sound and acceptable contracts and then manages adherence to those contracts on a continual basis. Contracts should not simply be left to run – managing them ensures that both parties meet their agreed obligations.
But contracts are frequently complex: they may cover multiple products or services, run for a long time and consume many resources. Manually checking adherence across every purchase and payment is complex and very time-consuming, and often results in slippage on service, delivery or pricing terms, or on payment terms on the buyer’s side.
If suppliers become aware that they are not being monitored against their contractual obligations, they may make less effort to comply. Equally, the buyer organisation might find itself purchasing services outside the terms agreed in the existing contract.
Because Source to Pay electronically captures the supplier selection process, including the agreement and setting of terms, the software can also help the organisation police adherence to those terms. The terms are stored in the system, so they can be used to automatically check that purchases and payments are within terms and to flag occurrences when they are not.
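To make that idea concrete, here is a minimal sketch in Python of this kind of automated terms check. The field names, data shapes and tolerances are invented for illustration and do not reflect any particular Source to Pay product’s data model:

```python
# Hypothetical illustration only: field names, data shapes and tolerances are
# invented for this sketch and do not correspond to any specific S2P product.
from datetime import date

# Contract terms captured during supplier selection
contract_terms = {
    "supplier": "Acme Facilities Ltd",
    "unit_prices": {"SKU-001": 12.50, "SKU-002": 4.20},   # agreed price per item
    "payment_days": 30,                                   # agreed payment terms
}

def check_invoice(invoice, terms):
    """Flag any deviation of an invoice from the stored contract terms."""
    issues = []
    for line in invoice["lines"]:
        agreed = terms["unit_prices"].get(line["sku"])
        if agreed is None:
            issues.append(f"{line['sku']}: item not covered by the contract")
        elif line["unit_price"] > agreed:
            issues.append(f"{line['sku']}: billed at {line['unit_price']} vs agreed {agreed}")
    days_to_pay = (invoice["due_date"] - invoice["issue_date"]).days
    if days_to_pay < terms["payment_days"]:
        issues.append(f"payment due in {days_to_pay} days vs agreed {terms['payment_days']}")
    return issues

invoice = {
    "issue_date": date(2013, 4, 1),
    "due_date": date(2013, 4, 15),
    "lines": [{"sku": "SKU-001", "unit_price": 13.00}],
}

for issue in check_invoice(invoice, contract_terms):
    print("FLAG:", issue)
```

In a real system these checks run automatically against every order and invoice, so deviations surface as they happen rather than at audit time.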
With effective contracts in place, employees are ready to begin buying the agreed products and services they need. Here lies the next pain point requiring a remedy. People can be resistant to change and may not want to use newly agreed suppliers or products when they already have a relationship with another supplier. The challenge is to gain the buyer’s trust, ensure they purchase goods and services compliantly and avoid maverick spending with unapproved suppliers.
A well-designed Source to Pay platform enables this by providing an intuitive, simple-to-use buying system that works much like a familiar online shopping site such as Amazon. People need no training and pick up the system quickly, soon realising it actually makes their lives easier, not harder. The system also ensures that people buy in line with agreed policies; for example, buyers might be required to gain three competitive quotes on spend items over a certain value.
Cost-effective and low-risk purchasing requires a joined-up process from supplier selection to final payment. Source to Pay software remedies a number of common pain points along the way that create costs and resource bottlenecks for the business. In my final post I’ll be talking about the last two remedies – overcoming manual financial processes and implementing a continual purchasing improvement cycle.
Revolution in automation
by Paymon Khamooshi
Paymon Khamooshi discusses how the rise of new IT technologies, including the development of automation tools, is set to revamp the onshore services industry.
In a recent blog post for Sourcing Focus I alluded to new technologies that will soon make onshore IT outsourcing as competitive as its offshore rivals. Programming technology, especially web application development, is on the verge of a revolution in automation that will have far-reaching consequences for the outsourcing industry. These developments are at a very early stage and are little understood by the wider industry, but change is definitely coming. IT outsourcing will soon be a very different environment from the one we know today.
Offshore outsourcing’s major competitive advantage has always been its lower labour costs. Cheaper manpower, especially in India and China, has created opportunities for enormous savings, especially in the IT sector. These savings have not come without trade-offs, though. The internet may have killed distance for some aspects of IT work, but managing a team of people thousands of miles away, in different time zones and with different business customs, still creates plenty of headaches. The vastly lower cost normally makes these challenges worth tackling, but automation in the IT sector will soon rebalance this equation in favour of high-value onshore IT providers.
The reason why labour costs are still such an important factor in IT development is the continued reliance on third generation programming languages like C# and Java. Third generation methods date back half a century, require extensive manual coding, and are therefore vulnerable to human error. Fourth generation programming languages, which attempted to overcome these problems, failed to live up to expectations and were never widely adopted. The result is that software coding is still a largely manual profession, dependent on large numbers of junior programmers to carry out necessary but repetitive tasks.
After a long wait, breakthroughs that bring the benefits of automation to IT without the loss in quality and flexibility have finally started to appear. New hybrid languages that combine the best aspects of third and fourth generation technologies are set to dramatically change the programming landscape. Monotonous, repetitive tasks can now be automated, but without sacrificing any flexibility or control over the final product.
One example of these new hybrid languages is M#. M# creates .NET web applications but automates 90% of the coding, effectively cutting production time by nearly three quarters. The remaining 10% of the project requires the attention of an experienced senior programmer, but this is the area where onshore outsourcing normally has a competitive advantage over its offshore rivals. The time-consuming elements of coding, which favour outsourcing to markets with lower labour costs, have been eliminated. Without its pricing advantage, the argument for offshore outsourcing is significantly diminished.
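M# itself is a commercial .NET tool, and I won’t attempt to reproduce its syntax here. The toy Python sketch below simply illustrates the underlying principle these hybrid approaches rely on: expanding a short declarative model into the repetitive boilerplate that would otherwise be hand-written, leaving the programmer to focus on the distinctive remainder.

```python
# Toy illustration of model-driven code generation (not M# syntax):
# a one-line declarative model is expanded into repetitive class boilerplate
# that would otherwise be written by hand.
ENTITY = {"name": "Customer",
          "fields": [("name", "str"), ("email", "str"), ("credit_limit", "float")]}

def generate_class(entity):
    lines = [f"class {entity['name']}:"]
    params = ", ".join(f"{f}: {t}" for f, t in entity["fields"])
    lines.append(f"    def __init__(self, {params}):")
    for field, _ in entity["fields"]:
        lines.append(f"        self.{field} = {field}")
    lines.append("")
    lines.append("    def to_dict(self):")
    fields = ", ".join(f"'{f}': self.{f}" for f, _ in entity["fields"])
    lines.append(f"        return {{{fields}}}")
    return "\n".join(lines)

print(generate_class(ENTITY))   # boilerplate produced automatically
```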
These developments in IT should come as a surprise to no one. In the last half century almost every facet of our daily lives has been changed by software as it has automated one time-consuming activity after another. It was only a matter of time before software engineers found a way to automate the development of software itself.
It will take time for these new developments to spread through the IT sector and for new working practices to be developed. IT designers located offshore may have little to fear in the very short term. The clock is ticking for labour-intensive programming, though. The clever business manager should be expecting new technologies such as M# to make onshore outsourcing the most competitive option before long.
The Data Dilemma – Is outsourcing the data centre right for you?
by Tim Chambers, CTO Data City Exchange
Monday, April 22, 2013
Data has become an inescapable reality of modern business, and the sheer volume of it poses huge problems for the enterprise. Recent research by Cisco predicts that by 2016 data centre traffic will have increased six-fold compared with 2011 levels. That’s a startling statistic, and enough to keep any data centre manager awake at night.
Traditionally, there have been two options for organisations that require a data centre: build your own, or rent rack space from a colocation provider. Both have distinct advantages and disadvantages; however, the size of the organisation has often proved to be the deciding factor.
The biggest advantage of building your own data centre is exactly that: you designed it, you built it, you control what goes on in there. You can custom-design your infrastructure to handle whatever systems you need to run. Upgrades to new technology can be carried out at whatever pace suits you, as slow or as fast as you like. Control also extends to security, and managing your own data on-premises means that physical security is less likely to be compromised.
This level of control, however, comes at a price. Quite literally. Building a data centre is not cheap, and it’s not exactly cheap to run, manage and maintain once operational either. Between unpredictable property prices and the rising cost of energy, and regardless of the size of the organisation, committing to build your own data centre amounts to a significant outlay of capital.
The other option is to use a colocation facility. There are many attractive and obvious benefits to this approach, not least the lower initial outlay and the ability to buy more capacity as needed. It is a perfect solution for many organisations, but the restrictions on available technology, the limited ability to run custom systems and the compromises to physical security mean it is not a viable option for many others.
With data traffic skyrocketing, a very real dilemma for many organisations that manage their own data centres is what to do when they exceed capacity. Finding a service provider that can handle legacy systems can be a real challenge. My own experience as a technical data centre engineer in the financial services sector taught me that many of the traditional outsourcing options were not really fit for purpose.
Recently, however, we’ve seen a third model emerge, providing a viable alternative for those in need of rapidly expanding data-handling capabilities. The advent of the modular wholesale data centre provider offers, in a way, the best of both worlds. Customers lease what is essentially a managed data centre environment, providing a secure location, network connections, and power and cooling infrastructure, but critically, what happens inside this environment is completely down to the customer.
The environment can be exactly what they make of it, and is not restricted to the hardware put in place by the data centre provider, avoiding the compatibility issues that can arise with a traditional colocation provider. This approach lends itself to phased expansion, and the modular technology associated with this type of offering allows tenants to increase capacity within the same footprint as they need to.
For organisations that need more than just rack space, this is a godsend. In essence, it provides what is, for all practical purposes, a self-controlled data centre, while avoiding the outlay associated with building and maintaining your own. At a time when raising capital for infrastructure investment is becoming more and more difficult and budgets are being cut left, right and centre, any opportunity to turn the data centre into an operational expense rather than a capital expense will be music to the ears of any IT decision maker.
Source to Pay: five procurement remedies for small to medium enterprises
by Daniel Ball, Wax Digital
Thursday, April 11, 2013
Earlier this year the highly regarded procurement analyst Jason Busch heralded 2013 as ‘the year of Source to Pay (S2P) adoption in small to medium sized enterprises’. But at a time of continued economic uncertainty for many, is adding another cost line to already-strained IT budgets really a smart move? When it’s one that can deliver significant short term payback and resource reduction, the answer must surely be yes.
For many SMEs the first question may be ‘what is Source to Pay?’ Put simply, S2P systems deliver big cost savings by providing the tools to negotiate the best possible deals from suppliers, ensure employees buy compliantly from those suppliers and reduce process overheads by automating the exchange of orders and invoices.
Source-to-Pay software takes the legwork out of that entire sourcing and purchasing cycle, from the starting point of analysing spend and selecting suppliers, through to the endpoint of paying the supplier correctly. To aid understanding of this issue and solution I’ve identified five sourcing and purchasing pain points that are remedied by Source to Pay. In this post I will focus on the first of these – selecting your preferred suppliers.
So here’s the scenario - you decide to review your purchasing in a particular spend category that comprises hundreds of different products and a range of services, let’s say facilities management. This process shows that different departments are buying varied elements of this category from a number of different suppliers, creating many small ad hoc purchases at non-competitive prices. You decide to put out a tender to a vetted list of suppliers for a stripped down set of standard services. The aim is to consolidate what you buy and from whom so that you can increase buying power and negotiate a better price.
But going out to tender manually requires significant resource, and is so long-winded and error-prone that the process of reviewing suppliers never really gets off the ground. Writing effective tender invitations, then sorting through and evaluating varied, inconsistent responses, slows the process and makes it hard to compare offers from suppliers. Your opportunity to reduce costs and ensure the business uses contractually secure products and services falls at the first fence.
Source-to-Pay can largely automate this process. Tenders can be constructed within the system from pre-defined clauses dragged and dropped into place. Then suppliers actually respond within the system too, eliminating paperwork and manual checking. The system also automatically checks, compares and scores responses against chosen criteria. So significant are the productivity benefits of Source to Pay in this area that mid-sized organisations often see a reduction of around 85% in the time it takes to manage the tender process.
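As a rough illustration of what automated scoring might look like under the hood, here is a minimal Python sketch using invented criteria, weights and supplier responses (a real S2P system would of course hold these in its own database and configuration):

```python
# Hypothetical illustration of automated tender scoring: criteria, weights and
# supplier responses are invented for this sketch.
criteria_weights = {"price": 0.5, "delivery_days": 0.3, "quality_rating": 0.2}

# Raw responses (lower is better for price and delivery, higher for quality)
responses = {
    "Supplier A": {"price": 9200, "delivery_days": 14, "quality_rating": 4.1},
    "Supplier B": {"price": 9900, "delivery_days": 7,  "quality_rating": 4.6},
    "Supplier C": {"price": 9500, "delivery_days": 10, "quality_rating": 4.9},
}

def normalise(values, higher_is_better):
    """Scale raw values to 0..1 so different criteria can be compared."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: ((v - lo) / span if higher_is_better else (hi - v) / span)
            for k, v in values.items()}

def score(responses, weights):
    higher_is_better = {"price": False, "delivery_days": False, "quality_rating": True}
    totals = {name: 0.0 for name in responses}
    for criterion, weight in weights.items():
        norm = normalise({n: r[criterion] for n, r in responses.items()},
                         higher_is_better[criterion])
        for name in totals:
            totals[name] += weight * norm[name]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for name, total in score(responses, criteria_weights):
    print(f"{name}: {total:.2f}")   # ranked shortlist for the buying team
```

The point is not the particular weighting scheme but that responses arrive in a structured form, so comparing and ranking them becomes a repeatable calculation rather than a spreadsheet exercise.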
So Source to Pay has the potential to get your supplier and spending review off to a flying start with effective preferred supplier selection, but it doesn’t stop there. It’s an integrated cycle and in the next post we’ll look at the next two pain points it remedies – supplier contract management and maverick spend management.
The Future of the Public Cloud
by Jonathan Birch, EMEA Practice Head for Infrastructure and Architecture, NTT DATA
This year will see a turning point for the public cloud, as a result of repeated security infractions and regulatory issues. Companies are beginning to realise that they need a secure and stable infrastructure that is both fast and flexible. They need to think about systems that allow them to ramp up resources, as well as scale them down, depending on business needs.
For many service providers, 99% uptime is seen as good. However, as the role of service providers becomes even more critical, that one per cent of downtime – roughly three and a half days a year – isn’t good enough anymore. This is what is starting to be seen with the public cloud. The fundamental question is: what infrastructure would you run your important business systems on, and how much risk are you willing to take with business data?
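A quick back-of-the-envelope calculation puts those availability figures into perspective (assuming downtime is spread across a full year):

```python
# Back-of-the-envelope: downtime allowed per year at different availability levels.
HOURS_PER_YEAR = 365 * 24

for availability in (0.99, 0.999, 0.9999):
    downtime_hours = (1 - availability) * HOURS_PER_YEAR
    print(f"{availability:.2%} uptime -> about {downtime_hours:.1f} hours of downtime a year")
```

At 99% availability that is roughly 87 hours a year of services being unavailable; at 99.99% it is under an hour.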
The public cloud emerged as a quick and scalable utility that could revolutionise enterprise IT; however, repeated data breaches and repeated downtime have badly dented its reputation. Data breaches are fairly common stories in the media, and are usually dramatised, with millions of people’s personal data stolen each time. The Yahoo!, eHarmony and LinkedIn stories are oft-quoted, but the real extent of data breaches is often greater.
The public cloud’s open and shared services approach always raised security concerns, and the fact that data can be stored anywhere in the world raises regulatory issues around data protection and ownership. For this reason, businesses simply cannot trust the public cloud’s offerings.
In the future, we will see the private cloud’s status increase. It continues to grow in popularity, especially with financial services institutions and global corporations. The private cloud is a much more secure form of data storage, which is why it has become so popular with these types of companies. With the level of flexibility now offered by private cloud providers, it is a more compelling proposition for hosting important business systems and business information.
Establishing their own private infrastructure allows these companies to keep complete control of their data, while still achieving the flexibility benefits of running services through the ‘cloud’.
Companies are looking to invest in a service that matches their required business outcomes. They want to move away from traditional data centres towards a model in which they pay solely for the data and computing resources they use. Companies want a predictable monthly cost that aligns with their business outcomes.
Companies are looking to move away from outsourcing the provision of infrastructure and applications. Instead they want a supplier with SLAs relating to the availability of their business services.
The private cloud is the only solution that can securely protect a company’s information assets. The future of cloud is to deliver an end-to-end service and not just component parts. Suppliers who can supply SaaS, IaaS, PaaS, management consultancy and in-house processes will be the only ones truly able to deliver the outcomes that businesses require.
Time to ReThink Network Security
by Dan Joe Barry, VP of Napatech
Wednesday, April 10, 2013
On August 15, 2012, Saudi Arabia’s national oil and gas company, Aramco, suffered a debilitating cyberattack. More than 30,000 computers were rendered inoperable by the Shamoon virus, which US Secretary of Defense Leon Panetta described as probably the most destructive attack the private sector has seen to date. Network security is a growing problem in the IT industry today. The very trends that have revolutionized users’ access to data are the same ones that are leaving networks vulnerable to attacks by cybercriminals. No single security product can fully defend against all network intrusions, but a smart combination of existing products can provide a more flexible solution. Napatech’s intelligent adapters form a key part of this response by ensuring that network monitoring and security appliances have the full capacity to monitor, detect and halt potential attacks.
Three recent trends in the IT industry have improved the efficiency and effectiveness of digital services: cloud computing, big data analysis and mobility. Cloud computing centralizes data and makes it accessible anytime, anywhere. Unfortunately, it also provides cybercriminals with fewer, and more valuable, targets. Big data analysis offers a sophisticated overview of complex information; however, such a wealth of sensitive information in a centralized location provides an irresistible target for cybercriminals. Mobility allows convenience; it permits users to access data on the network with different devices, such as mobile phones and iPads. But this severely compromises security as these devices do not have the same protections as the typical corporate laptop.
With increasing data availability, cyberattacks are becoming more common every year. The cost of these attacks to business, though declining from 2010 to 2011, is still high. According to the Ponemon Institute and Symantec Research, the average cost of a security breach in the United States was $5.5 million in 2011. Cybercriminals are becoming smarter, innovating new methods to penetrate defenses and often using several different kinds of attacks in combination. For example, a hacker can utilize a distributed denial of service (DDoS) attack as a diversion for introducing malware into a network. In the case of the attack in Saudi Arabia, cyberterrorists utilized a virus in a spear phishing attack in an attempt to disrupt international oil and gas markets. There are many types of security appliances and solutions deployed in networks, each with its own specific focus. However, these solutions are rarely coordinated, which hackers exploit using a combination of attacks.
To successfully defend against this, some kind of coordination is required between the various security solutions so a complete overview can be provided. But, even this is not enough, as detecting zero-day threats (new attacks that have never been seen before) is very difficult. It is therefore necessary to also monitor how the network is behaving to make sure that no attacks have penetrated the security solutions in place. To do this successfully requires that all these solutions are capable of monitoring and reacting in real-time.
Most networks already have monitoring appliances in place, such as a firewall, an Intrusion Detection or Prevention System (IDS/IPS) or a Data Loss Prevention (DLP) application. Some products, such as Unified Threat Management (UTM) appliances and Next-Generation Firewalls, consolidate these methods into one appliance. But single point solutions can only ever address a part of the problem.
Another solution to network security uses the concept of Security Information and Event Management (SIEM) which is based on the centralization of information from both network and security appliances to provide a holistic view of security. This is a real-time solution, constantly monitoring the network to detect any anomalies that might arise. That means that both the network and security appliances need to be able to provide data on a real-time basis to ensure that anomalies are detected the moment they occur. This, in turn, means that each of the appliances must be capable of keeping up with growing data loads and speeds.
One of the easiest ways of disrupting the security of a network is to overload the security and network monitoring appliances using a DDoS attack, rendering the centralized SIEM system blind. This is a real threat if these appliances are not capable of operating at full throughput. By assuring that they can, you remove another potential attack vector.
Napatech intelligent adapters are used in both network monitoring and security appliances to guarantee full throughput under maximum load at speeds up to 40 Gbps. Napatech adapters can scale network throughput and combine different port speeds, distributing data flows on up to 32 CPU cores. The data can then be intelligently distributed to one or multiple security or network monitoring applications running on the same physical server—all of this accomplished without compromising CPU performance.
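Napatech performs this distribution in adapter hardware; the short Python sketch below is only a generic software illustration of the flow-hashing idea, not the vendor’s actual algorithm. The goal is simply that all packets belonging to the same flow are steered consistently to the same core, so per-flow state stays local.

```python
# Generic sketch of flow distribution: hash each flow's 5-tuple so that all
# packets of the same flow land on the same CPU core / application instance.
# (Illustrative only; real adapters implement this in hardware.)
import zlib

NUM_CORES = 32

def core_for_flow(src_ip, dst_ip, src_port, dst_port, protocol):
    key = f"{src_ip}:{src_port}-{dst_ip}:{dst_port}-{protocol}".encode()
    return zlib.crc32(key) % NUM_CORES

print(core_for_flow("10.0.0.5", "192.168.1.9", 44321, 443, "tcp"))
print(core_for_flow("10.0.0.5", "192.168.1.9", 44321, 443, "tcp"))  # same flow, same core
print(core_for_flow("10.0.0.7", "192.168.1.9", 51022, 443, "tcp"))  # different flow, likely different core
```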
The information from network and application monitoring applications can be used to build network behavior profiles. The customer uses real-time information on network and application usage to detect anomalies as they occur. These anomalies can then be compared to data from security appliances to identify if an attack is underway. Napatech adapters allow for the proper maximization of monitoring and security applications for a multifaceted defense.
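As a very rough sketch of what baseline-based anomaly detection can look like (the window size, threshold and traffic figures below are invented for illustration):

```python
# Rough sketch of baseline anomaly detection on a traffic metric: flag samples
# that deviate from the rolling mean by more than a few standard deviations.
from collections import deque
from statistics import mean, stdev

WINDOW = 60        # samples kept in the baseline window
THRESHOLD = 3.0    # standard deviations considered anomalous

history = deque(maxlen=WINDOW)

def observe(sample_mbps):
    """Return True if this sample looks anomalous against the recent baseline."""
    anomalous = False
    if len(history) >= 10:                       # need some history first
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(sample_mbps - mu) > THRESHOLD * sigma:
            anomalous = True
    history.append(sample_mbps)
    return anomalous

# Simulated traffic: steady load, then a sudden surge
for t, mbps in enumerate([100, 102, 98, 101, 99, 103, 97, 100, 102, 99, 101, 950]):
    if observe(mbps):
        print(f"sample {t}: {mbps} Mbps flagged as anomalous")
```

In a SIEM context, such flags would be correlated with events from the security appliances before anyone is alerted, which is why the real-time data feeds described above matter.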
Cyberattacks on the world economy and infrastructure are becoming commonplace. The adoption of cloud computing, big data analysis and mobility have improved efficiency, but unfortunately they have also exposed critical vulnerabilities in networks. Utilizing SIEM systems on standard servers with Napatech adapters enables OEM vendors to provide solutions that can respond immediately to any detected anomalies in the network. By combining network and security information into a more holistic solution, attacks—such as the spear phishing assault on Aramco—can be deterred. By focusing on guaranteed data delivery and scalable performance, Napatech, the industry leader in adapters, enables its monitoring and security appliance vendors to build the centralized security solutions that can help protect networks in the years to come.
Right skill sets – the new game changer
by Puneet Chandra, CMO, IT Business, Wipro Ltd.
Friday, April 05, 2013
As the Eurozone flounders in the ongoing financial crisis, it has entered a vicious unemployment cycle that is further weakening the economy and, in turn, causing further job cuts.
Unemployment in the 17-nation euro zone climbed to 11.9 percent in January from 11.8 percent the previous month, according to Eurostat, the statistical office of the European Union. While there is growing concern about high unemployment levels, the real challenge we face today is the widening skills gap between the needs of new, emerging industries and markets and the available talent.
EU commissioner Neelie Kroes estimates that there will be 900,000 unfilled ICT job vacancies by 2015 in the EU region alone. This raises serious questions about what the future holds for the EU and the task that lies ahead to bridge the skills gap and increase employment levels. In order to remain competitive, governments and industries must work together to ensure young job seekers are equipped with the skills they need to capitalise on this massive opportunity.
As a starting point, the government must address the following questions:
1) What are the policies and capacities that need to be developed to meet industry needs?
2) What is the role of the government and technology in skilling, re-skilling and cross-skilling the future workforce?
3) And, what action does the industry need to take to address the skills gap?
We all know that the realm of technology is fast changing and has already revolutionised the world of work. Today’s employers often demand niche skill sets that are not always prioritised by traditional education systems. There is a real demand for initiatives and programmes to ‘re-skill’ the unemployed and help them adapt to the changing enterprise. The success of Germany’s dual apprenticeship system – a balanced curriculum of structured training within a company, accompanied by part-time classroom tuition in vocational and general subjects – is testament to this approach.
However, in the short term, to address immediate needs, businesses should explore the free movement of skilled workers across borders. A recent survey we conducted of global leaders at the World Economic Forum (Davos) 2013 revealed that 78% felt the EU skills gap pointed towards cross-border opportunities when it comes to sourcing talent.
Despite the recent economic slowdown and inevitable tightening of the purse strings, it is important that the EU thinks about the long term repercussions of the skills shortage. Without a skilled workforce, Europe risks lagging behind when it comes to the innovation and entrepreneurship which lie at the heart of economic recovery.
Now is the time for governments to focus investment on education programmes, in consultation with industry, to create shared value for both the economy and businesses. Recently, we partnered with UMass to launch a fellowship programme for 120 US school teachers with the aim of fostering excellence in science education among students from disadvantaged areas of Boston and New York. It is coordinated and sustained efforts like this, from government and industry alike, that will pave the way for increased employment levels and ultimately economic recovery.
By working together, businesses, governments, campaigners and teachers can ensure that adequate skills, policies and capacities are developed to meet the labour force needs of the enterprise of tomorrow.
For increased security, consider moving outsourcing onshore
by Paymon Khamooshi
Wednesday, April 03, 2013
Outsourcing IT and business services saves substantial amounts of money, but for many businesses those savings carry a hidden cost. Too many firms fail to recognise the increased security risks that come with outsourcing, and this extra risk is therefore left unmanaged.
One very effective way to reduce the risk is to keep outsourcing onshore, but until recently this option has normally meant higher costs. A new wave of programming technology is set to change this balance by eroding the price advantage of offshore outsourcing. Understanding the security risks of outsourcing IT offshore, and being prepared for the increased efficiency of onshore IT competitors, is a must for all CIOs.
A recent report by data security specialists Techwave contained some very alarming findings that clearly demonstrate the inherent security risks of outsourcing. Of the 450 security breaches Techwave investigated in 2012, outsourced IT and business services were a factor in 63% of cases. Even more frightening was the average detection time for a corporate security breach: a staggering 210 days.
Corporate security is not receiving the attention it deserves full stop, but anytime an outside company is involved in a sensitive area like IT, the risks become much higher. One security breach could easily wipe out any savings being realised by outsourcing, as well as cause enormous reputational damage. Outsourcing can and should continue where appropriate, but the risk it creates must be managed and reduced. Keeping your outsourcing partners close to home, where communication and monitoring is easier, is an important step.
The reason proximity reduces risk for IT outsourcing is that the greatest threats to any system’s integrity are not technological, but human. People choose weak passwords (most commonly ‘password’, or when capitals and numbers are required, ‘Password1’), operate from shared user accounts where accountability can’t be traced, and discuss confidential company information on Facebook. Hackers know this, and exploit the human tendency to be careless with corporate security. Educating your employees to follow best practice is vital, but even with adequate time and resources, rooting out risky behaviour is a difficult and thankless task. All of this is wasted, however, if outsourcing partners are holding open the back door to your systems through their own careless behaviour. Every CIO should be asking, ‘are my outsourcing partners as concerned with my company’s security as I am?’
Just asking the question is an important first step, but ensuring the right outcome is more difficult from thousands of miles away. Digital security is too important to manage with only emails and video chats. And when a crisis does hit, offshore outsourcing can exacerbate the problem, as NatWest learned to its cost late last year. When a human lapse led to a catastrophic failure of the bank’s UK-based software, managers were forced to get support by telephone from software engineers in Hyderabad. This extra layer of complexity made a difficult problem even more difficult to solve. When you need on-site help in a hurry, make sure your IT support is a train ride, not a plane ride, away.
The security benefits of onshore outsourcing are clear, but the higher cost will still be a barrier for many companies. This is set to change in the IT sector, however, thanks to new innovations in software design. Onshore IT workers are gaining access to new tools that will make them as efficient, or even more efficient, than their offshore competitors. As this trend becomes more pronounced, expect to see more and more IT outsourcing staying in the UK.
I will be writing about these new technologies, which I believe will revolutionise IT outsourcing, in a coming post on sourcingfocus.com.