Industry news

  • 12 Jul 2012 12:00 AM | Anonymous

    Oracle Corp has acquired social marketing firm Involver, notching the third deal in as many months in a red-hot area for enterprise software makers.

    Terms of the deal were not disclosed.

    San Francisco-based Involver, founded in 2007, provides tools for developers to create advertising campaigns on social media networks such as Facebook.

    "Companies are looking to harness the full potential of social media to increase brand loyalty, connect with potential customers and anticipate buyers' needs," Oracle said in a statement.

  • 12 Jul 2012 12:00 AM | Anonymous

    Anyone who has worked at an outsourcing/offshoring company knows that they try to make client satisfaction and success a primary consideration. They concentrate on this through three dimensions of delivery: time, using metrics to improve the timeliness of tasks, services or projects; budget, using metrics to reduce the cost of tasks, services, projects or programs; and project/process/organizational quality, using metrics for tests, defect resolution times, quick ramp-up of resources, continuous improvement and so on.

    Everything is measured and improved, except for what lies at the core of everything – the software itself and the associated fourth metric, product quality.

    I contend that not understanding this fourth metric creates a thorn of dissatisfaction with outsourcing/offshoring quality in otherwise fruitful and beneficial relationships. I also contend that this need not be so.

    Suppliers and clients often argue that testing is a measure of product quality. But testing, the resolution of a large number of bugs within time and budget, and their reduction over time do not necessarily mean that product quality is good.

    Product quality is really only good when you can ensure that software meets the business needs of reliability, scalability and economics in the long run. Product quality is good when the right technology is used, the right architecture is designed, and there is control on the software implementation to guarantee high maintainability, low costs and low risks of operations.

    Most organizations, however, do not know how to be certain about product quality. Three steps that can offer control over software products and realize some certainty regarding quality include:

    1. Demand Product Quality – Product quality is not demanded and therefore not provided.

    Most tenders require a certain level of security assurance and organizational quality (CMMI level 5, ISO 9001 etc.) but are silent when it comes to expectations of product quality. Suppliers often contend that they are already highly focussed on quality when they set up their reusable quality targets. But does it help?

    Look around! Check how many supplier roles you see onsite. You will usually find numerous managers and team leads, but hardly any cross-technology experts, architects or experienced developers. In fairness, suppliers cannot go ‘onsite heavy’ if they still want to meet clients’ strategic goals of optimizing costs, or simply to leverage their competitive advantage.

    So you see new engagement models emerging: managed services based on KPIs, risk/reward partnerships and outcome-based pricing.

    These models are assured through project, process or organizational control measures. But will these new models assure that the system will be easily modifiable, compatible and portable? No, the benefit of these models is only that they assure redress in the worst-case scenario. So the next question that arises is how to demand product quality in the normal course of design, development or maintenance of the product.

    2. Measure product quality: Know what product quality is, and use product quality metrics to measure, analyse and improve.

    ISO standard 9126, and its successor ISO 25010, provide detailed guidelines on product quality. (We have leveraged it extensively through a reliable and repeatable quality model that can be applied directly through source code and fact-based measurements.)

    Static code analysers can be leveraged directly by development teams but are often not productive, as their focus is only on technical metrics rather than a contextual perspective. It is unhelpful, for example, to spend time removing duplicate code or measuring the number of lines of code when the real problem is system modularity.
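The kind of measurement a basic static analyser performs can be sketched in a few lines. This is a deliberately naive illustration, not any standard tool or metric: the function name, the four-line window and the output keys are all our own choices. It counts non-blank lines and flags repeated blocks, precisely the sort of number that is easy to produce but meaningless without a product-level goal behind it.

```python
import hashlib

def simple_metrics(source: str, window: int = 4) -> dict:
    """Compute two naive technical metrics for a piece of source code:
    non-blank lines of code, and the number of duplicated line windows
    (a crude stand-in for what a real clone detector measures)."""
    lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
    loc = len(lines)
    seen, duplicated = set(), 0
    # Slide a fixed-size window over the code; a repeated window hash
    # indicates a duplicated block.
    for i in range(max(0, loc - window + 1)):
        digest = hashlib.md5("\n".join(lines[i:i + window]).encode()).hexdigest()
        if digest in seen:
            duplicated += 1
        seen.add(digest)
    return {"loc": loc, "duplicated_windows": duplicated}
```

A tool like this will happily report a duplication count, but nothing in the output says whether duplication, or modularity, is the system's actual problem, which is the contextual gap the paragraph above describes.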

    We therefore contend that product quality measurement can only be helpful when aligned with product goals, which are not the same as project goals.

    3. Evaluate vendor performance on product quality: IT suppliers will thank you for this transparency.

    IT suppliers want to develop good software but are victims of their own global scale and size. Help them by making product quality one of the key performance indicators (KPIs). Do outsourcers worry about being assessed on quality as well as the other three metrics? In reality, many relish it. Having independent assessment on the quality of their software, or insights into how they can improve (thereby increasing the likelihood of delighted customers), is, for most, helpful.
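One hedged sketch of what making product quality a KPI could look like in practice: a weighted scorecard in which product quality sits alongside the three delivery metrics. The weights and scores below are invented for illustration, and each metric is assumed to be normalised to a 0–1 scale beforehand.

```python
def vendor_score(metrics: dict, weights: dict) -> float:
    """Weighted vendor performance score across the four outsourcing
    metrics. Each metric must already be normalised to 0..1, and the
    weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * metrics[k] for k in weights)

# Hypothetical quarterly review: product quality weighted alongside
# time, budget and process quality rather than ignored.
weights = {"time": 0.25, "budget": 0.25, "process_quality": 0.2, "product_quality": 0.3}
metrics = {"time": 0.9, "budget": 0.8, "process_quality": 0.85, "product_quality": 0.6}
score = vendor_score(metrics, weights)
```

With a weighting like this, a supplier who hits every deadline and budget but ships poorly maintainable software still sees the gap reflected in the overall score, which is the transparency the paragraph above argues suppliers will thank you for.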

    So let’s hear it for product quality measurement – the fourth metric of outsourcing success!

    Ayush G.V. is a Sr. Management Consultant at SIG, the Software Improvement Group, which is strongly rooted in research and provides fact-based consultancy to organizations in the areas of software economics, quality, risks and governance.

  • 12 Jul 2012 12:00 AM | Anonymous

    Cloud computing has been a buzz trend and has been heavily marketed for some time. Recently we have seen the rise of private cloud services. To what extent has the economic climate shaped our needs and technologies, revolutionising the way we look at cloud? And what cloud strategies are now emerging?

    The use of cloud services and platforms is nothing new in business; it has been around for years, with companies such as Amazon employing cloud computing and selling it as Amazon Web Services (AWS) as far back as 2006. A successful strategy, according to Martin Bishop, Global Head of Hosting Services at Telstra Global, will allow businesses to “grow faster, reduce costs and become more efficient as a result.”

    With widespread economic instability, businesses that choose the correct cloud strategy can expect to strengthen their IT capabilities. While prices for public cloud services are falling, or are free, open source software is becoming more widely available for deploying private clouds. Companies are faced more than ever with the option of running their own private cloud services.

    Those that deploy a private cloud service are faced with both the advantages and disadvantages inherent in the design, so such a strategy needs to be thought out in advance in order to reap the full benefits. Private cloud services hosted on site can offer increased security, physical on-site support and greater flexibility; however, private models can incur increased costs when software upgrades are required or server storage needs to be increased.

    Private cloud services, although costly to maintain and requiring a large initial investment, can often provide greater long-term savings than public cloud services, where large-scale services used by businesses require a pay-as-you-go model. Ephraim Baron, Director of Enterprise Cloud Solutions at Equinix, commented: “If you have significant demand, the owned private option can result in overall cost savings. If your load varies widely a capacity-on-demand approach will likely make more sense.”
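Baron's point can be made concrete with a back-of-the-envelope break-even calculation. The figures below are entirely hypothetical: a private setup amortised at £2,000 a month versus an on-demand fleet of ten instances at £0.50 per instance-hour.

```python
def monthly_cost_private(fixed_monthly: float) -> float:
    """Owned private cloud: the capacity is paid for whether used or not."""
    return fixed_monthly

def monthly_cost_on_demand(instance_hours: float, rate_per_hour: float) -> float:
    """Pay-as-you-go public cloud: cost tracks actual usage."""
    return instance_hours * rate_per_hour

# Hypothetical figures: GBP 2,000/month private vs GBP 0.50/hour for 10 instances.
FIXED = 2000.0
RATE = 0.50
FLEET = 10

# Above this many hours of full-fleet usage per month, owning wins.
break_even_hours = FIXED / (RATE * FLEET)  # 400.0
```

At steady, heavy utilisation (a full month is roughly 730 hours) the owned option is cheaper under these numbers, while a fleet busy only 100 hours a month clearly favours on-demand, which is Baron's "load varies widely" case.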

    Public clouds are often pre-configured services lacking speciality, while private cloud services are bespoke, created for the needs of the business. Mark Skilton, Director of Global Infrastructure Services at Capgemini UK, views private cloud as essential in providing individually managed solutions for businesses: “We’re looking ahead with our partners at EMC and Microsoft and many others to move into more hybrid cloud technology offering clients even more bespoke solutions. This can only be done through the private cloud which can be controlled to suit one’s own business objectives, as opposed to the public cloud which, by its nature, dictates service levels.”

    Ephraim said that customers need “to be more aware of your data needs and available options in order to make the best informed decision when it comes to managing the cloud. In today’s economic environment, proper planning and scaling of infrastructure is a critical formula to success.”

    In creating a strategy for private cloud implementation, businesses must ensure that the cloud is properly orchestrated. Orchestration enables the provision of exact requirements such as CPU, memory, disk space or network configuration, and governs how they are delivered.
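What "provisioning exact requirements" means can be sketched with a minimal, hypothetical resource specification. Real orchestrators use far richer schemas; every name and figure below is our own invention for illustration.

```python
from dataclasses import dataclass

@dataclass
class ResourceSpec:
    """Hypothetical provisioning request an orchestrator might accept:
    the exact CPU, memory, disk and network requirements mentioned above."""
    vcpus: int
    memory_gb: int
    disk_gb: int
    network: str  # e.g. a named VLAN or security zone

def validate(spec: ResourceSpec, quota: ResourceSpec) -> bool:
    """Check a request against a tenant quota before provisioning."""
    return (spec.vcpus <= quota.vcpus
            and spec.memory_gb <= quota.memory_gb
            and spec.disk_gb <= quota.disk_gb)

request = ResourceSpec(vcpus=4, memory_gb=16, disk_gb=200, network="dmz")
quota = ResourceSpec(vcpus=8, memory_gb=32, disk_gb=500, network="any")
ok = validate(request, quota)
```

Even in this toy form, the value of orchestration is visible: requests are explicit, checkable objects rather than ad-hoc tickets, which is what makes a private cloud flexible and user-driven rather than just a rack of servers.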

    Tony Lucas, SVP Product and Founder at Flexiant, believes that businesses “should think about more than just private, public or hybrid cloud, and ask whether the platform they’re deploying is in fact cloud at all. Only with cloud orchestration in place can an organisation really deploy a private cloud that is flexible, agile and user-driven.”

    If private cloud services are deemed to be appropriate for the business, it is essential that the tendering process is comprehensive. “Some companies out there that are just selling servers and storage, rather than anything new, and are simply re-branding these technologies and offering them as ‘Private Cloud Services’ in the race to capitalise in the market”, warns Andrew Greenway, Accenture Global Director of Cloud Services.

    With the rapid development of cloud technology and services, John Green, CTO of Prolinx, details the coming evolution of private cloud services: “there are a couple of big trends emerging that are driving the adoption of private cloud computing - cloud service management is becoming a requirement for adoption with companies being unlikely to adopt a single cloud deployment model.” John added: “there’s also the question of Big Data, and businesses turning their attention to adopting technologies that enable them to manage and analyse huge volumes of data from many different sources.”

    While cloud computing continues to be a growing trend, businesses need to be aware that hopping on to the latest bandwagon may prove ineffective. It is essential that businesses identify a strategy for the employment of cloud services, in order to choose between private and public cloud services, or to delay or scrap the deployment of cloud services for the time being.

  • 12 Jul 2012 12:00 AM | Anonymous

    IT should work hard for a business; with proactive management, CIOs can turn an IT environment into a business-changing asset. With time and money both in short supply, it is easy for CIOs to be distracted from leading innovation, instead having to focus on managing daily administrative tasks. The situation calls for a more proactive approach, from delegating time-consuming and less complex tasks to a third party, to ensuring the inclusion of CIOs in business strategy meetings.

    Firstly, it is important to note that the CIO is no longer just an advisor, but is now a leader within the business, responsible for driving innovation. Therefore, to provide the strategic technology leadership that is now expected, CIOs need to be included in key business decisions, so they can ensure that IT infrastructure and policy are aligned with business objectives. This will also allow CIOs to disperse their digital know-how and technology expertise to employees across the company, which will in turn allow the workforce to gain a competitive edge in this digital age.

    Delegation is also key. By outsourcing the daily maintenance and administration of the IT environment to a managed service provider, CIOs will be able to flourish and deliver value due to the surplus time they will regain. By handing over the monitoring and managing of systems, CIOs can be assured any potential problems that may previously have caused several hours of costly downtime are now spotted and prevented before they reach the company. With time freed up, they can now focus on game-changing tasks such as the strategic analysis of big data and different types of cloud adoption they may be considering.

    CIOs across the UK are also struggling to handle ever-increasing amounts of data entering IT environments. The legal sector in particular faces an uphill battle in keeping its data under control, as regulation requires the retention of all client data for seven years. This creates a huge raft of data, often backed up across several tapes, which needs to be brought into order without consuming a CIO’s time and budget. A cloud service can offer an effective solution, which can then be supported and managed by the provider. There are various types of model that can be tailored to fit a firm’s specific requirements and security concerns, ultimately saving time as the service is externally managed. Cloud services are scalable, so a CIO does not have to worry about over-using or over-paying. They can leave that to the cloud provider, and focus attention on analysing the data.

    The CIO is the keeper of the keys to transformative technology innovation that can enhance not only the way an organisation operates, but also its strategic direction. Rather than wasting precious time and money on routine IT administration tasks, the CIO should be freed up to focus on technology and IT innovation that can deliver real benefit to the business.

  • 11 Jul 2012 12:00 AM | Anonymous

    Increasing volumes of data are a major concern for CIOs up and down the country. At the forefront of their minds are questions about how to manage and monetise it. Struggling to deal with not only the amount, but also the rising complexity of this data, CIOs are not receiving the extra resource they need to handle the challenge.

    Recent MTI research found that more than a quarter of IT professionals working in the UK (27%) feel that adding extra resource to their IT departments would allow them to become a strategic tool within the business. For example, if the task of storing and managing large and ever-changing volumes of data is removed from a CIO’s daily list of tasks, then more of their time can be spent on strategic analysis of the data, transforming the IT environment into a valuable asset. MTI clients that originally outsourced data to the cloud simply for efficiency reasons are now seeing a real return on investment from the move, with the IT department now actually bringing money into the company.

    With the support of a solution such as a managed service, a CIO can outsource backup, file serving and archiving capabilities to the cloud, removing the need for storing data on-site. CIOs can still retain control by deciding exactly what kinds of data they want to sit in the cloud, and then a tailored solution can be created. The management of the cloud environment can also be outsourced, with teams monitoring the infrastructure 24/7 and flagging any problems before the effects hit the network. This can also help to avoid costly system downtime.

    In these tough economic times, organisations are increasingly focused on scalability, whereby they only pay for the services they need and use. Flexible and scalable IT solutions enable CIOs to run efficient and cost-effective IT environments, whilst also enabling them to access more storage or bandwidth if required. Managed services allow CIOs to outsource as little or as much of their data and applications as they want or need to, for a price that matches the service.

    Outsourcing large amounts of complex data to a managed service provider can liberate a CIO, allowing them to focus on analysing the data and turning it into a valuable asset, which can benefit other business lines driving company growth and profitability.

  • 11 Jul 2012 12:00 AM | Anonymous

    Loyalty has always been important, but in this day and age it has become increasingly essential in all types of businesses. Loyalty within the supply-chain is no exception, as it can foster a common vision and the ability to offer best practice solutions, which can ultimately reduce costs in the long term.

    One of the biggest challenges of fostering loyalty within supply-chains is demonstrating your worth to procurement teams, who in many cases drive the re-evaluation of suppliers and make the key purchasing decisions. The problem for existing suppliers is that, unlike your day-to-day client contacts, who see the benefits of your service first-hand, procurement often sits outside of this; once they’ve procured the service they step away, making it very hard for them to see where you, as a supplier, add value.

    It is always a contentious situation when loyalty is challenged, and the challenge normally comes from the procurement department in the form of cost evaluation. Without doubt the most anxious moment for any business owner is to find that the general procurement function doesn’t possess the expert skills and knowledge to evaluate specialised services. This can often lead to poor comparisons and judgements, with the danger that the contract is awarded to the cheapest bidder, even though they may not have the required capabilities and may cost the client more in the long term through poor service and damaged reputation.

    The current economic climate is another factor that makes it more challenging for loyalty to grow, in particular when it comes to establishing outsourced functions. That said, there has been an increasing move away from e-auction activity, as it predominantly highlights fears and negatives. If a procurement company is involved, it will naturally be under pressure to at least save its fees, but if costs are already at a realistic level or lower, this can become a problem. Cost will always be the key driver, but added value is equally important and should carry greater weighting in evaluations.

    In order to overcome these challenges, fostering loyalty is essential. Once the hygiene factor of meeting the SLAs has been met, engagement and communication form the foundations for long-term, mutually beneficial supply-chain relationships. In terms of engagement, winning the contract is one thing, but retaining, developing and improving it takes hard work. This can’t be done blindly and the client has a key role in working with its suppliers to make the service the best it possibly can be.

    Complacency can be a killer in terms of loyalty, so regular communication is absolutely key. A good supplier should be challenging the transparency of their offering and continually feeding back to their clients on what they are doing and how they are making a positive impact on the business. The client also has a responsibility to develop the relationship for the long-term, by monitoring their suppliers and providing feedback on performance. This way the client is loyal to the supplier and vice versa.

    As with most things of value, loyalty doesn’t come easy and is something that has to be earned, but once it has been it can provide the stability and impetus for continuous improvement and growth.

  • 11 Jul 2012 12:00 AM | Anonymous

    The Outsourcing Yearbook 2012 Summer Supplement is now available!

    Outsourcing Yearbook returns, this time with award winning content!

    Featuring case studies from European Outsourcing Association Award winners and nominees, this supplement gives you access to some of the crème de la crème in pan-European business services. There are also exclusive insights and opinions from the speakers at the NOA’s 25th Anniversary & Conference.

    As if that wasn’t enough, you can keep your finger on the pulse of the outsourcing zeitgeist: the full results of the NOA and sourcingfocus.com UK Onshoring survey are on show, outlining what our readers think the UK needs to make it the global strategic hub for outsourcing. Interesting reading, with some surprising answers.

    What more could you want for FREE?

    We hope you find the supplement interesting and useful! Please click here.

  • 11 Jul 2012 12:00 AM | Anonymous

    A post on the Government’s G-Cloud blog details that the contract allowing the public sector to purchase via the G-Cloud framework has been extended.

    The option to purchase through the framework has now been extended until 13 November 2012 in order to allow Whitehall to deal with supplier and customer queries.

    The post states that queries included feedback “about the basic philosophy and mechanics of G-Cloud.”

  • 11 Jul 2012 12:00 AM | Anonymous

    Business Secretary Vince Cable has announced an increase in funding for the UK aerospace industry, detailing investment of £120 million.

    Cable said that the backing is intended to modernise development in areas such as green technology while maintaining the UK’s position as the world’s second-largest aerospace producer. The funding comes at a time of increased demand and expansion, with foreign development driving overseas export potential.

    Cable commented that: "I don't feel at all embarrassed by saying that this sector is one of our winners. There is nothing wrong with the government getting behind successful sectors in the economy."

  • 11 Jul 2012 12:00 AM | Anonymous

    A report released by IDC forecasts that the analytics software market will reach $50.7 billion by 2016.

    The report forecasts year-on-year growth of 9.8 percent, and notes that analytics software in data warehousing grew by 15.2 percent last year, while applications and business intelligence grew by over 13 percent.

    Dan Vesset, vice president of IDC's Business Analytics Solutions, said: "The demand for business analytics solutions is exposing the previously minor issue of the shortage of highly skilled IT and analytics staff".
