Industry news

  • 19 May 2011 12:00 AM | Anonymous

Traditionally, outsourcing application development has enabled organisations to reduce IT operating costs, tap into specialist skills and free core, internal IT staff to focus on strategic tasks. The need for CIOs to capitalise on these benefits looks set to continue; in fact, a recent survey by Harvey Nash found that application development remains the most intensively used outsourced service among CIOs.

    Fast turnaround on application development projects is, of course, a key metric against which success is measured. It’s about getting the job done within budget and timescales and, for the client, cost is – and will continue to be – a principal driver. For the provider, delays not only mean lost revenue, but over-runs can also have a longer term impact on their reputation and competitiveness.

Of course, taking an application from idea to deployment is not without risks and complexities, so equipping teams with tools that enable fast completion and remove project complexity can make a considerable difference between success and failure.

What’s more, in this cloud-driven, multi-device age, the ability to ‘future-proof’ applications is going to be an increasingly important success factor. When creating a new application, an organisation may not always know exactly which platforms or devices it will require in the future. It can be particularly difficult to predict what changes lie ahead in this era of mobility and cloud, where on-premise applications may need to be broadened to allow different deployment modes such as SaaS, or taken to new platforms, for example porting a mobile application from Windows to Android.

New approaches are now helping development teams faced with such challenges to simplify the code-writing process, reduce development time and then distribute an application through a number of different deployment channels. These application platforms use pre-compiled, pre-configured business logic containing pre-written coding functionality and services. The result is typically fewer coding mistakes, faster project completion and the ability to adapt applications to a business’s changing needs. For example, such a platform gives teams the option of building and running a cloud application offering, in addition to a client-server on-premise deployment model, from a single development process. The application can be repurposed at any time for a different channel without the need to re-code it entirely from scratch.
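The "one development process, many deployment channels" idea can be illustrated with a minimal sketch. Everything below is hypothetical (the function and class names are invented for this example, not any vendor's API); the point is simply that when business logic is written once and each channel is a thin wrapper, repurposing the application does not mean re-coding it:

```python
# Hypothetical illustration: core business logic written once,
# then exposed through interchangeable deployment channels.

def calculate_order_total(items):
    """Core business logic, shared by every channel.
    items is a list of (quantity, unit_price) pairs."""
    return sum(qty * price for qty, price in items)

class OnPremiseChannel:
    """Wrapper for a client-server, on-premise deployment."""
    def serve(self, items):
        return {"total": calculate_order_total(items), "channel": "on-premise"}

class CloudChannel:
    """Wrapper for a SaaS/cloud deployment; only the wrapper differs."""
    def serve(self, items):
        return {"total": calculate_order_total(items), "channel": "cloud"}

# Repurposing for a new channel means adding a wrapper, not rewriting logic.
order = [(2, 10), (1, 25)]
for channel in (OnPremiseChannel(), CloudChannel()):
    print(channel.serve(order))
```

In a real application platform the "wrappers" would of course be generated tooling rather than hand-written classes, but the separation of concerns is the same.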

The benefits work both ways. For the outsourcing partner, this is about delivering choice to the client: enabling faster project turnaround and empowering customers to take applications to whichever device and platform they wish with minimal impact on time and resources. The ability to offer this kind of differentiated service - whether it’s creating an application to help sales staff access customer data from their mobile phones or enabling field engineers to access stock information more quickly - could also be the route to a wider market and greater business opportunities in the future.

  • 17 May 2011 12:00 AM | Anonymous

It is understandable for in-house DBAs (database administrators) to view a remote database support service with some suspicion. After all, managing the day-to-day running of the database is their primary role. However, depending on the size of the organisation, DBAs are frequently asked to take on more than just the traditional day-to-day management and monitoring of a database.

In a survey by IDUG and CA, only 31% of respondents in small organisations (1-1,000 employees) said they were primarily involved in database administration. This means that more than half are primarily involved in other activities that take them away from day-to-day administration and maintenance. DBAs, especially those in smaller teams, carry huge responsibility as their workload increases and they are pulled into different projects whilst still having to cope with day-to-day tasks.

    In many small to medium sized organisations there may be just one or two DBAs managing a mission critical system. Perhaps those DBAs would actually welcome an extra pair of hands. Would they object to having another full-time team member to help carry some of the load? Probably not, in fact another team member would likely be welcomed with open arms. So why shouldn’t this extra pair of hands be a virtual team member rather than another person in the office?

For IT management there are compelling financial reasons for taking the decision to outsource database management. There’s a tipping point for most IT teams in small to medium-sized organisations where they go from having just enough resource to cope, to a point where they absolutely have to make a change. When this point is reached, it’s time to decide: hire a new employee or take out a managed service contract. Many organisations want to keep their existing talent and knowledge in-house, especially in the world of DB2 and database management, and so discount outsourcing.

However, the cost and time of recruiting and retaining an experienced DBA can be prohibitive, so sometimes the status quo is maintained and an already stretched DBA team is stretched further. The thought of “outsourcing”, particularly in small to medium-sized organisations, can seem hugely daunting and can leave people fearing for their jobs. It doesn’t have to be that way, though.

There are many advantages for in-house DBAs if their organisation chooses to use the services of a remote support partner rather than employ an extra member of staff, not least because the partner doesn’t have to be added to the coffee round! The DBA team gains the flexibility to hand off the areas of maintenance and administration that are most time-consuming and labour-intensive, freeing them up to work on other projects. Or perhaps they simply want to be rid of the out-of-hours calls.

A remote support solution can and should take the strain off the DBA team by looking after day-to-day support issues, and it is usually far cheaper to take out a database support package than to hire and retain an extra member of staff. The DBA team receive the extra pair of hands they need, and IT management get a cost-effective solution for managing their critical databases.

  • 17 May 2011 12:00 AM | Anonymous

The mainframe has been on a rollercoaster ride over the last 30 years; from hero to zero and now back to hero again. In the late ‘90s the mainframe began to lose its “cool” new-technology status to distributed platforms. Now the mainframe is coming back into fashion and, rather than moving away from it, many organisations are building on its solid foundations.

    According to research by CA, mainframes still handle 55% of companies’ data; this rises to 59% for companies with more than 3000 staff. In the latest figures from IBM, System z revenues were up 69% year-on-year in the fourth quarter, so it seems that mainframe business is currently experiencing healthy growth.

There have been many reports of a looming mainframe skills shortage. Is this just media hype, or is there some truth behind the claims? And if so, what does this mean for UK business?

At a recent IBM customer event on DB2 for z/OS technologies, one of the first questions from the audience concerned IBM’s plans to cope with the skills shortage on the horizon. There was general consensus in the room that UK businesses running IBM mainframes are concerned about how they will manage over the next ten years with so few new mainframe professionals entering the market.

    I put the question to a LinkedIn community of DB2 z/OS users and got some interesting responses including:

    “There is a big shortage of “Mainframe” skills, and it will get bigger.”

    “From my experience, I can say that there is definitely a shortage on the horizon...I would judge that there is going to be a major crisis in the financial services area.”

    Speaking to DB2 z/OS consultants in our own organisation I have seen similar sentiments expressed:

    “While I don’t think the issue has really started to bite on the DB2 side yet, you only need to look at slightly older technology such as IMS to see the pattern. Good IMS skills are in very short supply nowadays.” Julian Stuhler, Director, Triton Consulting

    What is causing the problem?

There are varied reasons for the looming gap in mainframe skills. Many of the first crop of mainframe experts - the key technical heavyweights who put the mainframe where it is today - are heading towards retirement age. In addition, many mainframe administrators were re-trained and re-deployed onto distributed systems.

For the last 10 years we’ve heard that the mainframe is dead, but the reality is that the mainframe, far from being dead, is actually growing. However, this “death of the mainframe” rumour has led to trends that have seen new entrants to the IT workplace concentrate their training and career paths on distributed systems, leaving a gap that needs to be filled.

    “There are very few young people involved in zSeries – right across the skill base” James Gill, DB2 z/OS Consultant, Triton Consulting.

    Effect on business

    The banking and financial services sectors rely heavily on the power of IBM mainframe servers and so are likely to be amongst the hardest hit if mainframe skills begin to dwindle in the marketplace. With so much resting on the successful management of mainframes and the applications that run on them it will be vital over the coming years for the financial services and banking sectors to address potential staffing issues.

66% of all respondents in the CA research agreed that mainframe users will soon start to suffer, if they haven’t already, from a shrinking workforce, with the relevant skills no longer readily available.

It is also worth mentioning how immigration legislation is affecting the UK job market, particularly with reference to technical skills. Bringing in skills from abroad has been made much more difficult by the introduction of a points-based entry system, and organisations are having to go through far more bureaucratic processes to secure work permits for potential employees.

    What are the big players doing to address the problem?

IBM is not blind to the issue and is actively encouraging new blood into the mainframe world by working with universities to have mainframe modules included in undergraduate computing courses. There are currently 19 universities teaching System z topics. Sheffield University has been running a course as part of its undergraduate BSc degree programme for the last three years, with Liverpool John Moores University following suit in January 2012. The University of the West of Scotland is about to become the first Scottish university to run System z courses, starting in September this year.

    Cally Beck is the Academic Initiative Leader at IBM and says “I work with large enterprise clients to help them attract, find and retain young z skills, particularly on the development side. I know it is a serious issue right across the server platforms. I also get requests for Information management, DB2, Rational and Websphere skills. Although I see requests from regions outside Europe, by far the most comment comes from Europe, and in particular UK and Germany.”

In terms of DB2, IBM has for some time been working hard to reduce the mainframe-specific skills necessary to manage a DB2 for z/OS environment. While there will always be a need for some people with deep knowledge of the platform, many roles can now be performed using GUI tools that are more familiar and comfortable for younger generations and require less mainframe knowledge. The increased role of autonomics is also helping to reduce the need for some of the lower-level skills.

Another advocate of the mainframe is CA Technologies, which runs its own Mainframe Academy in a bid to grow mainframe skills within the IT community. It is also running a scholarship project until 2016 to help drive take-up - http://www.ca.com/lpg/mainframe-academy.aspx

    How to manage the change

    Grow your own

As we’ve seen above, there is currently a significant effort to increase the number of UK graduates entering the workforce with some level of System z expertise. One possible solution, then, is for organisations to take these young professionals and grow them into the experts of the future. This will of course take time, expense and effort, but it should be worth the investment in years to come.

    Training

Organisations need to prepare by looking internally at the skills they already have in-house. In a CA survey, 42% of financial services organisations said they are currently looking at additional skills and training needs. Re-training and up-skilling existing DBAs (database administrators) will ensure DB2 mainframe skills are retained within the organisation. This approach, backed up by bringing in external expertise where necessary, could be a more cost-effective option in the short term than growing a DBA from scratch.

    Outsourcing

    Wholesale outsourcing of mainframe services is certainly an option but it does bring with it some potential complications. Outsourcing all mainframe services to a third party means that the ingrained organisational knowledge of those currently managing the system can be lost. Although the outsourcing provider is no doubt highly skilled, they don’t have that intimate knowledge of the organisation which is built up over many years.

    A better option is “partial outsourcing” where specific areas of mainframe technology support are outsourced to a niche service provider. In this way the organisation keeps a certain level of in-house knowledge but can also have back-up where necessary from experts in the field. This partial outsourcing approach supports existing staff and can help bridge the gap when skills are in short supply in-house.

    All of the above

As already discussed, both IBM and CA are putting huge amounts of money and effort into training the next generation of mainframe experts by running education initiatives in the UK. However, training university students takes time to filter through the system, and even more time is needed for organisations to train them and give them sufficient experience. By combining this “new blood” with training for existing staff and working with specialist service providers, organisations can build a strong resourcing plan for the years ahead.

Triton Consulting offers a range of DB2 z/OS training, resourcing and outsourcing services.

  • 17 May 2011 12:00 AM | Anonymous

    The pros and cons of cloud computing may be regularly debated by the IT team but what about the rest of the business? How many Financial Directors (FDs) are still afraid to take the plunge? Most are familiar with the compelling benefits of lower capital expenditure, improved financial control and rapid deployment – all key issues when facing an over-worked IT department struggling to find the time to even discuss business requirements, let alone oversee the delivery of new software. So why the reticence?

    It is, perhaps, understandable that an FD or CFO may be unwilling to rush headlong into the cloud with a business critical application such as the core financial software. But there are other aspects of the financial portfolio that provide a fantastic opportunity for testing the viability of the hosted model.

    As Karen Conneely, Group Commercial Manager at Real Asset Management, explains, those companies that opt for a hosted Fixed Asset Register, can rapidly discover the benefits of the cloud and prove the long term viability of the hosted model for the entire software portfolio.

    Flexible Business

    Demand is growing globally for hosted solutions, as organisations wrestle with continuing financial constraints that are now seriously hampering ongoing business development. With many organisations continuing to pare back internal IT resources, business managers are having to wait months to get access to critical IT skills. They are facing a near complete lock down on the capital expenditure required to deploy much needed new software solutions; indeed they are even struggling to ensure compliance-critical upgrades of financial software are completed on time.

    In contrast, the option to leverage a highly secure, hosted third party solution that can be delivered within days, rather than months, is compelling. Add in the appeal of monthly or annual subscription rather than the huge upfront cost of a perpetual licence, and the hosted model has clear financial and business value.

So why is it the IT department, rather than the FD, that is driving the move to the cloud? Is this reticence based on educated mistrust and a justifiable decision not to hand responsibility for critical financial data to a third party, or on a simple fear of the unknown?

    Buying Decision

    The decision on whether or not to exploit the benefits that the hosted model can offer is business critical. And while FDs will be understandably reluctant to trial this approach with the core ERP or Financials software, why not dip a toe in the water with other key applications, such as fixed asset management? Indeed, a hosted fixed asset register offers a raft of additional benefits. Automated upgrades ensure the software is always up to date – a key consideration for compliance requirements, particularly in relation to the latest IFRS and SORP regulations affecting UK organisations; while the hosted model also ensures the upgrade process can be achieved without any disruption or dent in user productivity.

    Furthermore, a hosted model provides access to the system from any location, allowing the FD – or other members of the finance team – to run reports, analyse asset information and check depreciation at any time, further boosting productivity.
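The depreciation figures an FD would check in such a system typically come down to a simple calculation. As an illustration only (a generic straight-line method, not tied to any particular vendor's software), the annual charge and carrying value can be sketched as:

```python
def straight_line_depreciation(cost, salvage_value, useful_life_years):
    """Annual straight-line depreciation charge:
    (cost - salvage value) / useful life in years."""
    return (cost - salvage_value) / useful_life_years

def net_book_value(cost, salvage_value, useful_life_years, years_elapsed):
    """Carrying value after a number of full years,
    never falling below the salvage value."""
    annual = straight_line_depreciation(cost, salvage_value, useful_life_years)
    return max(cost - annual * years_elapsed, salvage_value)

# An asset bought for 10,000 with a 1,000 salvage value over 5 years
# depreciates by 1,800 per year.
print(straight_line_depreciation(10_000, 1_000, 5))  # 1800.0
print(net_book_value(10_000, 1_000, 5, 3))           # 4600.0
```

Real fixed-asset registers support several depreciation methods (reducing balance, units of production, and so on) and period-level rather than annual calculations, but the straight-line case shows the kind of figure being reported.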

    With the option to get the new solution up and running within just five working days, organisations can rapidly meet corporate demands for improved fixed asset management, including the adoption of mobile asset recording via PDAs. With minimal up front expenditure, the improvement in asset accuracy combined with the reduction in manual overhead delivers a very quick return on investment (ROI).

    Measured Decision

    Of course, before any such buying decision can be made, FDs need to understand exactly what is on offer. Security is obviously key – no organisation wants to expose its list of fixed assets to the world at large.

The options are clear: does the business want to opt for a dedicated hosted server, or the slightly less secure, and less expensive, cloud-based solution where resources are shared on a virtual machine? Either way, the servers should be located in a highly secure, multi-million-pound data centre facility that offers security far higher than that of in-house systems.

Organisations also need to consider how the business will access the system. One option is a dedicated Virtual Private Network (VPN), which would further reinforce security, but organisations must ensure the communications bandwidth will deliver the required performance.

    Conclusion

    Whatever route a business decides to explore, there is growing pressure on FDs to at least try out the hosted model. Yes, a large proportion of risk averse FDs may well be unwilling to opt for a major financials implementation as a first venture into the hosted model. But even if the global financial situation radically improves and IT suddenly receives a massive input of resources, the finance team should be at least considering the financial and speed to delivery benefits on offer. It is time to test the waters of cloud computing.

  • 17 May 2011 12:00 AM | Anonymous

Autonomy Corporation plc, a global leader in infrastructure software, and Iron Mountain Incorporated, the information management company, have announced a definitive agreement for Autonomy to acquire selected key assets of Iron Mountain's digital division, including archiving, eDiscovery and online backup, for $380M.

    Post-closing Autonomy expects to have a gross cash balance of at least $700 million and expected go-forward revenues of approximately $130 million to $140 million.

    "We are pleased to announce this transaction, one we have been looking at for some time, which will bring significant advancements for customers," commented Dr. Mike Lynch, Group CEO of Autonomy. "Processing customer data in the cloud continues to be a strategic part of Autonomy's information governance business. We look forward to extending regulatory compliance, legal discovery and analytics to a host of new customers as well as enabling the intelligent collection and processing of non-regulatory data from distributed servers, PCs and especially tens of millions of mobile devices. This will afford the opportunity to bring to these customers the power of IDOL's meaning-based technology."

    Dr. Lynch continued, "In 2007 we correctly predicted the merging of regulatory archiving and search, and we believe we are now seeing the next phase where the convergence of regulatory archiving, back-up and data restoration with operational processing of data in the cloud is coming to pass. This acquisition makes Autonomy the cloud platform of choice, processing and understanding 25 petabytes of customer information. IDOL will allow significantly more value, analytical insight and return to be generated for our customers from this cloud platform. This places Autonomy at the centre of the changes in the analytics of unstructured data, processing in the cloud-based platforms and desktop virtualization.

    We've had the opportunity to buy strong assets at an attractive valuation, and through the application of our intelligent IDOL dark server technology will greatly increase the efficiency of this offering. Whilst others may be creating roadmaps around virtualizing the enterprise and processing information in the cloud, Autonomy is already doing it in the world's largest private cloud platform."

  • 17 May 2011 12:00 AM | Anonymous

Multinational corporations (MNCs) have increased their uptake of cloud services by 60 per cent in the past 12 months, according to a new Ovum report entitled What MNCs want: adoption of cloud services.

    The survey questioned senior managers at 102 multinationals, each with at least three locations in different countries, and a minimum of 10,000 employees.

    Some 45 per cent of multinationals are using the cloud for at least some element of their key IT services, the study found.

    "We believe the majority of MNCs are currently between early and adolescent adoption phases of cloud-based services, with broader and deeper adoption being contemplated," said Evan Kirchheimer, practice leader, enterprise services, Ovum.

  • 17 May 2011 12:00 AM | Anonymous

    HP Enterprise Services has announced Dollar Thrifty Automotive Group has signed a three-and-a-half year, $72 million technology outsourcing services agreement to continue supporting DTG’s applications portfolio and managing its technology infrastructure as a foundation for the company’s future growth.

    With this agreement, DTG will move to a hybrid delivery model using a combination of dedicated HP onsite support and leveraged services. This delivery model allows DTG to build, manage and consume technology services that will optimize planning, deployment and management.

    “In today’s competitive market, our technology needs to evolve in a way that increases its flexibility to grow with us while continuing the reliability with which we are accustomed,” said Rick Morris, executive vice president and chief information officer, Dollar Thrifty Automotive Group. “Together with HP, we will continue creating innovative solutions to help us deliver outstanding results to our shareholders and customers.”

  • 17 May 2011 12:00 AM | Anonymous

    Advice on alternative ways of delivering public sector services is to be offered by the Co-operative Group.

    The company said it had set up a joint venture with law firm Cobbetts and the Westminster Bridge consultancy to help people considering setting up an enterprise.

    The move is part of the Co-operative Group's ethical plan, launched earlier this year, to support the growth of co-operatives.

    Chief executive Peter Marks said: "We have heard a great deal from all the mainstream political parties about the co-operative model delivering public sector services but up until now we've lacked a one-stop shop where people can get the advice and assistance necessary to turn great ideas into practical solutions.

    "We know the co-operative model works. Indeed, we have been around since 1844 and in recent years we have enjoyed a renaissance which has seen us double sales, profits and membership.

    "However, we have a purpose beyond profit, with a vision of creating a better society and we believe this new venture will enable us to help communities across the UK deliver important services in an alternative way."

  • 17 May 2011 12:00 AM | Anonymous

    Leading sourcing adviser joins forces with Chinese research and consulting firm to advise sourcing parks, local governments on adopting global standards.

    TPI, an Information Services Group company and the largest sourcing data and advisory firm in the world, today announced the formation of a partnership with Devott, a well-known Chinese outsourcing research and consulting firm, to encourage sourcing parks and local governments across China to adopt global standards and best practices.

    TPI and Devott have created Sourcing International Operational Protocols, a program designed to help attract more leading service providers to the region and make buyers more confident about outsourcing services to China. The program consists of an in-depth review of the current outsourcing environment, the development of improvement plans, and an online self-assessment.

    The self-assessment will be available on the popular ChinaSourcing portal (www.chnsourcing.com) beginning next month. The firms will also jointly publish an Annual Global Best Sourcing Parks report, an in-depth study highlighting best practices and services around the world. The 2011 report will focus on China.

“TPI is excited to contribute our sourcing expertise and knowledge of both the Chinese and global sourcing markets to this partnership,” said Michael Rehkopf, Partner & Director, North Asia, TPI. “This partnership will build on our success establishing sourcing protocols with the Chinese central government and local governments since 2009. Together with Devott, we look forward to helping China develop into a thriving destination for outsourcing.”

The Chinese market bears similarities to where India was ten years ago, before its rise to become a major player in the global sourcing industry. China’s large, underpenetrated domestic market, its growing scale and experience, and the government as a potential buyer all present opportunities for significant future growth. Global trends such as the increased prevalence of offshore delivery and multi-sourcing could also serve as catalysts.

    “For China to become a prime location for sourcing service providers from around the globe, market participants here must follow international protocols,” said Haitao Qi, CEO of Devott. “We are partnering with TPI because it shares our commitment to helping improve the sourcing environment in China for both the sell side and the buy side of the industry.”

  • 17 May 2011 12:00 AM | Anonymous

Imperial College London and University College London are to set up a “smart cities” centre this year in Shoreditch, east London, that will work with commercial partners.

Chancellor George Osborne said the centre would focus “on the massive amounts of data - energy data, transport data, social data - being generated in the world’s metropolises”.

    “This ‘smart cities’ research centre will develop new technologies, in partnership with leading companies, to harness and exploit these huge new data sets.

    Peter Walker, Information Builders UK country manager:

    “We’re rapidly entering a world where everything can be monitored and measured. However, the big problem is going to be the ability of non tech users to analyse and make sense of the data.”

    “It is great to see the government taking positive steps to recognise the true benefits of data analytics. With this approach, key departments will be able to access statistical and other reporting tools that expose the hidden data that is driving change and growth.”

    However, Dr Giles Nelson, deputy CTO for Progress Software, believes the government must now look forward to adopting real-time technology:

“For the government, the flow of data currently being dealt with is enormous. While there can be no denying that today’s announcement is a positive move to handle the explosion of data, the next step will be to capture data in real time in order to respond proactively to events. Only by learning how to apply real-time technology effectively will the government be able to get a true handle on data.”
