Industry news

  • 30 Aug 2012 12:00 AM | Anonymous

    Avecto examines the difference between proactive and reactive digital forensics and explains their contribution to the fight against malware and malicious activity.

    For a number of years, digital forensics has referred to ‘the application of computer investigation and analysis techniques to gather evidence suitable for presentation in a court of law’. While collecting this digital evidence, to be used retrospectively in subsequent litigation, is a valid activity, there is growing support for a more proactive proposition.

    Organisations need all the help they can get if they’re to adequately fight back against malware proliferation and malicious activity. We’re about to witness a new dawn for digital forensics.

    Digital Forensics by Any Other Name

    We’re all familiar with the risks our enterprises face from rogue or untrained IT administrators gaining access to corporate servers and wreaking havoc. This can be anything from accidental and/or unwanted changes and bad IT practices to corporate espionage and malicious revenge attacks. This has been a key driver for organisations to develop and store an audit trail of privileged activity across the network, providing clear visibility of what’s taking place and who is performing it. More recently, this trail has also been critical to verifying an organisation’s compliance with legislation.

    These activity logs, often touted to auditors as irrefutable evidence of the organisation’s regulatory stance, are, to all intents and purposes, examples of digital forensics in action.

    Digital forensics can be split into two practices – proactive and reactive forensics. Let’s look at the evidence:

     Reactive Forensics

    As the name suggests, reactive forensics looks at something that has already happened then, retrospectively, conducts a post mortem and analyses the witnessed behaviour to glean what can be learned to prevent it happening again. Often considered the more traditional approach to security, it forms the bedrock of a number of security applications - such as firewalls and anti-virus software.

     Proactive Forensics

    Conversely, proactive forensics is the practice of looking for something in advance, based on high-level, forward-looking rules. Rather than responding to a situation, proactive forensics can act as an early warning system, using key characteristics to identify behavioural changes in applications, detect anomalies in network traffic or spot unexpected alterations to system configurations. It requires a very high-level view of everything that’s going on across the entire network. However, to be truly effective it must also be capable of issuing timely alerts when something erroneous occurs.

    Neither, nor, but a combination of both

    The way I see it is both elements go hand in hand. You can’t build good proactive monitoring systems without first knowing what to look for. However, that’s just one element as it’s only as strong as the rules you use to analyse the information that’s coming back.

    And therein lies the problem - they’re both based on rules. Unfortunately, malicious code writers and insider attackers don’t play by the rules, so it’s always going to be an ongoing struggle.

    Ultimately, what it boils down to is the organisation’s ability to create and effectively use an intelligent set of rules to filter the evidence digital forensics correlates, looking for pre-determined behaviour or system configuration changes that it is not expecting.

    For example, the use of a privileged identity can be a key indicator of suspicious activity, especially in applications that would not normally require admin rights to run. Take a web browser, for instance: if it were to ask for admin rights, any early warning system should flag that something untoward may be about to occur. From this proactive position, it should then reactively assess the request to determine its legitimacy. It could be something benign, such as installing a trusted ActiveX control, or it could be sinister, such as a drive-by download trying to gain admin rights to take control of the system.
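    The elevation-request rule described above can be sketched in a few lines. This is illustrative only: the process names, the allow-list approach and the return values are invented for the example, and real privilege-management agents expose their own event schemas.

    ```python
    # Minimal sketch of an early-warning rule: flag admin-rights requests
    # from applications that should not normally need them.
    # Process names here are hypothetical examples.

    # Applications expected to legitimately request admin rights.
    EXPECTED_ELEVATION = {"msiexec.exe", "backup_agent.exe"}

    def classify_elevation_request(process_name: str) -> str:
        """Return 'allow' for routine elevation, 'alert' otherwise."""
        if process_name.lower() in EXPECTED_ELEVATION:
            return "allow"   # routine elevation, no alert raised
        return "alert"       # e.g. a web browser asking for admin rights

    print(classify_elevation_request("chrome.exe"))  # -> alert
    ```

    In practice the "alert" branch would feed the reactive side of the process, triggering a closer assessment of the request rather than an automatic block.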

    Tomorrow’s too late

    A further complication for organisations is making timely use of the information being generated by the disparate security systems in use across the enterprise. If you don’t have the ability to process and make sense of all the information then ultimately it’s just more data taking up room.

    Instead, the data needs to be fed into a single repository capable of processing this constant, high-volume flow of information and alerting those responsible when something erroneous occurs.

    For an organisation to be able to identify the one little nugget that might suggest that something bad has happened, or is about to happen, it needs good rules. Otherwise it risks the clues being missed and the alert not sounding or, if it’s too sensitive, the alert being hidden amongst all the generated ‘noise’.
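    The sensitivity trade-off above can be made concrete with a simple scoring sketch. The indicator names, weights and threshold are invented for illustration; a real system would derive them from its own rule set and tuning.

    ```python
    # Illustrative only: combine several weak indicators into one score,
    # with a threshold embodying the sensitivity/noise trade-off.

    INDICATOR_WEIGHTS = {
        "unexpected_elevation": 5,   # strong signal on its own
        "config_change": 3,
        "odd_hours_login": 2,        # weak signal on its own
    }

    def alert_score(indicators: set) -> int:
        """Sum the weights of the indicators observed for one event."""
        return sum(INDICATOR_WEIGHTS.get(name, 0) for name in indicators)

    def should_alert(indicators: set, threshold: int = 6) -> bool:
        # Set the threshold too low and real alerts drown in noise;
        # set it too high and the one telling clue is missed.
        return alert_score(indicators) >= threshold
    ```

    With these example weights, a single late-night login stays below the threshold, while an unexpected elevation combined with a configuration change raises an alert.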

    As you can see, this balancing act is exceptionally complex. Organisations need to build, or deploy, intelligent tools capable of dealing with the volume of information. It’s about understanding what to look for and using powerful tools to accurately determine something truly malicious that requires intervention. If this expertise lies in-house, that’s fantastic. Alternatively, solutions are available that deliver the necessary intelligence.

    While some might argue that prevention is better than cure, even the best antidote needs an initial injection of venom to stimulate the production of antibodies. Digital forensics will become increasingly important as part of a security programme; can you afford to let the clues slip through your virtual fingers?

  • 29 Aug 2012 12:00 AM | Anonymous

    In an economic environment that provides no leeway for mistakes and when a short burst of downtime can eradicate all profit margins, organisations of every size need to improve the performance and uptime of equipment. Yet a lack of capital budget continues to constrain the vast majority of maintenance teams: how can any organisation expect to maximise asset value when critical information regarding history, performance, stock and resource utilisation is recorded manually?

    Times are changing. As Karen Conneely, Marketing Manager for Real Asset Management explains, cloud-based Software as a Service (SaaS) overcomes the traditional IT and budget barriers to provide a business with reliable, cost-effective software that is easily and quickly deployed. Cloud-based software doesn’t involve large, up-front licensing costs or massive internal IT overheads and, critically, delivers the high level of functionality required to improve performance in both proactive and reactive maintenance activity.

    Maximising Asset Value

    With the continued economic travails affecting virtually every sector, organisations are under increasing pressure to maximise asset value. From the school that needs to extend the life of essential IT and teaching equipment to the manufacturing site desperate to minimise downtime and improve productivity, asset maintenance is finally taking centre stage.

    Yet the vast majority of UK businesses are struggling to drive forward improvements. Most are still reliant on highly inefficient, paper-based processes or, at best, spreadsheets for information. With no immediate or accurate insight into the cost of maintenance, stock expenditure or stock availability, into asset performance history or trends in repairs, it is impossible to establish more efficient asset maintenance processes.

    Without trusted, accurate asset information, how can any organisation determine whether or not asset life can be extended without compromising reliability and performance?

    As a result, organisations are actually incurring more costs, needing additional personnel to record the information required to meet burgeoning health and safety and compliance regulations. If companies are to respond to the demands to maximise asset value, drive down costs and improve efficiency, it is essential to evolve beyond the realms of basic maintenance solutions and processes.

    New Model

    Of course, many organisations have been looking to embrace robust asset maintenance solutions but have been constrained by the lack of capital budget available. Not only have budgets been slashed, but the time and red tape associated with getting a new software solution onto a network can be disheartening. The reality today is that despite the essential nature of maintenance management and asset maximisation, the vast majority of organisations simply cannot justify the expenditure in new hardware, networking infrastructure or IT expertise to implement or upgrade a maintenance management solution.

    As a result, the growing availability of cloud-based maintenance management systems is compelling for many reasons. The model can provide unprecedented, low-cost access to highly functional software that can transform performance and ensure that today’s mobile workforces have full, real-time access to asset maintenance information, enabling them to reduce failure rates and improve equipment uptime.

    Critically, for every organisation, the subscription based, hosted model enables the maintenance team to sidestep time consuming expenditure approval and gain access to the new system within a matter of weeks.

    So what is holding companies back? With early security concerns allayed in an increasingly mature marketplace, there is no requirement for the more costly private enterprise cloud, with its dedicated resources demanding IT input and overhead and its large associated costs for construction and ongoing support. Opting instead for a SaaS solution removes that IT overhead, driving down costs whilst delivering secure data storage and backup, as well as automated software upgrades to ensure the business always has access to the latest functionality.

    Asset Insight

    Cloud-based Software as a Service can be deployed quickly and reliably, with minimal investment, consequently providing organisations with rapid access to the critical asset information required to improve performance in both proactive and reactive maintenance activity. The web-based model transforms accessibility – from providing mobile field engineers with real-time access to check asset history, request parts and update maintenance activity, to enabling managers to review and approve stock requests and allowing contractors to input information relating to work undertaken. This anytime, anywhere access, via tablet, phone or laptop, is critical to improving the timeliness and accuracy of asset information and resource utilisation.

    With detailed information about asset history, stock utilisation and asset performance, organisations can use in depth analysis and reporting to determine new maintenance strategies. Day-to-day performance can be measured via dashboards, enabling proactive intervention to address problems and minimise the risk of productivity damaging downtime.

    Combining day-to-day performance management with on-going strategic analysis, organisations can transform the efficiency of maintenance work, improve planning, stock utilisation and resource utilisation and gain a return on investment typically within 12 months.

    By removing the capex barrier, the cloud model provides maintenance teams with the chance to make a long overdue investment in technology. Rapid, low cost access to fully functional asset maintenance software will provide organisations of every size with the asset visibility required to transform performance. Having access to in-depth, real time asset history, performance, stock availability and costs, maintenance departments can begin to evolve and to embrace the proactive maintenance strategies required to extend asset life. This allows them to maximise value and to deliver the cost savings, efficiencies and performance improvements demanded by the business.


  • 29 Aug 2012 12:00 AM | Anonymous

    Over the past decade, there has been a tsunami of identity and access management technology. However, many organizations have not realised the benefits because they have taken a technology-led approach rather than one based on governance.

    The need to identify users, control what they can access and audit their activities is fundamental to information security. Over the past decade, there has been a tsunami of identity and access management technology designed to provide a solution to these needs. However, many organizations have not realised the benefits expected from the application of this technology, because they have taken a technology-led approach rather than one based on governance. In addition, the move to outsourcing and the cloud means that technology and some processes are no longer under direct control.

    What Is Governance?

    According to ISACA, a global association of 100,000 IT governance, security and assurance professionals, governance “ensures that stakeholder needs, conditions and options are evaluated to determine balanced, agreed-on enterprise objectives to be achieved; setting direction through prioritisation and decision making; and monitoring performance and compliance against agreed-on direction and objectives.” While management “plans, builds, runs and monitors activities in alignment with the direction of the governance body”, according to ISACA’s definition, governance sets the policies, procedures, practices and organizational structures that ensure the execution of strategic goals. Identity and access governance sets the framework within which identity and access technology and processes are implemented. By shifting the focus to control rather than execution, governance is also the ideal approach to manage identity and access in an outsourced environment like the cloud.

    Why Does Governance Matter?

    Good governance ensures that there is a consistent approach to risks and compliance across different lines of business and multiple laws and regulations. It can reduce costs by avoiding multiple ad hoc approaches to compliance and risk management. Identity and access governance ensures, in a consistent and efficient manner, that only authorized people have access to confidential and regulated data.

    The governance process leads the organization to evaluate risks in terms of their likelihood and business impact, and then to decide on the best approach to manage those risks. For example, choosing how to authenticate individuals accessing a system is a trade-off between the risk of impersonation, the value of the information and cost of the different authentication technologies. Where the impact, in terms of losses, would be high, it may make sense to choose a stronger (and more expensive) form of authentication than a username and password. Where the impact is low, a cheaper but less effective authentication process may be more appropriate. Governance provides a way to make this kind of decision effectively and consistently.
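    The authentication trade-off described above can be sketched as a simple selection rule. The impact levels, method names and relative strength/cost figures are purely illustrative assumptions, not values from any standard.

    ```python
    # Hedged sketch of risk-based authentication selection: choose the
    # cheapest method whose strength matches the business impact of
    # impersonation. All values below are invented for the example.

    AUTH_METHODS = [
        # (name, relative strength, relative cost)
        ("password", 1, 1),
        ("password + one-time code", 2, 3),
        ("hardware token", 3, 8),
    ]

    def choose_authentication(impact: str) -> str:
        """Map impact ('low'/'medium'/'high') to an authentication method."""
        required_strength = {"low": 1, "medium": 2, "high": 3}[impact]
        # Of the methods strong enough, pick the cheapest.
        candidates = [m for m in AUTH_METHODS if m[1] >= required_strength]
        return min(candidates, key=lambda m: m[2])[0]
    ```

    The point of the sketch is the shape of the decision, made once by the governance process and applied consistently, rather than re-argued system by system.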

    The Objectives of Access Governance

    The objectives of identity and access governance are to manage risk and ensure compliance in a consistent, efficient and effective manner. These objectives are:

    • Availability—Business data and applications are available when and where they are needed.

    • Integrity—Data can only be manipulated in ways that are authorized.

    • Confidentiality—Data can be accessed only by authorized individuals and cannot be passed to other individuals who are not authorized.

    • Privacy—Privacy laws and regulations must be observed.

    • Accountability—It should be possible to hold people, organizations and systems accountable for the actions that they perform.

    • Transparency—Systems and activities can be audited.

    Access Governance Process

    Access governance is not just about implementing access governance tools instead of provisioning tools; it is about implementing governance processes. The governance process is composed of three major phases. The initial phase is to understand the business needs and obtain approval for a plan of action. A key objective of this initial phase is to get executive sponsorship, which is critical to the success of any identity and access project. The second phase is to define the organizational needs and to produce a set of metrics and controls. The third phase is to monitor the controls and manage divergence. Governance requires well-described processes, guidelines and books of rules.

    Who Is Responsible?

    The responsibilities for identity and access lie with the lines of business, the owners of data and applications, and IT management. The actual division of responsibilities will vary among organizations, and the following provides an illustration.

    • The owners of data and applications services are responsible for classifying the sensitivity of data.

    • The lines of business managers are responsible for defining what access individuals within their organization should have to the applications and data.

    • The HR department, in conjunction with line management, is responsible for performing background checks on new employees, initiating the on-boarding processes that give the access to IT systems, and initiating the off-boarding processes that remove access rights for employees leaving the organization.

    • IT management is responsible for ensuring that the identity and access infrastructure is installed, configured and functioning correctly.

    • The legal department is responsible for setting up legal agreements for identity federation with partner and supplier organizations, as required by corporate management or line of business owners.

    • Lines of business owners are also responsible for the control of access to systems by external users such as customers and partners.

    Monitoring and Control

    In order to govern identity and access, there needs to be a set of measures against which performance can be judged. It is important that the performance at the IT process level can be related back to the strategic business requirements. For example, if a strategic goal of an organization is to comply with EU privacy legislation, then it needs to process the personally identifiable data that it holds within legally defined parameters. The identity and access processes necessary to meet these requirements include:

    • The organization needs to know what relevant data it holds and to classify this data accordingly.

    • Identity management processes need to correctly manage the user’s lifecycle in a timely manner.

    • The access management process needs to control which users have access to information. It also needs to ensure that users with privileged access do not make unauthorized access to data.

    • Processes must be in place to monitor and review which users have access rights to the personal data and which users have actually made access.
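    The review step in the last bullet can be sketched as a comparison between rights granted and access actually made. The user names and data-set labels are hypothetical; a real review would draw on the organization's entitlement store and access logs.

    ```python
    # Illustrative access review: surface rights that were never used
    # (candidates for removal) and accesses with no recorded right
    # (candidates for investigation).

    def review_access(granted: dict, accessed: dict) -> dict:
        """granted/accessed map user -> set of data sets (example schema)."""
        report = {"unused_rights": {}, "unauthorized_access": {}}
        for user, rights in granted.items():
            unused = rights - accessed.get(user, set())
            if unused:
                report["unused_rights"][user] = unused
        for user, used in accessed.items():
            extra = used - granted.get(user, set())
            if extra:
                report["unauthorized_access"][user] = extra
        return report
    ```

    Run periodically, a report like this gives the governance body the evidence it needs to tighten entitlements rather than let them accumulate.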

    Conclusion

    Managing who can access what is fundamental to information security and to compliance with laws and regulations. Experience has shown that a technology-led approach to this is not effective; what is needed is good governance rather than more technology. One way to attain this is by adopting a holistic governance and management framework such as COBIT 5. A full report on how to move to access governance is available from KuppingerCole.

  • 29 Aug 2012 12:00 AM | Anonymous

    The National Outsourcing Association is investigating strategies, both at corporate and national level, regarding what is already being done, and what could be done to make things better for the next generation. Please get involved – together we can make a difference.

    AGP Give Youth a Chance Evidence Questionnaire.pdf

  • 29 Aug 2012 12:00 AM | Anonymous

    Protesters are preparing to launch a series of campaigns against Olympic sponsor and IT outsourcer Atos.

    One of the campaigns will include a gathering outside Atos’ UK headquarters this afternoon, in remembrance of those who have died since being declared fit for work by the group.

    The demonstrations are aimed at highlighting perceived failings of Atos’ role in assessing disability employment support allowance.

    The Independent revealed today that more than 40 doctors and nurses employed by the French IT company had been reported for professional misconduct.

  • 29 Aug 2012 12:00 AM | Anonymous

    Research carried out by Virgin Media Business has revealed that a majority of UK CIOs predict that work landlines will, in the next five years, be made superfluous to requirements by the rise of smartphone technology.

    The research polled 500 UK CIOs, and revealed that 65 percent predicted that landlines would no longer be a common tool at work.

    Tony Grace, chief operating officer of Virgin Media Business, reported on the findings, saying that “businesses have recognised the importance of the mini-computers that smartphones have essentially become. This is leading us to rely increasingly on our smartphones and less on our landlines.”

  • 29 Aug 2012 12:00 AM | Anonymous

    Mobile giant Everything Everywhere has signed a five-year partnership agreement with Mastercard, to deliver mobile payment services.

    The partnership will include the delivery of Near Field Communications (NFC) payment accessibility to customers.

    Marion King, President of MasterCard UK, said: “As the use of cash continues to decline, we will be able to provide Everything Everywhere’s 27 million customers with an attractive range of new payment services backed by the processing power and security of MasterCard”.

  • 29 Aug 2012 12:00 AM | Anonymous

    The delivery of free Wi-Fi throughout the London Underground has been labelled as a ‘great service’ with mass uptake.

    The service was used by 443,000 users during the Olympic Games with the busiest day being the 1st of August with the success of Team GB.

    Gareth Powell, London Underground’s director of strategy and service development, said: “WiFi at Tube stations helped keep everyone moving and entertained throughout the Games with up-to-the-minute travel information and journey planners at their fingertips. It’s proving to be a great service and we expect it to be very popular during the Paralympics too”.

  • 29 Aug 2012 12:00 AM | Anonymous

    Virgin Trains have moved forward with court action against the decision by the government to award the West Coast Main Line contract to FirstGroup.

    The legal action is aimed at forcing a judicial review of the procurement process.

    Pressure has also been placed on the government by Labour, who want a delay on the decision so that MPs can review the contract.

    Founder of Virgin Group, Sir Richard Branson, commented: “In the bid process the one thing we do know is our bid was voted far more deliverable than FirstGroup’s”, adding: “we’re absolutely convinced they’ve got their maths wrong with FirstGroup.”

  • 28 Aug 2012 12:00 AM | Anonymous

    Global security contractor G4S has lost £50 million and seen a 60 percent dip in first-half profits following its failure to deliver on its Olympic security contract.

    The loss comes from payments for troop deployments, which have been partly responsible for a £151 million drop in revenue from the same time last year.

    Nick Buckles, G4S chief executive, commented: “We were deeply disappointed that we had significant issues with the London 2012 Olympics contract and are very grateful to the military and the police for their support in helping us to deliver a safe and secure Games".
