Sunday, December 19, 2010

Malware Threats

Found this excellent flow chart, which lays out the financial motivations behind malware and the technical paths by which malware threats are implemented.

http://computerschool.org/computers/malware/

Tuesday, November 30, 2010

MAEC and sample tools used to detect malware

Malware Attribute Enumeration and Characterization (MAEC) is a standard for representing malware by its attributes. MAEC provides a schema that can be used as a basis for creating malware repositories. It can also be used as the format for sharing malware information between applications. Broadly, either static or dynamic analysis techniques are used to discover the attributes of malware: static analysis is performed by looking at the code, and dynamic analysis by tracking the behaviour of the system. Once the attributes are discovered with either technique, applications can adopt MAEC to report on the discovered attributes of the malware.
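
To make the dynamic analysis idea concrete, here is a minimal Python sketch of one of its building blocks: take a snapshot of the file system before running a sample, take another afterwards, and diff the two. This is only an illustration of the technique, not how CWSandbox or ThreatExpert are implemented, and the monitored directory and tracked attributes (modification time and size) are my own assumptions.

import os

def snapshot(root):
    """Record (mtime, size) for every file under root."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
                state[path] = (st.st_mtime, st.st_size)
            except OSError:
                continue  # file vanished or is inaccessible
    return state

def diff(before, after):
    """Return files created, deleted, or modified between two snapshots."""
    created = sorted(set(after) - set(before))
    deleted = sorted(set(before) - set(after))
    modified = sorted(p for p in before
                      if p in after and before[p] != after[p])
    return created, deleted, modified

# Usage sketch: snapshot, run the suspect sample in an isolated VM,
# then snapshot again and report what changed.
# before = snapshot(r"C:\Windows\System32")
# ... execute the sample ...
# after = snapshot(r"C:\Windows\System32")
# created, deleted, modified = diff(before, after)

A real sandbox watches the registry, memory, and network traffic in the same way; the observed changes are exactly the kind of attributes a MAEC document would capture.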

CWSandbox is one of the tools that uses the dynamic analysis technique to report on detected malware. It is available for Windows and is yet to adopt MAEC.

ThreatExpert has tools for detecting malware on Windows; it works by looking at changes in the file system, memory, registry, and outbound and SMTP traffic data. Here is a sample report from a ThreatExpert memory scan on my system -

Full Scan Summary:
Scan details:
Scan started: Tuesday, November 30, 2010 20:15:23
Scan time: 01 minutes, 53 seconds
Number of memory objects scanned: 9356
processes: 60
modules: 3085
heap pages: 6211
Number of suspicious memory objects detected: 0
Number of malicious memory objects detected: 0
Overall Risk Level: Safe
Summary of the detected threat characteristics:
No suspicious characteristics detected.
Summary of the detected memory objects:
No suspicious memory objects detected.

For now, I could use the above tools, and I am looking for other free tools that can be used to detect malware. I am also looking out for tools that can report using the MAEC schema format. Will keep this blog updated on my findings. Adios till then.

Sources -
http://maec.mitre.org
http://mwanalysis.org
http://www.threatexpert.com

Thursday, September 30, 2010

Vulnerability Assessment and Management

A vulnerability is a weakness in a system that can be exploited. To find the weaknesses in a system, vulnerability scanning is performed. It involves running a program on one machine and then connecting over the network to the machines you choose to check.
This helps you find and fix weaknesses in systems before someone else finds them and decides to break in.

Vulnerability scanning is part of a defense-in-depth strategy and provides
• Asset discovery.
• The information necessary to ensure that hosts within an enterprise are safe from known attacks.
• Enough data for tracking the internal security posture over time.

Hence it is a key part of managing risk and will identify the risk of every system, not just the ones we know about.

To know the state of systems, we need to understand the weaknesses we are trying to defend against and, where possible, remove those weaknesses. For this we need a source of all known vulnerabilities and the patches available to address them. This can be done by monitoring individual software manufacturers or by getting consolidated notifications from a provider such as http://securitytracker.com.

The scanning can be performed either by deploying a vulnerability scanning application in the enterprise or by using SaaS offerings from VAM service providers such as http://securityspace.com or http://www.qualys.com.

Whenever a new update is released, the system administrator should evaluate it, determine its applicability to the organization, and then install it.

NIST has defined various SCAP (Security Content Automation Protocol) standards, which provide standard formats for collecting application/system attributes, performing assessments using specific tests, and displaying results. When vendors follow these standards in their output, the results can be easily consumed by other products.

Some of the standards that can be used with VAM are
CPE - Common Platform Enumeration, the format to follow for providing platform-specific details such as the attributes of an OS.
CVE - Common Vulnerabilities and Exposures, a way for product vendors to provide vulnerability and exposure information.
OVAL – Open Vulnerability and Assessment Language, an open standard from MITRE that enables automated assessment and compliance checking.
OVAL provides standard schemas for the entire assessment process:
1. Data collection
Collect data about the system under test; the collected data is captured in the OVAL System Characteristics schema.
2. Analysis
Compare the collected system characteristics against the machine states described in OVAL Definitions.
3. Results
Report the outcome of the assessment using the OVAL Results schema.
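
As a toy illustration of what an individual assessment test boils down to, here is a short Python sketch that flags installed packages whose versions are at or below known-vulnerable versions. The package names, version numbers, and the simple dotted-version comparison are illustrative assumptions; a real scanner would consume CVE/OVAL content and use far more robust version logic.

def parse_version(v):
    """'2.2.15' -> (2, 2, 15), so tuples compare like versions."""
    return tuple(int(x) for x in v.split("."))

def find_vulnerable(installed, vulnerable_at_or_below):
    """Flag packages whose installed version is <= a known-bad version."""
    findings = []
    for name, version in installed.items():
        bad = vulnerable_at_or_below.get(name)
        if bad and parse_version(version) <= parse_version(bad):
            findings.append((name, version, bad))
    return findings

# Hypothetical inventory (in practice, gathered from the package manager)
# and a hypothetical feed of known-vulnerable versions.
installed = {"apache": "2.2.14", "openssl": "1.0.0"}
vulnerable = {"apache": "2.2.15", "openssl": "0.9.8"}

for name, have, bad in find_vulnerable(installed, vulnerable):
    print(f"{name} {have} is at or below vulnerable version {bad}")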

Once the scanning is performed, useful metrics should be derived and reported on. Metrics such as the percentage of vulnerable systems and the time from discovery to remediation will benefit custodians and help update enterprise policy to improve compliance.

Vulnerability scanning on a regular basis generates a lot of data. Only by demonstrating that the data collected is of real benefit will the enterprise come forward to deploy a Vulnerability and Asset Management (VAM) application. New VAM deployments should use security products that are certified against the NIST standards.

References:
http://oval.mitre.org
Articles/papers from SANS reading room http://www.sans.org/

Wednesday, August 18, 2010

Incident Response Requirements of Massachusetts' Data Protection Law

Summarizing points from the presentation of John Moynihan on the Mass Law, focusing on incident response -
The Mass Law, 201 CMR 17, is effective from March 1, 2010. It applies to any entity collecting "personal information (PI)" of Massachusetts residents; to comply, all entities processing Massachusetts residents' personal information should have preventive measures in place. The law imposes severe penalties for violations. It requires having an incident response plan, applies to the handling of employee and customer records, and addresses internal threats to employee and vendor data.

There are Administrative, Technical and Physical Requirements to comply with the law.
Organizations should adhere to the administrative requirements by performing assessments of internal and external risk, having a written information security program, and developing policies to protect PI. This can be accomplished with ongoing employee training, an incident response plan, formal disciplinary standards, and third-party controls.

Adherence to the technical requirements means that PI must be encrypted, with updated virus protection and firewalls, controls for password protection, and measures to disable accounts after failed logon attempts. This can be accomplished with monitoring to detect unauthorized access, patch management, and access controls in place.

Adherence to the physical requirements means restricted physical access to PI and monitoring of the areas housing PI; this applies to both electronic and paper records.

It is evident from the requirements that having an incident response plan is essential. It must be organized in a timely and efficient manner, with engagement from independent participants. Organizations should adapt to change and evolve toward a proactive approach to data protection.

Saturday, July 31, 2010

DSCI Best Practices Meet

From the DSCI (Data Security Council of India) Best Practices Meet I attended this week (28th July 2010), here are some quick notes I would like to share.
DSCI has come up with 2 frameworks
- DSCI Security Framework (DSF)
- DSCI Privacy Framework (DPF)
Both DSF and DPF lay out best practices to be followed to achieve data protection. DSF focuses on security related to applications, infrastructure, business continuity, etc., while DPF is based on global privacy best practices and frameworks.
Implementing DSF would help companies achieve compliance with ease. I am sure that if all of the relevant best practices in each of the 9 disciplines of DSF are implemented in an organization, the compliance objective would be met without question. The benefits of implementing the DSCI frameworks were also presented in a session, and it was interesting to learn how they helped increase business profits. My quick notes end here; I would request you to visit http://www.dsci.in for more details.

Wednesday, June 30, 2010

Complex Event Processing

Complex Event Processing (CEP) provides a means to gain actionable information, in real time or near real time, from events coming from disparate systems. The increase in the number of attacks has increased the need for real-time processing of events, and hence the need for CEP systems and products. Experience with detecting attacks and vulnerabilities has shown that, beyond the individual events coming out of various systems, additional details must be aggregated, correlated, and analyzed. Most event-processing products in the market support a query language that allows pattern matching, joining events on arbitrary criteria, and creating time-based windows.

Like other security deployments, I see that deploying CEP for enterprises is a challenge: handling many events from multiple streams of data and monitoring queried events to detect abnormalities in real time. It would also need someone expert and focused to make use of the information derived from CEP systems.
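
To give a feel for one of those query-language primitives, here is a minimal Python sketch of a time-based sliding window that correlates events on an arbitrary key. The event type, the source-IP key, and the five-failures-in-60-seconds rule are illustrative assumptions, not any particular product's syntax.

from collections import defaultdict, deque

WINDOW_SECONDS = 60   # length of the sliding time window
THRESHOLD = 5         # matching events that trigger an alert

windows = defaultdict(deque)  # correlation key -> event timestamps

def on_event(timestamp, event_type, source_ip):
    """Feed events in arrival order; alert on bursts from one source."""
    if event_type != "login_failure":
        return None               # filter: only failures interest us here
    window = windows[source_ip]
    window.append(timestamp)
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()          # evict events that slid out of the window
    if len(window) >= THRESHOLD:
        return (f"ALERT: {len(window)} failed logins from "
                f"{source_ip} within {WINDOW_SECONDS}s")
    return None

# Usage with synthetic events, one failure every 10 seconds:
for t in range(0, 60, 10):
    alert = on_event(t, "login_failure", "10.0.0.7")
    if alert:
        print(alert)

A real CEP engine expresses the same window and join logic declaratively in its query language and evaluates it continuously over the incoming streams.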

Sunday, May 30, 2010

SIEM for Compliance and Risk Management

Security Information and Event Management (SIEM) technology is driven by the compliance and security needs of the enterprise. Another factor in its use is monitoring activity from different applications, devices, and technology software for internal and external threat and fraud detection. The output from these technologies can be used to automate IT processes to meet Compliance and Risk Management needs.

Compliance management can be achieved with the log management capability of SIEM technology, known as Security Information Management (SIM). This involves collecting logs from the various devices and applications in the enterprise to a central location, analyzing the log data, and providing the capability to generate useful reports that meet compliance requirements. The generated reports can be mapped to the organization's IT procedures as a first step toward IT automation. As SIM is compliance oriented, it is also a means to store and archive logs for later investigations and for data retention requirements.
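
As a toy illustration of the SIM side, here is a short Python sketch that parses centrally collected log lines and summarizes authentication failures per host for a report. The log line format and the failure pattern are my own assumptions; a real SIM normalizes many different vendor formats before reporting.

import re
from collections import Counter

# Assumed log line format: "<host> <program> <message>"
LINE = re.compile(r"^(?P<host>\S+)\s+(?P<program>\S+)\s+(?P<message>.*)$")

def summarize(log_lines):
    """Count authentication failures per host."""
    failures = Counter()
    for line in log_lines:
        m = LINE.match(line)
        if m and "authentication failure" in m.group("message"):
            failures[m.group("host")] += 1
    return failures

sample = [
    "web01 sshd authentication failure for user root",
    "web01 sshd session opened for user alice",
    "db01 sshd authentication failure for user admin",
]
for host, count in summarize(sample).items():
    print(f"{host}: {count} authentication failure(s)")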

Risk management can be achieved with the real-time monitoring and incident management capability of SIEM technology, known as Security Event Management (SEM). This involves collecting events from the various devices and applications at short scheduled intervals, correlating the related events, applying filters where required, and generating alerts that can be monitored, analyzed, and addressed within a short duration of time.
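
To make the correlation step concrete, here is a minimal Python sketch that flags a successful login following several failures for the same account within one collection interval. The event fields and the threshold are illustrative assumptions.

FAILURE_THRESHOLD = 3  # failures before a success becomes suspicious

def correlate(interval_events):
    """interval_events: (user, outcome) tuples in time order."""
    alerts, failures = [], {}
    for user, outcome in interval_events:
        if outcome == "failure":
            failures[user] = failures.get(user, 0) + 1
        elif outcome == "success":
            if failures.get(user, 0) >= FAILURE_THRESHOLD:
                alerts.append(f"possible brute force: {user} succeeded "
                              f"after {failures[user]} failures")
            failures[user] = 0  # reset once the account logs in
    return alerts

events = [("bob", "failure")] * 3 + [("bob", "success")]
print(correlate(events))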

Exploiting the capabilities of data mining and analytics to meet SIEM requirements is now the vision of some of the leading SIEM technology vendors. It is evident that SIEM technology which can scale, collect data from all applications, meet regulatory compliance reporting requirements, and improve threat management and incident response capabilities is an essential factor for the automation of Compliance and Risk Management in enterprises, now and in the future.

Friday, April 30, 2010

Fundamentals of Risk Management

Here are some details from a session on Leveraging Technology for Risk Management. The talk was part of the NASSCOM tech series, and Mr. Vijay from KPMG was an excellent speaker on risk management.

As in one of his slides, the fundamentals of risk management are to:
Know your risks
Know your obligations
Know your systems
Tie them together and leverage technology for risk management.

He also talked about use cases of risk management, including where it went overboard. One quote I recollect is 'What's the point in risk management if the result of the analysis is not used!'
He emphasized how risk management can really help an organization mitigate risk. Steps to initiate risk management would be to start small, merge physical and system access identities, collect incidents and gradually respond to them globally, and monitor security on a global basis.

To conclude with my understanding: many organizations now realize the need for risk management. To what extent technology can be leveraged in meeting risk management objectives will depend on the strategic plan and the steps taken in this direction.

Tuesday, March 30, 2010

IT GRC and Risk Management

For GRC to succeed in any enterprise, it should be a process driven by management, i.e., a top-down approach. GRC enables enterprises to have consistent, consolidated data, which in turn can serve audit, risk management, control, and compliance purposes. Good governance needs risk management.

Assessing IT risk is at the heart of enterprise GRC, and it can be performed using a qualitative or a quantitative approach. Data collection for arriving at risk can happen through many channels, such as offline interviews, web questionnaires, email, mobile, or other devices, but automation can only be achieved from technology resources. Risk calculation is an interesting part of risk management. Some of the input factors for arriving at risk are policy adherence, established controls for meeting compliance, the handling of incidents and events, and mitigating risk with timely remediation. NIST SP 800-30 and ISO Guide 73 provide ample guidance on how risk assessments should be performed.

There are a few products in the market which can automate data collection as evidence from technology sources and provide risk analysis reports. Risk is closely related to the business and specific to each organization. How much automation is really possible in risk management, and whether it will really be useful for enterprises, are open questions I am trying to answer now.
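
As a tiny example of what a qualitative risk calculation can look like, here is a Python sketch in the spirit of the likelihood-times-impact matrix described in NIST SP 800-30. The numeric scale and the thresholds are my own illustrative assumptions, not values prescribed by the standard.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_rating(likelihood, impact):
    """Combine qualitative likelihood and impact into a risk level."""
    score = LEVELS[likelihood] * LEVELS[impact]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_rating("high", "medium"))  # -> high
print(risk_rating("low", "medium"))   # -> low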

Thursday, February 18, 2010

Attacks with Virtualization

Virtualization makes the provisioning and movement of virtual machines faster in enterprises today. But companies should make sure that they have implemented proper security controls for their virtual machines (VMs) and adhere to the company's compliance requirements and policies. Advances in virtualization technology have also led to new methods to attack and penetrate company networks. A simple pictorial representation of the different layers in a virtual environment, and some of the attacks at those layers, is given below.

[Figure: layers of a virtual environment and the attacks targeting each layer]
The most common type of attack on virtual environments is hyper-jacking. In this type of attack, the hypervisor itself is compromised and used by the attacker for harmful purposes.
The next type of attack is VM escape, which can pose a serious threat to VM security. Here the attacker's code breaks out of the VM's OS and interacts directly with the hypervisor. With this type of attack, attackers can discover other VMs and eventually take over the entire virtual environment.
VM poaching is similar to a denial-of-service attack. The attacker's aim is to overload the hypervisor, drain all its resources, and eventually make it non-functional.

To gain the maximum benefit from virtual environments, they should be monitored and managed well. Keeping virtual machine software patched, installing only the resource-sharing features that are really needed, and keeping software installations to a minimum are some of the steps VM administrators can follow to keep the environment safe from attacks.


Tuesday, February 2, 2010

Security and Compliance issues with Cloud Computing

Cloud computing has become the buzzword of the infosec world now. There are 'n' number of definitions for it, so I will not list them here. Topics discussed along with cloud computing relate to the advantages it brings to enterprises or the issues and challenges that come with it. Among others, security and compliance are hot topics discussed often.

Let us take the service models in the cloud and see whether there are security and compliance challenges there. The service models are SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service).
In SaaS, everything (infrastructure, network, servers, storage, application) is owned by the provider; the consumer may have limited user-specific permissions. Taking email SaaS as an example, the consumer needs just a web browser to access the service. The consumer has to trust the provider for the service being accessed, and a secure connection and encryption are the steps the provider must take to establish that trust. The next challenge is 'multi-tenancy', whereby the provider manages multiple instances of the service for different consumers. The provider is guided here by data protection, privacy, and retention related regulations and frameworks.
In PaaS, the consumer has control over the deployed application, which is developed using the provider's platform, and over some application hosting configurations. The trust and compliance issues of SaaS apply here too. The consumer is also responsible for ensuring secure inter-component communication within the deployed application.
In IaaS, the consumer can control fundamental computing resources and deploy software on them. It is certain that trust, multi-tenancy, encryption, and compliance are key concerns across all the service models.
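
One way a consumer can reduce the trust placed in a provider is to encrypt data on the client side before it ever reaches the cloud, so the provider stores only ciphertext. Here is a minimal Python sketch of that idea using the Fernet recipe from the third-party cryptography package; the key handling shown (a key held locally by the consumer) is deliberately simplified and is my own illustrative assumption.

from cryptography.fernet import Fernet

# The key stays with the consumer; the provider never sees it.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"customer: alice, account: 12345"
ciphertext = fernet.encrypt(record)

# Only the ciphertext is uploaded to the cloud service.
# On retrieval, the consumer decrypts locally with the same key.
assert fernet.decrypt(ciphertext) == record

The trade-off is that key management (rotation, escrow, loss) becomes the consumer's problem, which is exactly the tension between trusting the provider and carrying the operational burden yourself.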

Next are the different deployment models in the cloud: a private cloud owned by an enterprise, a community cloud shared by a specific community, a public cloud available to the general public, and a hybrid cloud which is a composition of two or more clouds. Clearly, security requirements, policy, and compliance considerations increase as deployments move from the private to the hybrid model. Cloud providers are responsible for protecting data. Important laws like HIPAA and GLBA require organizations to safeguard data. Cross-border data transfer should also consider the EU Data Protection Directive or Safe Harbor, which require knowing, at a minimum, where the data is going to be and the implications. Data security laws like the Massachusetts law require providers, or any third party, to maintain security measures for personal information. Encryption is another requirement to be addressed by providers. Handling compliance here means meeting FISMA, HIPAA, SOX, and PCI requirements and SAS 70 audits on the provider side.

Organizations and governments have taken initiatives to address the security and compliance challenges in the cloud. It is evident that most cloud deployments require strong security controls. As there cannot be one cloud that fits all, many standards will emerge to guide providers and consumers in taking cloud computing to the next level.

Thursday, January 14, 2010

What Changed after CISSP

CISSP certification requires you to have knowledge across a wide range of security topics and, given a circumstance, to know how to apply them at work. I had been planning to take this exam for a very long time, under the assumption that becoming a CISSP would surely change a lot of things in my career. That turned out to be true: there is a drastic change in my approach and in how I apply the knowledge gained while preparing for the exam. You gain vast knowledge of the different security domains and consider various aspects of security while designing or implementing new solutions. You get used to reading so much for the exam that it becomes a habit to read more. Also, to keep the certification current, you keep yourself updated with the happenings in the industry and tend to read more articles, white papers, blogs, and posts, and to listen to webinars on technical, non-technical, or security related topics. The only constant is CHANGE, and that is evident for me after becoming a CISSP. A whole new approach at work and in my career.