RSA 2012 Day 3: Securing IPv6 and Moving to the Cloud

The first session today covered the basics of security in IPv6. IPv6 includes some features which provide additional security. Some of these were not actually designed into the protocol but exist simply because of the nature of the IPv6 address space; for example, brute-force scanning of IP addresses will no longer be practical in IPv6 just because of the sheer size of the address space you would need to scan. Of course, on the downside, this also makes vulnerability scanning impractical if it is based on scanning IP addresses. The same is true when using ULA addresses for internal private addressing (the equivalent of 10.0.0.0/8 in IPv4). Since the number of possible ULA prefixes is so great, each company can pick its own and there is virtually no chance of an overlap with another company. Worms will no longer be able to spread just by counting up through private addresses and infecting the next active device. Finally, designed-in features such as IPsec or Secure Neighbor Discovery do secure the protocol; however, since IPsec is no easier to manage in IPv6 than it is in IPv4, it does not provide any additional security over using IPsec in IPv4.
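To put the address-space argument in perspective, here is a quick back-of-the-envelope calculation. The probe rate is an assumption I picked for illustration; the subnet and prefix sizes come from the IPv6 and ULA (RFC 4193) specifications.

```python
import math

# A single IPv6 /64 subnet holds 2**64 possible host addresses.
subnet_size = 2 ** 64

# Assume an (optimistic) scanner probing one million addresses per second.
probes_per_second = 1_000_000
seconds = subnet_size / probes_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"Scanning one /64 at 1M probes/s: {years:,.0f} years")  # ~584,542 years

# ULA prefixes carry a 40-bit random global ID, so the chance that any
# two of n companies pick the same prefix is a birthday problem.
n = 1000
collision_p = 1 - math.exp(-n * (n - 1) / (2 * 2 ** 40))
print(f"Collision probability among {n} companies: {collision_p:.2e}")  # ~4.5e-07
```

Even with a thousand companies independently picking ULA prefixes, the odds of a single collision are below one in a million, which is why overlap is not a practical concern.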

Administrators should also actively implement certain controls to secure IPv6. Controlling the boundaries within which IPv6 extension headers (such as routing headers) are allowed to travel, suppressing rogue router advertisements using IPS and filtering at the layer 2 switches, and blocking tunnels (6to4, Teredo, etc.) from all but approved tunnel endpoints will help to secure an IPv6-based network.
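As a rough illustration of the rogue-RA control, the sketch below uses scapy to watch for router advertisements from unapproved sources. The allowlist address is a made-up example, and a real deployment would enforce this at the switch (RA Guard) rather than with a host script; this only shows the detection idea.

```python
# Minimal rogue-RA detector sketch (requires root and a live interface).
from scapy.all import sniff, IPv6, ICMPv6ND_RA

# Link-local addresses of routers allowed to send RAs (example value).
APPROVED_ROUTERS = {"fe80::1"}

def check_ra(pkt):
    # Router Advertisements are ICMPv6 type 134; scapy exposes them
    # as the ICMPv6ND_RA layer.
    if pkt.haslayer(ICMPv6ND_RA):
        src = pkt[IPv6].src
        if src not in APPROVED_ROUTERS:
            print(f"Rogue router advertisement from {src}")

# Capture only ICMPv6 traffic and check each packet as it arrives.
sniff(filter="icmp6", prn=check_ra, store=False)
```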

The final recommendation is to develop an IPv6 security policy which parallels your IPv4 policy. Everywhere a policy references IPv4 it should also make a statement about IPv6, and there should be additional IPv6-specific statements to cover the IPv6-specific features.

In the Cloud session, the CTO from NASA's Jet Propulsion Laboratory discussed how they use public and private clouds within their organization. They have developed a very interesting model of using both public and private clouds depending on the use case for the data and applications being implemented. The model allows users to define the requirements for their application in an online tool, which then presents the cloud-based services they are allowed to use based on security level, performance, availability, cost and other factors.
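The speaker did not go into implementation details, but conceptually the tool maps stated requirements onto a catalog of approved services. A minimal sketch of that matching logic, with all service names and attribute values invented for illustration:

```python
# Hypothetical catalog of cloud services and their assessed attributes.
CATALOG = [
    {"name": "public-iaas-a", "security_level": 1, "availability": 99.9, "cost": 1},
    {"name": "community-cloud-b", "security_level": 2, "availability": 99.95, "cost": 3},
    {"name": "private-cloud-c", "security_level": 3, "availability": 99.99, "cost": 5},
]

def allowed_services(min_security, min_availability, max_cost):
    """Return the services that satisfy the user's stated requirements."""
    return [
        s["name"]
        for s in CATALOG
        if s["security_level"] >= min_security
        and s["availability"] >= min_availability
        and s["cost"] <= max_cost
    ]

# A user with moderately sensitive data and a mid-range budget:
print(allowed_services(min_security=2, min_availability=99.9, max_cost=4))
# -> ['community-cloud-b']
```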

RSA 2012 Day 2: Firewalls and Cloud Computing

This is the second installment of my notes from the RSA 2012 Conference.

Today was dominated by keynote speakers in the morning and technical sessions in the afternoon.  I attended three sessions, two of which I summarize below.

In the first session, “Firewalls – Past, Present and Future,” a panel of three specialists from Juniper, Palo Alto Networks and NSS discussed the future of firewalls. The general consensus was that, despite the rumor that the firewall is dead, it is very much alive and moving into new spaces as disruptive technologies (cloud, mobile smartphones) are adopted. The firewall is now evolving to address a number of use cases: the classic use case of filtering incoming traffic from outside the perimeter (outside-in), filtering outgoing traffic from the internal network for such things as social networking traffic (inside-out), and filtering and protecting distributed public and private cloud-based services which can move without notice within and between clouds. In this regard one of the primary challenges is the overall management of this distributed firewall landscape. Not only do the policies have to follow the protected object (e.g. the server, data or application) wherever it goes, they also have to be applied consistently across the enterprise based on an overall security architecture. The tools being developed for managing such environments will move away from the classic single-vendor GUI toward an open, vendor-independent management console based on APIs.
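One way to picture “policies that follow the protected object” is to key rules on workload labels instead of IP addresses, so a rule stays valid when the workload moves between clouds. The sketch below is my own illustration of that idea, with all names and structures invented:

```python
# Rules are expressed over labels, not addresses, so they remain valid
# when a workload migrates and its IP address changes.
RULES = [
    {"src": "web-tier", "dst": "db-tier", "port": 5432, "action": "allow"},
]

# Current label -> address mapping, kept up to date by the orchestrator.
inventory = {"web-tier": {"10.0.1.5"}, "db-tier": {"10.0.2.9"}}

def is_allowed(src_ip, dst_ip, port):
    for rule in RULES:
        if (src_ip in inventory[rule["src"]]
                and dst_ip in inventory[rule["dst"]]
                and port == rule["port"]):
            return rule["action"] == "allow"
    return False  # default deny

print(is_allowed("10.0.1.5", "10.0.2.9", 5432))  # True

# After the database migrates, only the inventory changes -- not the rule.
inventory["db-tier"] = {"192.168.7.4"}
print(is_allowed("10.0.1.5", "192.168.7.4", 5432))  # True
```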

The second session, “Data Breaches in the Cloud,” was presented by two lawyers and focused on what enterprises should consider when planning a move to the cloud. The speakers discussed the points which should be included in a cloud contract and which areas of the service should be reviewed as part of the due diligence the customer should perform on the cloud service provider.

The contractual points which I thought were worth noting were:

  • Defining the level of access the customer will have for performing audits on the provider’s cloud infrastructure. What the customer may audit, what level of access they have, and when they can perform audits are examples of topics which should be defined.
  • Defining in advance what constitutes a security breach, and the time frames for notifying the customer of one. In some countries the notification timeframe is becoming regulated and could therefore vary from country to country.

Areas which the speakers recommended customers review as part of their due diligence are:

  • DR plans and other operational issues
  • Which certifications the provider holds (e.g. ISO 27001)
  • Which forensic providers are used and what access levels do they have to the data if another customer in a multi-tenant environment is breached
  • How are multi-tenant environments segmented

RSA 2012 Day 1: Surviving as a Security Leader

All this week I will be posting notes from the RSA Security Conference, taking place from 27 February to 3 March 2012.

Day one of the conference consisted of a half-day seminar on the topic of building a security organization. The five presentations were geared mainly towards CISOs, and each of the presenters is a current or former CISO. I attended this session hoping to hear what CISOs consider when setting up an organization, what their priorities are and what their vision of the future might be, so that I could understand what opportunities might be available to me as a solutions architect in supporting their efforts.

A main theme running through all of the presentations was the alignment of security policies, strategy and initiatives with the business. The CISO must take the time to understand the risk profile and risk tolerance of the organization and be able to propose policies, strategies and initiatives that are in line with them. Just following regulatory requirements or implementing so-called “best practices” will not cut it. Regulators are looking for this understanding of what the risks are and want to see initiatives and policies in place to mitigate them.

Associated with this is an understanding of the organization’s appetite for change. Proposing too much at one time could overwhelm stakeholders, and you may get nothing approved. It is better to prioritize and propose actions in small doses, building confidence and credibility along the way.

One of the key factors in this alignment, and a topic in each of the presentations, is the use of security metrics and reporting. Done right, reporting on relevant metrics can go a long way towards showing the value of the security program and its progress over time. Reports should be tiered to show relevant data to each type of stakeholder, from the Board down to the operational teams; a one-size-fits-all report won’t work. Above all, they should add business value and be explained in a language appropriate to the audience. Of course, Andrew Jaquith’s book “Security Metrics: Replacing Fear, Uncertainty, and Doubt” was referenced a number of times, and he will be part of a panel discussion later in the week. The book is a must-read (yes, I own it and have read it 🙂 ) for anyone planning security reporting or security presentations for customers.
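As a toy illustration of tiered reporting, the same underlying data can be rolled up differently per audience. The metrics and structure here are invented for illustration, not anything the presenters showed:

```python
# Hypothetical raw incident data collected by the security team.
incidents = [
    {"severity": "high", "days_to_resolve": 2, "system": "crm"},
    {"severity": "low", "days_to_resolve": 1, "system": "mail"},
    {"severity": "high", "days_to_resolve": 5, "system": "erp"},
]

def board_view(data):
    """One headline number, framed in business terms, for the Board."""
    high = [i for i in data if i["severity"] == "high"]
    avg = sum(i["days_to_resolve"] for i in high) / len(high)
    return (f"{len(high)} high-severity incidents this quarter, "
            f"resolved in {avg:.1f} days on average")

def ops_view(data):
    """Full per-incident detail, worst first, for the operational teams."""
    return sorted(data, key=lambda i: i["days_to_resolve"], reverse=True)

print(board_view(incidents))
print(ops_view(incidents))
```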

When asked which parts of the security organization should be outsourced and which should not, the panel agreed that oversight, governance and strategy should be kept in-house. Routine operational tasks, and tasks where expertise is either unavailable or too expensive to keep in-house, should be outsourced; some examples are IDS/IPS monitoring, source code analysis and penetration testing. Interestingly, SIEM and desktop security got mixed answers, since the panel was not convinced that an outsourcer would have a good enough understanding of the risk profile of the customer.

All in all, this was a good session to start the week off. Given the number of technology companies on display this week, it set the right tone, even if there were no earth-shattering revelations.