
RSA 2012 Day 2: Firewalls and Cloud Computing

This is the second installment of my notes from the RSA 2012 Conference.

Today was dominated by keynote speakers in the morning and technical sessions in the afternoon.  I attended three sessions, two of which I summarize below.

In the first session, “Firewalls – Past, Present and Future,” a panel of three specialists from Juniper, Palo Alto Networks and NSS discussed the future of firewalls.  The general consensus was that, despite the rumor that the firewall is dead, it is very much alive and moving into new spaces as disruptive technologies (cloud, mobile smartphones) are adopted.  The firewall is now evolving to address a number of use cases: the classic case of filtering incoming traffic from outside the perimeter (outside-in), filtering outgoing traffic from the internal network for things such as social networking (inside-out), and filtering and protecting distributed public and private cloud-based services which can move without notice within and between clouds.  One of the primary challenges here is the overall management of this distributed firewall landscape.  Not only must the policies follow the protected object (e.g. the server, data or application) wherever it goes, they must also be applied consistently across the enterprise based on an overall security architecture.  The tools being developed for managing such environments will move away from the classic single-vendor GUI towards open, vendor-independent management consoles based on APIs.

The second session, “Data Breaches in the Cloud,” was presented by two lawyers and focused on what enterprises should consider when planning a move to the cloud.  The speakers discussed the points which should be included in a cloud contract and the areas of the service which should be reviewed as part of the due diligence a customer should perform on a cloud service provider.

The contractual points which I thought were worth noting were:

  • Defining the level of access the customer would have for performing audits on the provider’s cloud infrastructure.  What the customer may audit, what level of access they have and when they can perform the audits are some examples of topics which should be defined.
  • Defining in advance what constitutes a security breach and what the time frames for notification of a breach are.  In some countries the notification timeframe is becoming regulated and can therefore vary from country to country.

Areas which the speakers recommended customers review as part of their due diligence are:

  • DR plans and other operational issues
  • What certifications the provider holds (e.g. ISO 27001)
  • Which forensic providers are used and what level of access they have to the data if another customer in a multi-tenant environment is breached
  • How multi-tenant environments are segmented

RSA 2012 Day 1: Surviving as a Security Leader

All this week I will be posting notes from the RSA Security Conference, taking place from 27 February to 3 March 2012.

Day one of the conference consisted of a half-day seminar on the topic of building a security organization.  The five presentations were geared mainly towards CISOs, and each of the presenters is currently, or has been, a CISO.  I attended this session hoping to hear what CISOs consider when setting up an organization, what their priorities are and what their vision of the future might be, so that I could hopefully understand what opportunities might be available to me as a solutions architect supporting their efforts.

A main theme running through all of the presentations was the alignment of security policies, strategy and initiatives with the business.  The CISO must take the time to understand the risk profile and risk tolerance of the organization and be able to propose policies, strategies and initiatives that are in line with them.  Just following regulatory requirements or implementing so-called “best practices” will not cut it.  Regulators are looking for this understanding of what the risks are and want to see initiatives and policies in place to mitigate them.

Associated with this is an understanding of the appetite for change within an organization.  Proposing too much at one time could overwhelm stakeholders, and you may get nothing approved.  It is better to prioritize and propose actions in small doses, building confidence and credibility along the way.

One of the key factors in this alignment, and also a topic in each of the presentations, is the use of security metrics and reporting.  Done right, reporting on relevant metrics can go a long way towards showing the value of the security program and its progress over time.  Reports should be tiered to show relevant data to each type of stakeholder, from the Board down to the operational teams; a one-size-fits-all report won’t work.  Above all, they should add business value and be explained in a language appropriate to the audience.  Of course, Andrew Jaquith’s book “Security Metrics: Replacing Fear, Uncertainty, and Doubt” was referenced a number of times, and he will be part of a panel discussion later in the week.  The book is a must-read (yes, I own it and have read it 🙂 ) for anyone planning security reporting or security presentations for customers.

When asked which parts of the security organization should be outsourced and which should not, the panel agreed that oversight, governance and strategy should be kept in-house.  Routine operational tasks and tasks where expertise is either unavailable or too expensive to keep in-house should be outsourced; some examples are IDS/IPS monitoring, source code analysis and penetration testing.  Interestingly, SIEM and desktop security got mixed answers, since the panel was not convinced that an outsourcer would have a good enough understanding of the customer’s risk profile.

All in all, this was a good session to start the week off.  Given the number of technology companies which will be on display this week, it sets the right tone, even if there were no earth-shattering revelations.

Juniper and a general NAC architecture

As the final NAC vendor, I decided to look at Juniper.  At the end of 2011 Gartner posted their new Magic Quadrant report for NAC, which Juniper (who of course is in the Leaders quadrant) kindly published for the general public (go to www.juniper.net if you would like a copy).  Surprisingly, Cisco was also up in the Leaders quadrant.  Curiously, HP was not included in the report even though they seem to have a fairly well-rounded solution.  What I found most interesting about the report is the point that BYOD will be a driving force which may actually bring this wave of NAC products into the mainstream, something which, to date, has not happened.

In looking at the Juniper model and comparing it to the other two, it would seem that they have all converged on the same general architecture, even if the underlying protocols or implementations differ.  The architecture is illustrated in the following diagram:

NAC Architecture
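
To make the roles in that diagram concrete, here is a minimal Python sketch of the general flow all three vendors seem to share: an endpoint authenticates through the access switch, a central policy server evaluates identity and posture, and the enforcement point applies the resulting decision.  The class names, VLAN numbers and posture rules below are my own illustrative assumptions, not any vendor’s actual API.

```python
# Sketch of the general NAC admission flow: endpoint -> switch (enforcement
# point) -> central policy server -> access decision pushed back to the switch.
# All names and rules are illustrative, not taken from any vendor's product.

from dataclasses import dataclass


@dataclass
class Endpoint:
    user: str
    device_type: str      # e.g. "corporate-laptop", "byod-phone"
    av_up_to_date: bool    # posture attribute reported by the NAC client


@dataclass
class AccessDecision:
    vlan: int              # VLAN the switch should place the port into
    acl: str               # named policy applied by the enforcement point


class PolicyServer:
    """Plays the central-control role (the IC appliance in Juniper's model)."""

    def authorize(self, endpoint: Endpoint) -> AccessDecision:
        # Identity would normally be checked against AD/RADIUS; assumed OK here.
        # Posture check: quarantine endpoints that fail health requirements.
        if not endpoint.av_up_to_date:
            return AccessDecision(vlan=666, acl="quarantine-remediation-only")
        # Role-based decision: corporate devices get more access than BYOD.
        if endpoint.device_type == "corporate-laptop":
            return AccessDecision(vlan=10, acl="permit-corp-resources")
        return AccessDecision(vlan=20, acl="internet-and-webmail-only")


class AccessSwitch:
    """Layer 2 enforcement point: applies whatever the policy server returns."""

    def __init__(self, policy_server: PolicyServer):
        self.policy_server = policy_server

    def port_connect(self, endpoint: Endpoint) -> AccessDecision:
        decision = self.policy_server.authorize(endpoint)
        print(f"{endpoint.user}: VLAN {decision.vlan}, policy '{decision.acl}'")
        return decision


if __name__ == "__main__":
    switch = AccessSwitch(PolicyServer())
    switch.port_connect(Endpoint("alice", "corporate-laptop", av_up_to_date=True))
    switch.port_connect(Endpoint("bob", "byod-phone", av_up_to_date=False))
```

The point of the sketch is simply that the policy decision lives in one central place, while the switches and firewalls only enforce what they are told.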

The central control system from Juniper is the IC Series Unified Access Control appliance.  These devices interact with layer 2 switches and AD to provide the 802.1X (dot1x) authentication, and with Juniper security devices (SSL VPN, firewalls) to provide the egress enforcement of the policies.  From the literature, however, it looks like the system is not as dynamic as Cisco’s ISE.  Policies are dynamically loaded onto the firewalls for egress filtering, but each policy is statically configured with the user’s IP address; even though it is loaded dynamically, it needs to be set up in advance.  Cisco, by contrast, applies SGTs to the packets, which decouples the policy from the IP address.
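
To illustrate that difference, here is a small sketch (plain Python, not vendor configuration) contrasting an egress policy keyed on a pre-provisioned IP address with one keyed on a tag assigned at authentication time.  The addresses, tags and rule names are made up for illustration only.

```python
# IP-bound policy: an entry must exist for the user's address before login,
# and must be re-provisioned whenever the user shows up with a new address.
ip_policy = {
    "10.1.1.25": ["allow finance-servers"],
}

# Tag-bound policy: keyed on a group tag applied when the user authenticates,
# so the decision follows the user regardless of their current IP address.
tag_policy = {
    "finance-user": ["allow finance-servers"],
}


def egress_rules_by_ip(src_ip: str):
    # Unknown address -> no rules, even for an authorized user on a new IP.
    return ip_policy.get(src_ip, ["deny all"])


def egress_rules_by_tag(tag: str):
    # The user's current IP is irrelevant; only the tag matters.
    return tag_policy.get(tag, ["deny all"])


print(egress_rules_by_ip("10.1.1.25"))      # ['allow finance-servers']
print(egress_rules_by_ip("10.1.7.99"))      # ['deny all']  (same user, new IP)
print(egress_rules_by_tag("finance-user"))  # ['allow finance-servers']
```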

Juniper does seem to support multiple NAC clients, including Microsoft Statements of Health.  The Junos Pulse client is available for Windows, with other dynamic clients available for Linux and Macs.  There is a Pulse client for mobile devices (iOS, Android), but unfortunately this client does not support NAC for internal access.  It is, however, an interesting product for mobile device management and remote access VPN to an enterprise’s network.  The MDM solution is a SaaS service provided by Juniper.

For a single management console, Juniper has its NSM (Network and Security Manager).  This console allows the admin to manage the IC appliances from the same central location where switches and other devices are managed.