

Friday, 25 September 2015

From Blueline to BlueZone - PCI Tokenization Matures

Last year, I wrote about a new Canadian company that had entered the Compliance Appliance market space.  Blueline Data had developed a tokenization gateway to help you define and isolate your PCI compliance scope boundary.  This isolation covered not only Point of Sale and web merchant (shopping) portals, but telephony and Unified Communications traffic as well!  This was a revolutionary step in the industry. Several other companies had tokenization systems available for structured and/or unstructured data; however, no one had a viable solution that also covered voice and unified communications.

A lot has gone on in the past year, and I decided to revisit them, to see where their technology has progressed...


Last year, Forrester issued a paper defining the requirements necessary to secure data into the future, and discussing the technologies that will get us there. The document, titled "TechRadar™: Data Security, Q2 2014", states clearly that you need to:

  • Restrict and strictly enforce access control to data. This includes denying access to unauthorized persons or blocking their attempts to gain access.
  • Monitor and identify abnormal patterns of network or user behavior. This includes tools that analyze traffic patterns and/or monitor user behavior to detect suspicious anomalies (e.g., improper or excessive use of entitlements such as bulk downloads of sensitive customer information).
  • Block exfiltration of sensitive data. These are tools or features of tools that detect, and optionally prevent, violations to policies regarding the use, storage, and transmission of sensitive data.
  • Render successful theft of data harmless. Once you’ve identified your most sensitive data, the best way to protect it is to “kill” it. “Killing” data through encryption, tokenization, and other means renders the data unreadable and useless to would-be cybercriminals who want to sell it on the underground market.

The first three have been the bread and butter of the Information Security industry for the past 20 years or so.  From firewalls and both signature- and heuristics-based Intrusion Detection/Prevention, to Data Loss Prevention systems, the industry has been diligently protecting our perimeters.

It's that fourth one that I'm interested in here.  "Render successful theft of data harmless."  In other words, replace any valuable data such as Payment Card Info, Personal Health Info, Social Insurance Numbers, etc. with a "token" that has no value to would-be thieves. These tokens can be made to preserve the format requirements of the original data, so as not to break backend processing, while still supporting search and indexing.

To properly provide security through tokenization, one must be able to implement it not only on the server side for data at rest, but also for data in transit, as well as at the client side, such that the relevant sensitive data never even leaves the client's network.
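As a concrete illustration of the classic vault-based approach, here is a minimal Python sketch (a toy, not any vendor's actual implementation): sensitive values are swapped for random, same-length tokens, and only the vault can map back.

```python
import secrets

class TokenVault:
    """Toy vault-based tokenizer: maps sensitive values to random,
    format-preserving tokens. Real products keep this mapping in a
    hardened, audited cryptographic vault, not an in-memory dict."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:       # idempotent: same value, same token
            return self._value_to_token[value]
        token = "".join(secrets.choice("0123456789") for _ in value)
        while token in self._token_to_value or token == value:  # avoid collisions
            token = "".join(secrets.choice("0123456789") for _ in value)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
pan = "4111111111111111"
token = vault.tokenize(pan)
assert len(token) == len(pan)          # format (length) preserved
assert vault.detokenize(token) == pan  # reversible only via the vault
```

The key property: a breach of any system holding only tokens yields nothing of value, because the token-to-data mapping lives solely in the vault.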

What if there were a service... APIs that could provide tokenization either at the client browser, or as data is passed to cloud apps?

I know that I'm not the first to follow this train of thought, but the cost of non-compliance is growing rapidly.
Financial Damage can be insured against... Reputational damage cannot.

As I said... a lot has gone on in the past year.  Blueline has matured from just providing on-premise gateway appliances, to hosting Compliance Services in the cloud.  

Blueline is about to introduce several hosting options.  You can still get on-premise control if that is what you desire, but that has been augmented with  co-located gateway services as well as true Cloud based "Compliance as a Service"  Tokenization/Encryption through APIs. 

Another move that Blueline has made is to provide "Diskless Tokenization".  Typically, tokenization services keep a very secure database in a cryptographic vault.  This database includes a table of sensitive-data-to-token pairs used to index and manage the tokens.  Across the industry, customers have expressed concern over having this database at all, even though it is protected in a vault.  Complaints ranging from residual risk to database latency in very large token-pair tables (tens or hundreds of millions of pairs) have driven demand for an alternative solution.

Blueline has introduced a diskless solution that creates a "derived" token using a one-time pad, without the need for the data/token pairs to be stored. These derived tokens can be recalculated from a secret value that does not need to be stored in a database.
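Blueline's exact derivation scheme isn't public, but the general idea of a derived token can be sketched with a keyed one-way function: as long as the secret is held (for example, in an HSM), any node can recompute the token on demand, and no pan-to-token table ever exists on disk.

```python
import hashlib
import hmac

# Hypothetical key material -- in practice this would live in an HSM,
# never in source code.
SECRET = b"per-merchant secret held in an HSM"

def derive_token(pan: str, length: int = 16) -> str:
    """Derive a deterministic numeric token from the PAN and a secret.
    No pan->token table is stored; the same input always yields the
    same token, so matching and indexing can be recomputed on the fly."""
    digest = hmac.new(SECRET, pan.encode(), hashlib.sha256).digest()
    digits = "".join(str(b % 10) for b in digest)
    return digits[:length]

t1 = derive_token("4111111111111111")
t2 = derive_token("4111111111111111")
assert t1 == t2                    # deterministic: no token database needed
assert t1 != "4111111111111111"    # token reveals nothing about the PAN
```

Note that an HMAC-style derivation is one-way: it supports matching and indexing without storage, but recovering the original value still requires either a vault lookup or a reversible scheme such as format-preserving encryption.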

Blueline has created two new offerings:

bluegrid™ is a turnkey solution for "Compliance in a Box".  It is a standard 19" cabinet consisting of a series of redundant "bluenodes™" that provide the various security and compliance services required for a self-contained Compliance DMZ. It can be installed in your own data center, or hosted externally for you.  Applying the "Zero Trust" model, bluegrid™ encapsulates your sensitive application environment and provides a full security stack to protect that environment, including firewall, IPS, authentication store, tokenization, encryption, logging, and storage.

A standard bluegrid™ rack would consist of a mix of the following bluenode™ appliances:

bluenode tx - Traffic Manager (zero-impact deployment)
bluenode dx - Data Gateway (financial network integration)
bluenode cx - Cyber Vault (diskless tokenization, encryption)
bluenode ix - Identity Manager (device and service access)
bluenode ex - Event Manager (logging and event analytics)
bluenode sx - Storage Block (low-latency shared storage)

bluegrid™ can centralize and limit most of your PCI compliance scope to a single rack in the data center. (Point-of-Sale systems excluded)

bluezone™ takes this one step further, providing a cloud-based security infrastructure.  It leverages APIs to isolate sensitive data outside of your IT environment, enabling secure processing of financial or other confidential data, and exposes the following security services:
  • Tokenization – replacement of the original sensitive data with a risk-free replica for secure transmission, processing or storage
  • Encryption – military-grade cryptographic protection of digital content
  • Key Management – cryptographic key storage and lifecycle control
  • Payment Gateway – secure real-time and offline merchant acquirer processing of tokenized e-commerce and m-commerce transactions
  • Credit Scoring – secure personal or commercial credit check against a credit bureau, reference agency or central bank
  • Address Verification – secure cardholder address validation
  • Issuer Reconciliation – transaction batch transfer to issuer bank
  • Digital Wallet – secure checkout for merchant commerce sites and mobile applications with the e-wallet payment method
bluezone™ can effectively remove most of your PCI compliance scope from your environment altogether. (Point-of-Sale systems excluded)
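To make the "Compliance as a Service through APIs" idea concrete, here is a hedged sketch of what a client-side call to such a tokenization API might look like. The endpoint, field names, and response shape are all hypothetical; the real bluezone™ API surface is not documented in this post.

```python
import json
import urllib.request

# Hypothetical endpoint and payload -- illustrative only.
req = urllib.request.Request(
    "https://api.example-compliance-cloud.com/v1/tokenize",
    data=json.dumps({"field": "pan", "value": "4111111111111111"}).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <api-key>",
    },
    method="POST",
)

# urllib.request.urlopen(req) would then return something like
# {"token": "7316559143021457"} -- the sensitive value never needs
# to be stored anywhere in the caller's environment.
assert req.get_method() == "POST"
assert json.loads(req.data)["field"] == "pan"
```

The point of the pattern: the merchant application holds the PAN only transiently, in memory, long enough to exchange it for a token.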

The Forrester TechRadar report on Data Security, Q2 2014, clearly shows tokenization having "Significant Success" in securing sensitive data.


Friday, 8 May 2015

Test Driving The Aegis Secure Key 3.0

I just received a new item across my desk, and was so excited I had to share!

The Apricorn Aegis Secure Key 3.0 is a high capacity hardware encrypted USB 3.0 flash drive with up to 240GB in Storage Capacity.

The one I received, an ASK-30GB, is.. well.. 30GB in capacity.
The first thing I noticed about this impressive device is the crush-resistant extruded black aluminum case.  Rubber seals provide dust and water resistance. The buttons on the front have a very good, high-quality tactile feel. A comfortable aluminum sleeve closes over the keypad with the aforementioned rubber seals. There is also a nice comfortable weight to it.  Not too heavy...
More like "This feels like a tool, not a toy" heavy.

Now, there *is* a very slight learning curve to getting it up and running, as you have to train two separate 7-16 digit PINs: one Administrator and one User PIN. As a corporate tool, this is very much a requirement.  If the user loses or forgets their PIN, we can still retrieve the secured contents. Once completed, daily use just requires your User PIN.
This is a true hardware-encryption (256-bit AES XTS) based USB media key.  What this means is that no specific drivers are required for your Operating System to share encrypted files. Apricorn is currently awaiting FIPS 140-2 Level 3 certification, expected in Q2 of this year.

Once unlocked via the keypad, the device shows up as a standard USB media drive.   I was able to read/write files easily between Windows 7, my Ubuntu Laptop, my OSX machine, as well as a Raspberry Pi, and an embedded microcontroller board I'm working on.  Serious compatibility across the board. 

Data transfer was fast.  I did not measure it, but it was quicker than many of the "normal" USB 3.0 flash drives I have on hand.

Specifications according to Apricorn:

• 256-Bit AES XTS Hardware Encryption
• Software-Free Design
• Cross-Platform Compatible
• Embedded Authentication
• No Authentication Info Shared with Host
• Two Read-Only Modes
• Programmable Brute Force Protection
• Separate Admin and User Modes
• Lock-Override Option
• Forced Enrollment
• 3-Year Limited Warranty
• FIPS 140-2 Level 3 (Pending Q2)
• IP-58 Certified: Dust and Water Resistant

Having come from using a few other software-based "Secure Flash" keys, this device is a godsend. The software-based keys typically have to store multiple binaries on an application partition to support the popular Operating Systems. (Windows and OSX are usually included, and, more frequently now, Linux binaries are available.)  Running the appropriate binary unlocks the remainder of the drive once authenticated.

I highly recommend this Aegis Secure Key 3.0 anywhere you require sensitive data to be securely stored and transferred between machines. 


Tuesday, 28 April 2015

Understanding Cloud Access Security Broker Services

Over the past 30 years, we in the IT Security community have been promoting and building a "Defence in Depth" strategy to protect our corporate assets.

This methodology was predicated on the fact that we needed to assure our employees, customers, and shareholders that we could provide adequate Confidentiality, Integrity, and Availability (the CIA Triad) for the sensitive data and intellectual property residing in physical data centers.

We have installed Firewalls, Intrusion Prevention, AntiMalware, Data Loss Prevention, Secure Email, VPNs, etc... all with the intent of providing a stack of security capabilities to protect data within our corporate network.  Within our corporate data centers.

Simultaneously, our lines of business are becoming more agile, more complex, and more attuned to services available "in the cloud". Shadow IT is the new trend.  Lines of Business can and are spinning up new services at an aggressive rate to keep up with their online competition. Our ability to manage them technically, as opposed to by policy, has been almost non-existent.

We as Security Experts are scrambling to augment our "bricks and mortar" based Defence in Depth strategy with Cloud Services, but the path is not presently clear.

Very recently, a niche market has developed to fill this void. Several vendors identifying themselves as Cloud Access Security Brokers (CASBs) have defined a strategy to mitigate this problem.  CASBs are on-premise or cloud-based (or both) security policy enforcement points. Placed between your end users and the various cloud service providers, they can inspect traffic, manage and enforce policy, alert on anomalous behavior, and in most cases provide some level of DLP enforcement.

Either leveraging existing Single Sign-On providers or corporate Active Directory services, these Cloud Access Security Brokers can identify individuals' access into Cloud Service Providers that are affiliated with the broker. Currently these number in the hundreds, if not thousands. For "Sanctioned" cloud applications (those services your enterprise has procured directly), end-user access can be strictly enforced by context:
  • Who you are (Role based access)
  • Where you are coming from (corporate network, public Internet, wifi, geographic region)
  • What device you are using (Corporate laptop, Home PC, Tablet or phone)
  • What time of day you're working (Are you authorised to work during this time?)
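A context-aware access decision like the one described above can be sketched as a simple policy check. The policy fields and values here are illustrative, not any specific CASB's configuration schema:

```python
from datetime import time

# Hypothetical policy covering the four context dimensions:
# role, network origin, device type, and working hours.
POLICY = {
    "roles": {"finance", "it"},
    "networks": {"corporate", "vpn"},
    "devices": {"corporate-laptop"},
    "hours": (time(7, 0), time(19, 0)),
}

def allow_access(role: str, network: str, device: str, now: time) -> bool:
    """Return True only when every contextual condition is satisfied."""
    start, end = POLICY["hours"]
    return (role in POLICY["roles"]
            and network in POLICY["networks"]
            and device in POLICY["devices"]
            and start <= now <= end)

assert allow_access("finance", "corporate", "corporate-laptop", time(9, 30))
assert not allow_access("finance", "public-wifi", "corporate-laptop", time(9, 30))
assert not allow_access("finance", "corporate", "corporate-laptop", time(23, 0))
```

Real CASBs layer risk scoring and step-up authentication on top of checks like these, but the core idea is the same: deny unless every contextual condition holds.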

This Context Awareness also allows the CASB providers to employ heuristic analysis on Cloud bound traffic, to do some form of anomaly detection to identify malicious or erroneous traffic.  This is an area that they are all investing heavily in today.
Most of the Cloud Access Security Brokers provide granular encryption, but only three provide tokenization of your corporate data in the cloud. This can be as coarse as entire records or documents, or as fine-grained as a field in a form.  Adallom has also leveraged the Rights Management functionality of Check Point's Capsule to secure data in the cloud, while allowing trusted collaboration.

For more on Tokenization vs encryption, please see my articles: Tokenization as a companion to Encryption and Toronto based PCI Compliance upstart Blueline brings holistic solution to Voice-Web-POS

One of the strengths of some of the Cloud Access Security Brokers is the ability to identify and report on employee access to "Shadow IT" cloud services.  "Shadow IT" describes services that the corporation has not subscribed to as a whole, or has not specifically provisioned for the user in question.  These typically include cloud storage facilities like Box or Dropbox.   Again, if the CASB has an affiliation with the cloud service provider, these can be managed by policy; otherwise they can be flagged and alerted on to your security operations team for manual remediation.

Several of these CASBs provide on-premise inspection and policy gateways to augment your corporate network controls and provide definitive logical access control to the cloud services from within the corporate network.  These on-premise gateways complement the cloud based CASB services and provide for a hybrid view of data movement.

Since their emergence in 2012, CASBs have grown in importance and today are the primary technical means of giving organizations more control over SaaS security. This technology will become an essential component of SaaS deployments by 2017.
 By 2016, 25% of enterprises will secure access to cloud-based services using a CASB platform, up from less than 1% in 2012, reducing the cost of securing access by 30%.

- Gartner, The Growing Importance of Cloud Access Security Brokers

Gartner has defined the four pillars of CASB as:
 Visibility, Data Security, Compliance and Threat Prevention.

 As of this time, there are about twelve companies playing in this space. I would like to highlight the leaders at the moment. 

(In alphabetical order, and in their own words. ie: pilfered from their websites.)

Adallom delivers an extensible platform to secure and govern cloud applications. In addition to discovering almost 13,000 cloud services in use, Adallom offers comprehensive controls for data sharing, data security, DLP, eDiscovery and access control. The Adallom platform also integrates with existing on-premises solutions such as SIEMs, MDMs, NACs and DLPs. Adallom has identified new malware attacks in the wild, including a Zeus variant attacking Salesforce, and an identity token hijacking vulnerability affecting Office 365. On April 21st, Adallom announced an HP partnership where its platform will be resold on the HP price list, and offered with the HP Enterprise Security Products and Enterprise Security Services portfolio.

Bitglass, the Total Data Protection company, is a Cloud Access Security Broker, founded in 2013, that delivers innovative technologies that transcend the network perimeter to deliver total data protection for the enterprise - in the cloud, on mobile devices and anywhere on the internet.  Bitglass delivers the security, visibility, and control that IT needs to enable mobile and cloud in the workplace, while respecting user privacy.

CipherCloud is a cloud security software suite that encrypts data during the upload process, and decrypts during download. The encryption keys used for this process remain within your business network; thus, unauthorized users accessing data in the cloud will only see indecipherable text.
CipherCloud also comes with built-in malware detection and data loss prevention. There are specific builds for commonly used cloud applications such as Salesforce, Office 365, Gmail and Box, as well as a variant that can be configured to work with any cloud-based applications your business uses.

Netskope is a leader in cloud app analytics and policy enforcement. Netskope aims to eliminate the catch-22 between being agile and being secure and compliant by providing visibility, enforcing sophisticated policies, and protecting data in cloud apps.  
Netskope is a service that discovers and monitors cloud apps and shadow IT used on your network. Netskope monitors users, sessions, shared and downloaded content as well as the shared content details, and provides detailed analytics based on this information.

Perspecsys' AppProtex Cloud Data Protection Platform provides a flexible cloud data control platform that enables organizations to identify and monitor cloud usage and then encrypt or tokenize data that it does not want to put in the cloud “in the clear”.  The Platform intercepts sensitive data while it is still on-premise and replaces it with a random tokenized or encrypted value, rendering it meaningless should anyone outside of the company access the data while it is being processed or stored in the cloud.

Skyhigh Networks enables organizations to adopt cloud services with appropriate security, compliance, and governance. Skyhigh supports the entire cloud adoption lifecycle, providing unparalleled visibility, analytics, and policy-based control. Specifically, Skyhigh shines a light on Shadow IT by giving a comprehensive view into an organization’s use and risk of all cloud services. Skyhigh analyzes the use of all cloud services to identify anomalous behavior indicative of security breaches, compromised accounts or insider threats. Finally, Skyhigh enforces the organization's policies on the use of over 12,000 cloud services by providing contextual access control, structured and unstructured data encryption and tokenization, data loss prevention, and detailed cloud activity monitoring for forensic and compliance purposes.

Zscaler is leading two fundamental transformations in the world of IT security. First—the shift from on-premise hardware appliances and software to Security as a Service. Second—the transition from point security solutions to broad unified security and compliance platforms. Both transformations exactly parallel what has happened in every other sector of information technology—CRM, ERP, HR, eCommerce, and personal productivity—all have evolved from on-premises point applications to comprehensive cloud—based platforms. 

While conducting this review of the CASB market, I looked at a number of Security Controls that I would expect a mature Access Broker to provide. I've laid this out in accordance with Gartner's four pillars: 
 Visibility, Data Security, Compliance and Threat Prevention.
If you think I have omitted your favorite Cloud Access Security Broker, or have mis-represented a control above, please have them forward details to me including their position on each of the items in the above controls list.  After validating each, I will gladly amend the list.

Although the CASB market space is still in its infancy, the main players have done a good job defining - and meeting - most of the requirements of an off-premise security service.
I'm interested to see what happens to this space over the next three years.   My money is on convergence of CASB, SSO, and Mobile Security providers.

Also Read: 

Standing at the Crossroads: Employee Use of Cloud Storage.


Gartner: The Growing Importance of Cloud Access Security Brokers
Gartner: Emerging Technology Analysis: Cloud Access Security Brokers
Bitglass: The Definitive Guide to Cloud Access Security Brokers
CipherCloud looks to stay at the head of the cloud security class 
Ciphercloud: 10 Minute Guide to Cloud Encryption Gateways
Ciphercloud: Cloud Adoption & Risk Report in North America & Europe – 2014 Trends

NetworkWorld: How the cloud is changing the security game
Adallom: The Case For A Cloud Access Security Broker
Adallom: Cloud Risk Report Nov 2014
Check Point Capsule and Adallom Integration 
HP - Adallom: Proven Cloud Access Security Protection Platform 
Adallom : to Offer Comprehensive Cloud Security Solution for Businesses With HP 
PingOne - Skyhigh: PingOne & Skyhigh Cloud Security Manager
ManagedMethods: Role of Enterprise Cloud Access Security Broker
Cloud Computing: Security Threats and Tools 
SC Magazine: Most cloud applications in use are not sanctioned  

Monday, 27 April 2015

What's the difference between a Virtual Machine and a Container?

With the current trend towards "Containers" as opposed to "Virtual Machines", I've had a few people asking what the difference was, and where you might use one over the other.

I hope to keep this brief, but... 

Both Containers and Virtual Machines have been around for quite some time.  Mainframe and commercial UNIX systems have had concepts like the LPAR (Logical Partition, representing VMs) and WPAR (Workload Partition, representing containers) for over a decade (Mainframe since 1972!!!).

UNIX/Linux have used "chroot" filesystems (otherwise known as "chroot jail")  for years to secure running processes such as a web server or database server. The earliest implementation of "containers" was the 1979 introduction of chroot into UNIX Version 7.

Currently chroot is a part of just about every major distribution of Linux.

In very high-level terms, a Virtual Machine runs under a hypervisor (such as VMware, Hyper-V, KVM, VirtualBox, or Xen) that is designed to emulate an entire physical computer, including the various hardware abstractions required for networking, video, audio, etc...

In a word, VMs are FAT!

A container, on the other hand (Docker, Parallels, CoreOS, chroot, ...), runs on top of an existing kernel, leveraging resources from the kernel, and merely presents a virtual userspace with separate filesystem, CPU, memory, and protected processes.

Without having to emulate the underlying hardware, you can pack three to four times as many containers as virtual machines into the same resource pool.

So why would I use Virtual Machines, if Containers are just as good?  

Well, because a Virtual Machine abstracts the ENTIRE hardware platform, it is generally better suited to enforcing defined network segregation.

You could, for instance, define a Virtual Machine to represent your web application in its entirety, then within that VM, create containers for the web, app, and database tiers.  The containers would provide logical segregation between the tiers, and the VM would protect the entire application from other apps in the DMZ.

Virtual Machines also allow you to run completely different Operating Systems simultaneously on the same hardware.  For instance, on your Ubuntu laptop, you could use VirtualBox to simultaneously run Windows 8.1 and OSX.

Or, on your server, you could simultaneously run Redhat Linux, Windows Server 2008, and Windows Server 2012.   

A containerized system, as mentioned above, runs all containers off of the same Operating System Kernel.

And by far the biggest benefit of Containers over Virtual Machines is speed of launch. A Virtual Machine is, for all intents and purposes, a complete computer Operating System.  On boot, it has to run through all of the legacy boot processes... 

A Container launches on an already running kernel.  A full containerized application can launch in a fraction of a second (restricted only by I/O) whereas that same app launched within a Hypervisor context could be from tens of seconds to potentially a minute or more depending on boot requirements.

Edit: (04/28/2015)

Bromium is a newcomer to the virtualization space, and one to watch carefully.  Based on a fork of the Xen hypervisor, Bromium relies heavily on Intel's hardware virtualization for isolation.

Unlike either of the above Hypervisor or Container approaches,  Bromium isolates specific services in Windows, such as launching an application, downloading an email attachment, or clicking a hyper link in a browser.  When these activities are identified, Bromium creates a small task-specific "Microvisor" to encapsulate and segregate only the resources required for that task.  Mandatory Access Control policies ensure protection of the underlying Operating System, as well as any other apps running on the host.

When NSS Labs tested the Bromium architecture, it achieved a perfect score in defeating all malware, as well as manual and scripted attempts at penetration.


VMware just created its first Linux OS, and it’s container-friendly
Why Containers Instead of Hypervisors? 
IBM Systems Magazine: An LPAR Review 
Wikipedia: Workload Partitions
Wikipedia: Virtual machine 
Wikipedia: Operating-system-level virtualization 
Wikipedia: Chroot 
Best Practices for UNIX chroot() Operations 
Ubuntu: Basic chroot  
Containers—Not Virtual Machines—Are the Future Cloud 
Contain your enthusiasm - Part One: a history of operating system containers 
Accenture: Inspiration through Elevation: Simplified Configuration Management with Docker  
Gartner: Virtualization, Containers and Other Sandboxing Techniques Should be on Your Radar Screen 
Bromium vSentry Sets New Standard for Security Effectiveness 
NSSLABS: Threat Isolation Technology Test Report: Bromium vSentry
Bromium: Micro-virtualization for the Security Architect 

Wednesday, 4 March 2015

Tokenization as a companion to Encryption

For the protection of sensitive data, tokenization is every bit as important as data encryption.

(This article first ran in ITworld Canada in October 2014) 

We are all very familiar with the requirement to encrypt sensitive data at rest as well as in transit.  We have many tools that perform these functions for us. Our database systems allow for encryption as granular as a field, or as coarse as a table or an entire database.  Network file systems likewise allow for various degrees of encryption.  All of our tools for moving, viewing, and editing data have the ability to transport data encrypted via SSL/TLS or SCP.

Encryption, however, is intended to be reversed.  Sensitive data is still resident in the filestore/database, but in an obfuscated manner, meant to be decrypted for later use.  Backups of your data still contain a version of your original data.  Transaction servers working on this data may have copies of sensitive data in memory while processing.  Recently, the Target breach showed us that memory-resident data is not secure if the host is compromised.  Memory-scraping tools are among the payloads commonly delivered in a malware incursion.

As long as the valuable sensitive data such as Personally Identifiable Information (PII) or Payment Card Industry (PCI) resides in your facility, or is transmitted across your network, there is reason for a malicious threat agent to want to breach your network and obtain that information.

Additionally, the cost and time involved in regulatory compliance to ensure and attest to the security of that sensitive data can be daunting.   For PCI data, there are 12 rigorous Payment Card Industry Data Security Standard (PCI DSS) requirements that have to be signed off on annually.
For the rest of this discussion, I'm going to focus on credit card (PCI) data, as it is nearest and dearest to my field of experience, but the process is similar regardless of the type of sensitive data.

Tokenization is not encryption

Tokenization completely removes sensitive data from your network, and replaces it with a format preserving unique placeholder or  "token".  You no longer store an encrypted copy of the original data.  You no longer transmit an encrypted copy of the original data.  Transaction servers no longer keep a copy of the sensitive data in their memory.

With no data to steal, any network breach would prove fruitless.

The token value is randomly generated, but typically designed to retain the original format, i.e. credit card tokens retain the same length as a valid credit card number, and pass the same checksum validation algorithm as an actual credit card number, but cannot be reverse-engineered to recover the original credit card number.
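The checksum in question is the Luhn (mod-10) algorithm. A short sketch of validating a number and generating a random token that still passes the check:

```python
import secrets

def luhn_valid(number: str) -> bool:
    """Standard Luhn mod-10 check used to validate card numbers:
    double every second digit from the right, subtract 9 if over 9,
    and require the total to be divisible by 10."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def random_luhn_token(length: int = 16) -> str:
    """Random digits with the final digit chosen so the whole
    token passes the Luhn check, like a real card number would."""
    body = "".join(secrets.choice("0123456789") for _ in range(length - 1))
    for check in "0123456789":
        if luhn_valid(body + check):
            return body + check

assert luhn_valid("4111111111111111")   # a well-known Luhn-valid test PAN
token = random_luhn_token()
assert len(token) == 16 and luhn_valid(token)
```

Because the token is random rather than derived from the card number, passing the checksum tells an attacker nothing about the original PAN.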

Don't get me wrong, the actual data does get stored somewhere, but typically in an offsite, purpose-built, highly secure, managed and monitored vault.

In the case of PCI compliance, this vault and its associated security mechanisms are the only infrastructure that requires review/attestation.  The rest of your network, including the transaction servers, falls outside the scope of review.

Neither Tokenization nor Encryption is a silver bullet in and of itself, but the appropriate mix of each will greatly reduce your overall risk exposure, and potentially keep your name off the next Breach Report.

Also Read: PCI DSS Cloud Computing Guidelines - Overview

Securosis: Tokenization Guidance: How to reduce PCI compliance costs
PCI Security Standards Council: PCI Data Security Standard (PCI DSS)
Securosis: Tokenization vs. Encryption: Options for Compliance, version 2 
Cardvault: Credit Card Tokenization 101 – And Why it’s Better than Encryption
3 Core PCI-DSS Tokenization Models- Choosing the right PCI-DSS Strategy
Encryption and Tokenization
Data Encryption and Tokenization: An Innovative One-Two Punch to Increase Data Security and Reduce the Challenges of PCI DSS Compliance
Paymetric: Tokenization Amplified
Tokenization is About More Than PCI Compliance
Tokenization: The PCI Guidance
Blueline Tokenization Infrastructure and Tokenization as a Service 

Friday, 13 February 2015

Giving your network a shot in the arm! Darktrace: The Enterprise Immune System.

I understand that most of you reading this have never worked in a Security Operations Center or SOC for short, but you've all seen them in movies.. 

Sterile, brightly lit rooms of computer screens.  All showing spreadsheets or charts or static maps of the world.  I yawn even thinking of it.
And yet the men and women working this environment 24/7 are responsible for detecting that one little anomaly, or sorting out the REAL bad traffic patterns from among the thousands of false-positive bad traffic patterns that show up on their screens hourly.

Little wonder the poor Security Analysts over at Target missed the evidence in front of them.  The sheer enormity and chaos of data that assaults them in the course of their workday is stressful and overwhelming.  All the screens look the same: tables and columns and rows of information about network and security events, collected and forwarded by every device on the network.  Then hundreds or thousands of rules process them to try to find deviations from "normal traffic".   Like any network has "normal traffic".  Right...

I know.  I've worked in or around these systems for the past two decades.  I've seen the tools appear, mature, merge, morph, and become "fairly" useable.  But the false positives are still rampant, and low-and-slow "Advanced Persistent Threats" fly under the radar and typically don't show up here.

So when an upstart Security Analytics company called me late in 2013 to show me what they'd been working on, well... I couldn't have cared less.  Really... They tried hard to impress me with their pedigree: hailing from the minds of ex-MI5 Security Intelligence employees, and funded by Autonomy founder Mike Lynch.   But all big software stands on the shoulders of giants, right?

Then a few months ago, a friend of mine convinced me to come out to a public demo of their system. 

Five minutes in, I was awestruck. 

So let me take a second to say that the basis of their tools revolves around some very propeller-head complex math that we mere mortals could never comprehend.  They do not rely on rules or signatures or feeds from your network devices.  Yes... they DO require a network span or tap at critical aggregation points in your network, but from there they are able to watch, analyze, identify, and correlate your traffic over a period of time and, through machine learning techniques, develop an understanding of "normal traffic" within several contexts.

Darktrace touts itself as your "Enterprise Immune System": like the human body's immune system, it develops an understanding of "self", of what belongs or is normal, versus contaminants like bacteria or viruses. After a period of mapping your environment's traffic patterns (source, destination, port, protocol, time of day, day of year, etc.), Darktrace uses its learning algorithms to alert on traffic patterns that are NOT normal and therefore should be looked at. It learns what "normal" or "self" is for each device on your network.  The difference here is the heuristic learning.  Not rules, made by people who think they know the system.
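Darktrace's actual models are proprietary (the links below point at recursive Bayesian estimation), but the core idea of learning "normal" per device and flagging deviations can be sketched with something far simpler. The toy detector below learns a running mean and variance of one traffic feature per device and flags large deviations; the device names and the three-sigma threshold are illustrative assumptions, not anything from Darktrace.

```python
from collections import defaultdict
import math

class DeviceBaseline:
    """Toy per-device anomaly detector.

    Learns a running mean/variance of a single traffic feature (say,
    bytes sent per hour) for each device, using Welford's online
    algorithm, then flags observations far outside that baseline.
    A sketch of the concept only -- real products model many features
    with far more sophisticated (e.g. Bayesian) machinery.
    """

    def __init__(self, threshold_sigmas=3.0, min_history=10):
        self.threshold = threshold_sigmas
        self.min_history = min_history
        self.n = defaultdict(int)
        self.mean = defaultdict(float)
        self.m2 = defaultdict(float)  # running sum of squared deviations

    def observe(self, device, value):
        """Fold a new observation into the device's baseline."""
        self.n[device] += 1
        delta = value - self.mean[device]
        self.mean[device] += delta / self.n[device]
        self.m2[device] += delta * (value - self.mean[device])

    def is_anomalous(self, device, value):
        """True if value sits more than `threshold` std-devs off baseline."""
        if self.n[device] < self.min_history:
            return False  # not enough history to judge "normal" yet
        std = math.sqrt(self.m2[device] / (self.n[device] - 1))
        if std == 0:
            return value != self.mean[device]
        return abs(value - self.mean[device]) / std > self.threshold

# Learn "normal" hourly byte counts (in KB) for one desktop...
baseline = DeviceBaseline()
for kb in [100, 110, 95, 105, 102, 98, 103, 107, 99, 101, 104]:
    baseline.observe("desktop-42", kb)

# ...then test a typical hour versus a bulk-exfiltration-sized hour.
print(baseline.is_anomalous("desktop-42", 104))   # within baseline -> False
print(baseline.is_anomalous("desktop-42", 5000))  # way off baseline -> True
```

The point of the sketch is the shift in approach: nobody wrote a rule saying "desktop-42 may not send 5 MB an hour"; the threshold falls out of the device's own observed history.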

All very impressive... BUT... that's not really what caught my eye.  Sorry, Darktrace guys, but the person or people you can never let leave your company are the ones who wrote that AWESOMELY FUTURISTIC HUMAN INTERFACE!!!  Oh My God!
 (pause here to collect my breath)

Remember up top where I said how sterile and drab and monotonous staring at a gazillion screens full of spreadsheets was?   Well... now picture having the tools from Minority Report!  Yeah, you know the ones!   

The screen in front of me started off with a wireframe globe.  Little pins of light would show up, intensify, dim... whatever... I've seen this before.  But... our presenter took the mouse, spun the globe a few degrees, and zoomed in, "just like in the movies".

I got the feeling at first that this was canned video footage.  But then the presenter selected one of those intensifying lights and zoomed in, and as he zoomed, images of network devices started showing up, the lines between them glowing as well in various intensities and colors.  They then portrayed a communication session initiated from a desktop to a webserver: a faint white line... then immediately more light from that webserver back to another device that turned out to be an associated database server... AND more illuminated lines back to the network storage array...  That one transaction, a web page request I would imagine, allowed me to visualize *VISUALIZE* connectivity to the various sub-components of the web application's infrastructure.

Before anyone had a chance to ask about those red glowing devices and lines, the presenter clicked one and detailed how THIS was not typical traffic from that particular device at this time of day, nor from the area of the network being connected to.   Anomalous behavior, VISIBLE in real time.

On a 3D, rotatable, glowing thingamabobber of an Awesome Graphical User Interface.

If you want your Security Operations Center personnel to be engaged, alert, 
and notice the anomalies... 
let them play with Darktrace just for a few days.  I guarantee you'll leave it in.

Darktrace Corporate Overview.

Darktrace: Enterprise Immune System 
Darktrace: Recursive Bayesian Estimation 
Darktrace CEO Joins Prime Minister David Cameron on Official Cyber Security Visit to Washington D.C.  
Former MI5 chief advises Darktrace 
GCHQ Defence chief to head cyber security start-up Darktrace  

ZDNet: Darktrace: What happens when Bayesian analysis is turned on intruders 

Deloitte: The ‘Immune System’ of Enterprise IT?
How Threats Disguise Their Network Traffic 
TrendMicro: Network Detection Evasion Methods
What is “Normal Traffic” Anyway? (by Chris Greer) 
MI5: UK Security Intelligence

Cyber Security Exchange Conference with Darktrace