Proper Tokenization and Encryption Would Have Saved Capital One From A Potential Billion-Dollar Loss

We Need To Stop Finger Pointing … And Better Articulate “Good” from “Bad” Practice

The overall cost of the Capital One loss is not yet apparent, but judging by comparable recent breaches, the company could be looking at over one billion dollars. This hack was no BA, Equifax (with systems that were over 40 years old) or TalkTalk hack; it was against a company with good levels of investment in technology, and with the very latest infrastructure.

So what happens when you keep telling the world that data breaches could be solved by encryption and tokenization, and then the company gets hacked? Well, you analyse the problem and pinpoint how things could be improved. From what I see in the breach report, the scope of the breach was limited by the use of encryption and tokenization, but the actual implementation of these methods is more of a checkbox approach than a proper integration.

We operate, we learn, we improve.

Well, as security professionals, we are not great at articulating the facts behind data breaches. We often just post news items, then point and say “There you go again”. For many, the reaction is simply relief that it’s not their company.

But the Capital One data breach is a rather different hack, and as security professionals we should be picking over it, and finding out how we can improve security generally. For all the media hype, this is no BA hack: not one credit card or login detail was stolen. There was, though, a leak of other data, and Capital One must learn from this.

From the data breach notice, we see that Capital One uses both encryption and tokenization on its data, and it is these measures which are likely to have saved the company from a much larger breach.

The true source of the data breach does point towards a firewall not being configured correctly, and towards a weakness within an IAM role in AWS. But here is a flaw that is general in the industry … and I am going to state it …

Our encryption and tokenization methods are often flawed, as the protection merely “travels” alongside the data rather than being embedded into it.

So, although I have applied tokenization and encryption to the data, if I have the required system rights, I can resolve it. There is no concept here of complex rules for checking my accesses, or of restricting them in some way; if I log in with a certain level, I get the data.
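As a sketch of this checkbox pattern, here is a minimal vault-style tokenization model. It is purely illustrative (the class and method names are my own, and nothing here reflects Capital One’s actual implementation): the only gate on detokenization is a coarse system right, with no role, action, context, or location considered.

```python
import secrets

# Minimal vault-style tokenization sketch (illustrative only): a random
# token stands in for the real value, and the mapping lives in a
# server-side vault.
class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> real value

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str, has_system_rights: bool) -> str:
        # The flaw described above: the only check is a coarse system
        # right. No role, action, context, or location is considered.
        if not has_system_rights:
            raise PermissionError("no system rights")
        return self._vault[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
# Any principal that acquires the right (e.g. via a misconfigured IAM
# role) resolves the token straight back to the card number.
print(vault.detokenize(t, has_system_rights=True))
```

With this design, an attacker who gains the single “system right” (for example, by assuming an over-privileged cloud role) gets everything; the policy never asks who, where, or why.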

THIS IS CHECKBOX SECURITY!

I tell my CEO that I have encrypted and tokenized the data, and they are happy! For true encryption and tokenization, we must embed complete access control policies, which define ROLE, ACTION, CONTEXT, LOCATION, and so on. Thus, to detokenize the data on a given field, we might require that the IP address must resolve to Canada and that the user has a certain role.
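The field-level policy just described can be sketched as follows. The policy table, field names, and roles are illustrative assumptions of mine, not a real product’s API; the point is that detokenization is gated on role, action, and location together, not on a single system right.

```python
# Illustrative field-level access policy: detokenizing a given field
# requires that the requester's IP geolocates to Canada AND that they
# hold a given role. Names and fields are assumed for illustration.
FIELD_POLICIES = {
    "card_number": {"role": "fraud-analyst", "action": "read", "location": "CA"},
}

def may_detokenize(field: str, role: str, action: str, location: str) -> bool:
    policy = FIELD_POLICIES.get(field)
    if policy is None:
        return False  # no policy means no access, not open access
    return (role == policy["role"] and
            action == policy["action"] and
            location == policy["location"])

print(may_detokenize("card_number", "fraud-analyst", "read", "CA"))  # allowed
print(may_detokenize("card_number", "fraud-analyst", "read", "US"))  # denied
```

The key design choice is deny-by-default: an unknown field, a wrong role, or a request from outside the permitted geography all fail the check.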

We must not merely allow rights to travel alongside the data; they must be embedded into it. For us, methods such as CP-ABE (Ciphertext-Policy Attribute-Based Encryption) and properly integrated tokenization are the way forward, where we can define the policy for accesses and then generate the required encryption. Just now, if you have a given role, you get the encryption key, and the token match-up happens automatically. This is just a bump in the road, where we find the role that works, and get the data.
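The CP-ABE idea can be modelled in miniature like this. This is a toy of my own construction, not real CP-ABE: here the attribute check is a plain function for clarity, whereas in genuine CP-ABE schemes the policy is enforced mathematically by the pairing-based cryptography itself, so decryption simply fails for a non-matching attribute set. What the toy does capture is the shape of the idea: the policy lives inside the ciphertext and so travels with the data.

```python
from dataclasses import dataclass

# Toy model of the CP-ABE concept: the access policy is carried inside
# the ciphertext. In real CP-ABE the policy is enforced by the crypto,
# not by an if-statement; this sketch only shows the structure.
@dataclass(frozen=True)
class ABECiphertext:
    payload: bytes       # the (notionally) encrypted data
    policy: frozenset    # attributes required to decrypt

def encrypt(data: bytes, required_attrs: set) -> ABECiphertext:
    return ABECiphertext(payload=data, policy=frozenset(required_attrs))

def decrypt(ct: ABECiphertext, user_attrs: set) -> bytes:
    # Decryption succeeds only if the user's attribute set satisfies the
    # policy embedded in the ciphertext, so rights travel with the data.
    if not ct.policy <= user_attrs:
        raise PermissionError("attributes do not satisfy policy")
    return ct.payload

ct = encrypt(b"sensitive-field", {"role:analyst", "geo:CA"})
print(decrypt(ct, {"role:analyst", "geo:CA", "dept:fraud"}))  # succeeds
```

Note the contrast with the role-only model: copying the ciphertext out of the database gains an attacker nothing unless their attribute set satisfies the embedded policy.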

My take-away: if tokenization and encryption had been used properly, it could have saved Capital One, at least, one billion dollars! As a community and as industries, we must understand what “bad”, “good” and “best” look like, rather than continually pointing the finger at others. Tokenization and encryption are not tick-box technologies; they need to be carefully integrated with access control methods.

We should all be worried that the general public, the media, government and legal systems will not be able to differentiate bad practice from good practice. The Capital One hack is NOT like Equifax (a 40-year-old data infrastructure and poor patching) or like the BA hack (where credit card details were being siphoned off). Capital One uses encryption and tokenization, and has a state-of-the-art IT infrastructure, but we are still thinking of databases as things that hackers copy, and not as things that link to live systems. Once linked to a live system, the data is revealed, with little thought for the context of the access. If your infrastructure implements just role-based security, be worried! We need to improve our integration of encryption, access control and tokenization, and make sure they all work together.

As long as there are humans in companies, every company is open to a breach, even the best ones.