Cloud Data Tokenization
PerspecSys tokenization technology solves cloud data residency, data privacy and data security challenges for enterprises using cloud applications.
Tokenization is a process by which a sensitive data field, such as a Primary Account Number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While various approaches to creating tokens exist, they are frequently just randomly generated values that have no mathematical relation to the original data field. This underlies the security of the approach: it is practically impossible to determine the original value of a sensitive data field by knowing only the surrogate token value.
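The process above can be sketched in a few lines. This is an illustrative toy, not a production design: the `TokenVault` class and its method names are hypothetical, and a real vault would be a hardened, access-controlled database rather than an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Toy token vault mapping random surrogate tokens to original values.

    Illustrative sketch only; real vaults are hardened databases kept
    in a secure location behind a company's firewall.
    """

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        # The token is randomly generated, so it has no mathematical
        # relation to the original value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # De-tokenization is a simple look-up; without access to the
        # vault, the token reveals nothing about the original value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # a sample (test) PAN
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

Note that the only path from token back to original value is the vault itself, which is why where the vault resides matters so much for compliance.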
Depending on the particular implementation of a tokenization solution, tokens can help companies comply with requirements governing how sensitive data must be treated and secured under mandates such as PCI DSS, HITECH & HIPAA, ITAR, and the Gramm-Leach-Bliley Act.
Whether sensitive data resides within on-premise systems or in the cloud, transmitting and storing tokens instead of the original data fields is an acknowledged, industry-standard method for securing data.
Visa’s publication, Visa Best Practices for Tokenization, offers industry recommendations for using tokenization as a standalone solution or in conjunction with encryption to help merchants keep sensitive cardholder information out of the cloud. The guidelines help retailers and high-volume transaction-based companies maintain compliance with PCI DSS standards.
How is Tokenization Different From Encryption?
Encryption is an obfuscation approach that uses a cipher algorithm to mathematically transform sensitive data’s original value into a surrogate value. The surrogate can be transformed back to the original value using a “key”, which can be thought of as the means to undo the mathematical lock.
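The “mathematical lock” idea can be illustrated with a deliberately simple cipher. This XOR sketch is a toy for illustration only (it is not a secure cipher): the point is that the same key that locks the value also unlocks it, so anyone holding the key can recover the original.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte of the data with the
    # repeating key. Applying the same key a second time undoes
    # the transformation -- the mathematical link remains.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"4111111111111111"   # a sample (test) PAN
key = b"secret"
ciphertext = xor_cipher(plaintext, key)
assert ciphertext != plaintext
assert xor_cipher(ciphertext, key) == plaintext  # key undoes the lock
```

Contrast this with a random token: no key exists that can transform the token back into the PAN, because the two values were never mathematically related in the first place.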
So while encryption clearly can be used to obfuscate a value, a mathematical link back to its true form still exists. Tokenization is unique in that it completely removes the original data from the systems in which the tokens reside. As such, the advantages of tokenization are:
- Tokens cannot be reversed back to their original values without access to the original “look-up” table that maps them to their original values. These tables are typically kept in a “hardened” database in a secure location inside a company’s firewall.
- Tokens can be made to maintain the same structure and data type as their original values.
While format-preserving encryption can likewise retain the structure and data type of the original, the result is still reversible to the original value for anyone holding the key and algorithm.
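The second advantage, preserving structure and data type, can be sketched as follows. The helper function here is hypothetical, and the convention of keeping the last four digits for display is an assumption (it is a common practice, not something mandated by the text): the generated token has the same length and digit-only format as the PAN, so downstream systems that expect a 16-digit number keep working.

```python
import secrets

def format_preserving_token(pan: str, keep_last: int = 4) -> str:
    # Hypothetical helper: build a random token with the same length
    # and digit-only structure as the original PAN, preserving the
    # last few digits (a common display convention). The random
    # portion has no mathematical relation to the original digits.
    random_part = "".join(
        str(secrets.randbelow(10)) for _ in range(len(pan) - keep_last)
    )
    return random_part + pan[-keep_last:]

token = format_preserving_token("4111111111111111")
assert len(token) == 16 and token.isdigit()
assert token.endswith("1111")  # last four digits preserved
```

Because the token still looks like a card number, validation rules, field lengths, and database schemas need not change, which is a key reason tokenization fits cleanly into existing cloud applications.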
PerspecSys Tokenization & Residency
Because tokens cannot be reversed back to their original values, tokenization is frequently the de facto approach to addressing a market requirement known as residency. Depending on the countries in which they operate, companies often face strict regulatory guidelines governing their treatment of sensitive customer and employee information. These data residency laws mandate that certain types of information must remain within a defined geographic jurisdiction. In cloud environments, where datacenters can be located in various parts of the world, tokenization can be used to keep sensitive data local (resident) while tokens are stored and processed in the cloud.
Find out more about PerspecSys Tokenization and cloud security issues.
Learn more about encryption.
Learn more by visiting our Knowledge Center.