DEFINITION

Tokenization

Tokenization is the process of replacing sensitive data with a unique non-sensitive equivalent called a token. It is most commonly used for added security when handling sensitive data such as credit card numbers and personal identifiers in industries like healthcare and finance. Tokenization has also become essential in blockchain applications, where it adds a layer of security and anonymity.

For tokenization to function as intended, businesses also need a secure vault to hold the original information. These vaults must comply with the Payment Card Industry Data Security Standard (PCI DSS) to ensure robust security.
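The token-and-vault mechanism can be sketched in a few lines. This is a minimal illustration only: the in-memory dictionary stands in for a hardened, PCI DSS-compliant vault, and the function names are hypothetical.

```python
import secrets

# Hypothetical stand-in for a secure, access-controlled token vault.
vault = {}

def tokenize(sensitive_value: str) -> str:
    """Replace sensitive data with a random, non-sensitive token."""
    # The token is pure random data; it derives nothing from the input.
    token = secrets.token_hex(8)
    vault[token] = sensitive_value
    return token

def detokenize(token: str) -> str:
    """Recover the original value; only the vault can reverse a token."""
    return vault[token]

card = "4111 1111 1111 1111"
token = tokenize(card)
assert token != card              # the token reveals nothing about the card
assert detokenize(token) == card  # only the vault maps it back
```

Because the token is generated randomly rather than computed from the card number, intercepting it yields no information; recovery requires authorized access to the vault itself.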


Synonyms

Data obfuscation, surrogate data mapping, encrypted identifier generation

Examples

A person makes an online purchase using their credit card. Instead of storing the credit card information, the merchant's payment gateway creates a token for further use, for example the string SFS00-LJAI45. This string contains none of the original credit card data or any identifiable information about the user.

However, the token can still be used to process the transaction, while the actual credit card information is securely stored in a PCI DSS-compliant vault. This way, even if a hacker intercepted the transaction, they could not use or extract any information from the random token string.

FAQ

How is tokenization different from encryption?

Both tokenization and encryption keep sensitive data safe and unreadable to unauthorized persons, but they do so in different ways. Encryption masks the data and can be reversed by anyone holding the encryption key, and encryption keys can be stolen by threat actors. Tokenization, on the other hand, completely replaces the sensitive data with a random string that is useless even if stolen. To get the original data, a person must both hold the token and prove their identity when accessing the token vault.

Tokenization is therefore generally considered safer, but because of its added complexity it is typically used only for small pieces of sensitive data, such as credit card numbers. Encryption, on the other hand, is widely used, scales easily, and can handle very large data sets.
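The contrast can be made concrete with a short sketch. The XOR "cipher" below is a toy used purely to illustrate that encryption is a key-reversible function of the plaintext, whereas a token is random data with no mathematical relationship to it; it is not a secure algorithm.

```python
import secrets

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher for illustration only -- NOT secure encryption.
    return bytes(b ^ k for b, k in zip(data, key))

card = b"4111111111111111"
key = secrets.token_bytes(len(card))

# Encryption: deterministic and reversible by anyone holding the key.
ciphertext = toy_encrypt(card, key)
assert toy_encrypt(ciphertext, key) == card  # key recovers the plaintext

# Tokenization: the token is pure random data, so there is no key that
# can "decrypt" it. Recovering the card requires access to the vault
# mapping, not a mathematical operation.
token = secrets.token_hex(8)
assert bytes.fromhex(token) != card
```

This is why a stolen ciphertext plus a stolen key compromises the data, while a stolen token on its own compromises nothing.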

What is tokenization used for?

Tokenization is used for secure payment processing, protecting sensitive customer data, anonymizing medical records, and representing digital assets on the blockchain.

Do regulations require tokenization?

Regulations such as the PCI DSS and the General Data Protection Regulation (GDPR) effectively necessitate tokenization in specific industries, because it minimizes the storage and transmission of sensitive data.

Join the Future of Banking

Book your demo today and see why leading financial institutions worldwide trust Atfinity to drive their digital transformation.
