Tokenisation is used in the Payment Card Industry to replace sensitive data with innocuous substitutes that can be reversed when required. The data itself should not be easy to decipher, and it is the algorithm that determines the strength and degree of security of the tokenisation process.
Tokens can be used to protect sensitive data such as driver's licenses, financial statements, medical and criminal records, stock trades, and other information normally classified as personal or PII.
The technology was developed to comply with industry and government standards. In the PCI, tokens are used to represent sensitive card-holder information, which is usually stored in separate databases at off-site locations.
All organizations involved in storing, processing or transmitting card-holder information must comply with the Payment Card Industry Data Security Standard (PCI DSS), which stipulates that credit-card data must be protected wherever it is stored.
Tokenisation is the process used to replace credit-card numbers with a randomly generated string of numbers or characters. There are several methods used to format the tokens. The applications used by some token providers may generate tokens that match the format of the original data in length or character sequence. In some cases in the Payment Card Industry, the token may consist of the same number of digits as the bank-card number, and may retain some of the original data, such as the last four digits of the account.
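The format-matching approach described above can be sketched in a few lines. The following is an illustrative example only, not a production tokenizer (a real system must also guarantee token uniqueness and avoid generating strings that collide with valid card numbers); the function name `generate_token` is our own:

```python
import secrets

def generate_token(pan: str) -> str:
    """Generate a format-preserving token for a card number (PAN).

    Keeps the last four digits of the original PAN, as some PCI
    tokens do, and fills the remaining positions with random
    digits so the token has the same length as the original.
    """
    digits = [c for c in pan if c.isdigit()]
    last_four = "".join(digits[-4:])
    random_part = "".join(secrets.choice("0123456789")
                          for _ in range(len(digits) - 4))
    return random_part + last_four

token = generate_token("4111111111111111")
# Same length as the original PAN, and the last four digits survive,
# so a receipt can still print "ending in 1111".
```

Using `secrets` rather than `random` matters here: the token must not be predictable, or an attacker could infer the mapping back to real card numbers.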
During the request to authorize a transaction, the merchant may receive a token in place of the card number, along with an authorization code. The token may be stored in the system that receives the information, while the actual card-holder information is kept elsewhere in a secure system, often called a token vault, used exclusively for storing the mapping between tokens and card data. All storage facilities, whether virtual or physical, must comply with PCI DSS.
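The split described above, where the merchant holds only the token and a separate secured store holds the card data, can be sketched as follows. This is a minimal in-memory illustration under our own assumptions; the class name `TokenVault` and its methods are hypothetical, and a real vault would be a hardened, access-controlled service rather than a Python dictionary:

```python
import secrets

class TokenVault:
    """Minimal sketch of a token vault (hypothetical API).

    The merchant keeps only the opaque token; the vault holds
    the token-to-PAN mapping in a separate, secured store.
    """
    def __init__(self):
        self._store = {}

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)   # opaque token, carries no card data
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the card number.
        return self._store[token]

vault = TokenVault()
tok = vault.tokenize("4111111111111111")
# Merchant systems store `tok` alongside the authorization code;
# the real PAN never touches their database.
```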
In order for the token system to be compliant, it should meet the characteristics outlined below.
• The components of the tokenisation system must be located on secure networks, isolated from any networks that are out of scope. They should be designed according to stringent configuration standards and hardened against vulnerabilities.
• All communications in and out of the system must be handled securely.
• The solution should enforce robust security and cryptography protocols that protect card-holder information when it is stored or transmitted over public networks.
• Authentication and access controls must be implemented in accordance with PCI DSS requirements.
• The solution must support a mechanism for the secure deletion of data, as required by the data retention policy.
• The solution must provide monitoring, logging and alerting facilities to help identify any activity deemed suspicious.
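The monitoring and alerting requirement in the last bullet can be illustrated with a short sketch. This is a hypothetical example using Python's standard `logging` module; the function `audited_detokenize` and the store layout are our own assumptions, not part of any PCI-mandated API:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("token-vault")

def audited_detokenize(store, token, caller):
    """Look up a token, logging every attempt (illustrative sketch).

    Failed lookups are logged as warnings so that monitoring can
    flag suspicious activity, such as repeated invalid tokens
    submitted from a single caller.
    """
    pan = store.get(token)
    if pan is None:
        log.warning("failed detokenize attempt by %s", caller)
        return None
    log.info("token detokenized for %s", caller)
    return pan

store = {"tok123": "4111111111111111"}
audited_detokenize(store, "tok123", "billing-service")  # logged as info
audited_detokenize(store, "bogus", "unknown-host")      # logged as warning
```

In practice these log events would feed an alerting pipeline rather than the console, but the principle is the same: every access to the detokenisation path leaves an auditable trail.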
The use of tokens makes it harder for hackers to gain illegal access to card-holder information held outside the token storage system. Tokenisation can also make it easier to comply with PCI DSS requirements: because merchant systems no longer store sensitive data such as credit-card numbers, those systems are removed from the scope of PCI auditing.