Tokenization

Tokenization is the process of substituting a sensitive data element, such as a payment card number, with a non-sensitive equivalent called a token, typically a string of random digits. The token is an identifier that maps back to the sensitive data through a tokenization system, and the mapping is designed so that a token is infeasible to reverse without access to that system.

When tokens replace live data, the applications, data stores, people, and processes that handle them are never exposed to the sensitive data itself, which reduces the risk of compromise, accidental exposure, and unauthorized access. Applications can operate entirely on tokens, with the exception of a small number of trusted applications explicitly permitted to detokenize when strictly necessary for an approved business purpose, such as processing a card payment.
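To make the mapping concrete, here is a minimal sketch of a vault-style tokenization service in Python. Everything in it is illustrative: the TokenVault class, the 16-digit token format, and the in-memory dictionaries are assumptions made for the example, and a real system would keep the vault in a hardened, access-controlled data store and restrict detokenization to authorized callers.

```python
import secrets


class TokenVault:
    """Illustrative in-memory tokenization service (a sketch, not production code)."""

    def __init__(self):
        self._token_to_value = {}  # token -> original sensitive value
        self._value_to_token = {}  # sensitive value -> token (idempotent reuse)

    def tokenize(self, value: str) -> str:
        """Return the token for a value, minting a new random one if needed."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        # A token built from random digits has no mathematical relationship
        # to the original value, so it cannot be reversed without the vault.
        while True:
            token = "".join(str(secrets.randbelow(10)) for _ in range(16))
            if token not in self._token_to_value:
                break
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Map a token back to the live value. In practice, only a small set
        of trusted applications would ever be allowed to call this."""
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")  # well-known test card number
assert vault.detokenize(token) == "4111111111111111"
print(token)  # e.g. '5048127390216534' -- reveals nothing about the card
```

Because each token is generated randomly rather than derived from the original value, a downstream system that stores only tokens holds nothing an attacker can reverse; the sensitive mapping lives solely in the vault, which becomes the one component that must be strongly protected.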

To get started, submit an online application and an expert payment specialist will contact you shortly to complete the setup and activation of your account.
