  1. Tokenization (data security) - Wikipedia

    To protect data over its full lifecycle, tokenization is often combined with end-to-end encryption to secure data in transit to the tokenization system or service, with a token replacing the original data on return.

  2. What is tokenization? - IBM

    In data security, tokenization is the process of converting sensitive data into a nonsensitive digital replacement, called a token, that maps back to the original. Tokenization can help protect sensitive …

  3. What is tokenization? | McKinsey

    Jul 25, 2024 · Tokenization is the process of creating a digital representation of a real thing. Tokenization can also be used to protect sensitive data or to efficiently process large amounts of data.

  4. What is data tokenization? The different types, and key use cases

    Apr 16, 2025 · Data tokenization as a broad term is the process of replacing raw data with a digital representation. In data security, tokenization replaces sensitive data with randomized, nonsensitive …

  5. What is Data Tokenization? [Examples, Benefits & Real-Time …

    Jul 9, 2025 · Data tokenization is a method of protecting sensitive information by replacing it with a non-sensitive equivalent — called a token — that has no exploitable meaning or value outside of its …

  6. What is Data Tokenization? [Examples & Benefits] | Airbyte

    Sep 10, 2025 · The fundamental principle behind tokenization lies in data substitution rather than data transformation. Unlike encryption, which mathematically converts data into ciphertext, tokenization …

  7. Explainer: What is tokenization and is it crypto's next big thing?

    Jul 23, 2025 · But it generally refers to the process of turning financial assets - such as bank deposits, stocks, bonds, funds and even real estate - into crypto assets. This means creating a record on digital...

  8. Data Tokenization - A Complete Guide - ALTR

    Aug 11, 2025 · Tokenization is a data security technique that replaces sensitive information—such as personally identifiable information (PII), payment card numbers, or health records—with a non …

  9. Tokenization Explained: What Is Tokenization & Why Use It? - Okta

    Sep 2, 2024 · Tokenization involves protecting sensitive, private information with something scrambled, which users call a token. Tokens can't be unscrambled and returned to their original state.

  10. How Does Tokenization Work? Explained with Examples - Spiceworks

    Mar 28, 2023 · Tokenization is defined as the process of hiding the contents of a dataset by replacing sensitive or private elements with a series of non-sensitive, randomly generated elements (called a …
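Several of the results above make the same core distinction: tokenization substitutes a randomly generated, non-sensitive token for the original value and keeps the mapping in a lookup table (a "token vault"), whereas encryption mathematically transforms the data into ciphertext. A minimal sketch of that vault approach in Python, using hypothetical class and method names (a real system would add encryption at rest, access control, and auditing):

```python
import secrets

class TokenVault:
    """Illustrative token vault: random tokens mapped to sensitive values."""

    def __init__(self):
        self._by_token = {}   # token -> original value
        self._by_value = {}   # original value -> token (idempotent tokenize)

    def tokenize(self, value: str) -> str:
        # Return the existing token so one input always maps to one token.
        if value in self._by_value:
            return self._by_value[value]
        # The token is random, so it has no mathematical relationship to
        # the original value -- unlike ciphertext produced by encryption.
        token = secrets.token_hex(8)
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # The original can only be recovered by looking it up in the vault.
        return self._by_token[token]

vault = TokenVault()
pan = "4111111111111111"          # example card number (test value)
tok = vault.tokenize(pan)         # non-sensitive stand-in for the PAN
original = vault.detokenize(tok)  # recoverable only via the vault
```

Because the token carries no exploitable information, it can flow through logs, databases, and downstream systems; only the vault owner can map it back.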