

Cryptography is the science of protecting information by transforming it into a secure format. This process, called encryption, has been used for centuries to prevent handwritten messages from being read by unintended recipients. Today, cryptography is used to protect digital data. It is a division of computer science that focuses on transforming data into formats that cannot be recognized by unauthorized users.

An example of basic cryptography is an encrypted message in which letters are replaced with other characters. To decode the encrypted contents, you would need a grid or table that defines how the letters are substituted. For example, the translation grid below could be used to decode "1234125678906" as "techterms.com".

1 = t    6 = m
2 = e    7 = s
3 = c    8 = .
4 = h    9 = c
5 = r    0 = o
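The decoding process described above can be sketched in a few lines of Python. This is an illustrative implementation of the article's substitution grid, not code from TechTerms itself:

```python
# Substitution grid from the article: each digit maps to one character.
CIPHER = {
    "1": "t", "2": "e", "3": "c", "4": "h", "5": "r",
    "6": "m", "7": "s", "8": ".", "9": "c", "0": "o",
}

def decode(ciphertext: str) -> str:
    """Replace each digit in the message with its character from the grid."""
    return "".join(CIPHER[digit] for digit in ciphertext)

print(decode("1234125678906"))  # techterms.com
```

Note that the mapping is not reversible in general: both "3" and "9" decode to "c", so an encoder would have to choose which digit to use for each "c".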

The above table is also called a cipher. Ciphers can be simple translation codes, such as the example above, or complex mathematical algorithms. While simple codes sufficed for encoding handwritten notes, computers can easily break, or figure out, these types of codes. Because computers can process billions of calculations per second, they can break weak or outdated encryption algorithms in a matter of seconds. Therefore, modern cryptography involves developing encryption methods that are difficult for even supercomputers to break.
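To see why simple codes are no match for a computer, consider a classic shift (Caesar) cipher. The brute-force attack below tries every possible key in a fraction of a second; the cipher, crib word, and function names here are illustrative examples, not part of the article:

```python
import string

ALPHABET = string.ascii_lowercase

def caesar_encrypt(text: str, shift: int) -> str:
    """Shift each lowercase letter forward by `shift` positions."""
    shifted = ALPHABET[shift:] + ALPHABET[:shift]
    return text.translate(str.maketrans(ALPHABET, shifted))

def brute_force(ciphertext: str, crib: str):
    """Try all 26 keys; return the one whose plaintext contains the crib word."""
    for shift in range(26):
        candidate = caesar_encrypt(ciphertext, (-shift) % 26)
        if crib in candidate:
            return shift, candidate
    return None

secret = caesar_encrypt("meet me at noon", 7)
print(brute_force(secret, "noon"))  # (7, 'meet me at noon')
```

With only 26 possible keys, exhaustive search is instant; modern ciphers such as AES avoid this by using keys with far too many possibilities (2^128 or more) to try them all.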

Updated: July 15, 2015


This page contains a technical definition of Cryptography. It explains in computing terminology what Cryptography means and is one of many technical terms in the TechTerms dictionary.

All definitions on the TechTerms website are written to be technically accurate but also easy to understand. If you find this Cryptography definition to be helpful, you can reference it using the citation links above. If you think a term should be updated or added to the TechTerms dictionary, please email TechTerms!