
Timestamp

A timestamp is a specific date and time "stamped" on a digital record or file. While most often used as a noun, the word "timestamp" can also be a verb. For example, "The tweet was timestamped on January 8, 2021, at 10:44 AM."

Timestamps are the standard way to store dates and times on computer systems. For example, operating systems timestamp each file with creation and modification dates and times. Digital cameras timestamp each captured photo. Social media platforms store a timestamp with each post, such as the Twitter example above.
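As a minimal sketch of reading these file timestamps, the Python snippet below (which assumes a Python 3 environment and inspects its own source file as a convenient example) prints a file's last-modified time:

import os
from datetime import datetime, timezone

# Read the operating system's metadata for this script's own file.
stats = os.stat(__file__)

# st_mtime is the last-modified time as a Unix timestamp (seconds since the epoch).
modified = datetime.fromtimestamp(stats.st_mtime, tz=timezone.utc)
print("Modified:", modified.strftime("%Y-%m-%d %H:%M:%S UTC"))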

While timestamps are universal, there is no universal timestamp format. A programming language may use one method, while a database may use another. Even operating systems store timestamps differently. For instance, Windows stores file timestamps (FILETIME values) as the number of 100-nanosecond intervals since January 1, 1601, while Unix stores timestamps as the number of seconds that have elapsed since midnight UTC on January 1, 1970. Because several different timestamp formats exist, most modern programming languages include built-in timestamp conversion functions.
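As a rough illustration of such a conversion, the Python sketch below turns a Windows-style FILETIME value (100-nanosecond intervals since 1601) into a Unix timestamp; the sample value is constructed for the example rather than taken from a real file:

from datetime import datetime, timezone

# Seconds between the Windows epoch (1601-01-01) and the Unix epoch (1970-01-01).
EPOCH_DIFF_SECONDS = 11_644_473_600

def filetime_to_unix(filetime: int) -> float:
    # FILETIME counts 100-nanosecond intervals, so divide by 10 million to get seconds.
    return filetime / 10_000_000 - EPOCH_DIFF_SECONDS

# A sample FILETIME value corresponding to 2021-01-16 16:15:30 UTC.
ft = (1_610_813_730 + EPOCH_DIFF_SECONDS) * 10_000_000
print(datetime.fromtimestamp(filetime_to_unix(ft), tz=timezone.utc))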

A Unix timestamp is also known as "epoch time," which is equivalent to the number of seconds since January 1, 1970, at 12:00 AM GMT (Greenwich Mean Time). This GMT (or "UTC") date/time may be displayed by default on Unix and Linux devices (including Android) when a valid timestamp is not available. In time zones west of GMT, such as those in the United States, this date appears as December 31, 1969.
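The effect is easy to reproduce. The Python sketch below (Python 3.9+ for the zoneinfo module; the US Central time zone is just an example) formats a timestamp of zero in both UTC and a US time zone:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A Unix timestamp of 0 is midnight on January 1, 1970 in UTC...
print(datetime.fromtimestamp(0, tz=timezone.utc))                 # 1970-01-01 00:00:00+00:00

# ...but the same instant falls on the previous day in US Central Time (UTC-6).
print(datetime.fromtimestamp(0, tz=ZoneInfo("America/Chicago")))  # 1969-12-31 18:00:00-06:00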

Storing a timestamp as an integer is efficient since it requires minimal storage space. However, the number must be converted to a legible time format when displayed. MySQL has a TIMESTAMP data type, which conveniently stores timestamps in the following format:

YYYY-MM-DD HH:MM:SS
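For instance, the Python snippet below (the variable names are arbitrary) stores the current time as a compact integer and then converts it into this legible layout:

import time
from datetime import datetime, timezone

# The compact integer form of the current time...
ts = int(time.time())
print(ts)

# ...and the same moment formatted like MySQL's TIMESTAMP display (shown in UTC).
print(datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S"))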

MySQL stores TIMESTAMP values in UTC (Coordinated Universal Time), the time standard that corresponds to GMT. So January 16, 2021, at 10:15:30 AM US Central Time (UTC-6) would be stored in a MySQL database as follows:

2021-01-16 16:15:30

Converted to a Unix timestamp, this date/time would be represented as:

1610813730
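The arithmetic above can be checked with a short Python sketch (Python 3.9+ for the zoneinfo module):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# January 16, 2021, at 10:15:30 AM US Central Time (UTC-6 in winter).
central = datetime(2021, 1, 16, 10, 15, 30, tzinfo=ZoneInfo("America/Chicago"))

# The same instant in UTC, the form MySQL stores.
print(central.astimezone(timezone.utc).strftime("%Y-%m-%d %H:%M:%S"))  # 2021-01-16 16:15:30

# And as a Unix timestamp (seconds since the 1970 epoch).
print(int(central.timestamp()))  # 1610813730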

Timestamps also vary in resolution, or specificity. In some cases, seconds are sufficient, while in others, milliseconds or even nanoseconds are required. The Unix timestamp above would be 1610813730000 in milliseconds, which provides a resolution of one-thousandth of a second. Computing operations may require timestamps with even higher resolution. PHP, for example, includes a microtime() function that outputs a timestamp in microseconds, with a resolution of one-millionth of a second.
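Comparable functions exist in other languages. The Python sketch below reads the current time at second, millisecond, and nanosecond resolution (the values come from separate clock reads, so they differ slightly):

import time

seconds = time.time()         # float seconds since the epoch, e.g. 1610813730.5
millis = int(seconds * 1000)  # millisecond resolution, e.g. 1610813730500
nanos = time.time_ns()        # integer nanoseconds (Python 3.7+)

print(seconds, millis, nanos)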

Updated: January 16, 2021

Cite this definition:

https://techterms.com/definition/timestamp

