A trillion is one million multiplied by one million. Or, if you prefer, a thousand billion. It contains a total of **12 zeroes**: 1,000,000,000,000.

The number of computers in the world today is estimated to be around 5 billion. At an average price of, say, $200 apiece, $1 trillion would be enough money to buy every one of them.

$1 trillion is more than the annual output of all but a handful of national economies. Back in 2004, the entire global economy produced about $50 trillion of goods and services, so even then $1 trillion amounted to roughly 2% of everything the world made in a year.

The United States alone has a gross domestic product (the sum of all the goods and services produced in a country) of about $15 trillion. Even so, $1 trillion is roughly what the entire US economy produces in a little over three weeks.

In conclusion, $1 trillion looks like a lot of money because it is: no individual, however rich, could realistically spend it.

A trillion is a 1 followed by 12 zeros, and it looks like this: 1,000,000,000,000. After trillion, the next named number is quadrillion, which is a 1 followed by 15 zeros: 1,000,000,000,000,000.

Trillions are used in **scientific notation** to represent large numbers. For example, a trillion is written as 10^12, which equals 1,000,000,000,000. Astronomers use such numbers to describe the sizes and distances of stars and planets, and scientists also use trillions when discussing large amounts of data. For example, astronomers estimate the mass of our galaxy to be equivalent to about **a trillion suns**.
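The equivalence between the 10^12 form and the written-out form is easy to check in code; a minimal Python sketch:

```python
# Three equivalent ways to write a trillion in Python.
trillion_power = 10**12              # ten to the 12th power
trillion_sci = 1e12                  # scientific-notation float literal
trillion_digits = 1_000_000_000_000  # underscores group the 12 zeros

print(trillion_power == trillion_sci == trillion_digits)  # True
print(f"{trillion_power:e}")  # 1.000000e+12
```

The `:e` format spec prints any number in scientific notation, which is handy once values get this large.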

Quadrillion is the next named number after trillion. It contains 15 zeros, so it is written as 1,000,000,000,000,000. This huge number is used in mathematics and physics to describe extremely large quantities; for example, a supercomputer that performs a quadrillion floating-point operations per second is called a petaflop machine.

When writing out a number that big, mathematicians often use scientific notation to keep track of how many zeros it contains. The number is rewritten as a value between 1 and 10 multiplied by a power of ten, where the exponent is the largest power of ten that fits into the number. For example, 1,000,000,000,000 becomes 1 × 10^12.
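The idea of pulling out the largest fitting power of ten can be sketched in Python; `to_scientific` is a hypothetical helper name, not a standard function:

```python
import math

def to_scientific(x: float) -> tuple[float, int]:
    """Decompose a positive number into (mantissa, exponent) with 1 <= mantissa < 10."""
    exponent = math.floor(math.log10(x))  # largest power of ten that fits into x
    mantissa = x / 10**exponent
    return mantissa, exponent

print(to_scientific(1_000_000_000_000))  # (1.0, 12)
print(to_scientific(3_000_000))          # (3.0, 6)
```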

Every number between a trillion and a quadrillion is named using the term "trillion": two trillion, a hundred trillion, and so on, just as the numbers between a million and a billion are all named using "million".

When you multiply a trillion by 10, the result is ten trillion; multiply by 10 again for a hundred trillion, and once more for a quadrillion, or in **digits**, 1×10^15. This goes on forever without **any limit**: however many trillions you count, you can always add another power of ten and keep going.

It's also easy to see why people came up with ways to make much larger numbers readable. One method is to split the digits into groups of three, like this: 3,000,000. That way a long run of zeroes at the end of a number stays easy to count.
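This kind of digit grouping is built into Python's string formatting; a quick sketch:

```python
n = 3_000_000          # underscores in the literal mirror the grouping idea
print(f"{n:,}")        # commas every three digits: 3,000,000
print(f"{10**12:,}")   # 1,000,000,000,000
```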

Another method is to use figures that are easier for humans to deal with, such as powers of ten. For example, one way to write 1000 is as 10^3, and to write 0.01 you can simply write 10^-2.
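Powers of ten map directly onto exponent syntax in Python:

```python
# Positive exponents give large numbers, negative exponents give small ones.
print(10**3)    # 1000
print(10**-2)   # 0.01
print(10**12)   # 1000000000000, a trillion
```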

A trillion is 1,000,000,000,000, also known as 10 to the 12th power. It is greater than a million and greater than a billion. In some countries that use the long scale, "trillion" instead means 1,000,000,000,000,000,000. (Yes, that's 18 zeroes.)

The number of cells in a single human body is estimated to be in the tens of trillions. By another comparison, the world's population is several billion people, so a trillion of anything amounts to more than a hundred for every person on earth. This is a very large number.

In terms of **storage capacity**, a typical hard disk drive today is capable of storing **about 250 gigabytes** of information. A trillion bytes is one terabyte, so it would take four such 250-gigabyte drives to hold a trillion bytes. That's a lot of data!
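As a sanity check on the storage figures, assuming decimal (SI) units where one gigabyte is 10^9 bytes:

```python
# Assumed figures: a 250-gigabyte drive, using decimal (SI) units,
# where one gigabyte is 10**9 bytes and a trillion bytes is one terabyte.
TRILLION_BYTES = 10**12
DRIVE_BYTES = 250 * 10**9

drives_needed = TRILLION_BYTES // DRIVE_BYTES
print(drives_needed)  # 4
```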

A binary number is a sequence of ones and zeros, called **a bit string**. A bit string of n bits can represent values from 0 to 2^n - 1. For example, a bit string that is 7 digits long can represent any value from 0 up to 2^7 - 1 = 127.

A bit string of length 1 is called a bit, and a group of 8 bits is called a byte. A binary number is called "binary" because each digit can be either a one or a zero.
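Python's built-in base-2 conversions make these definitions concrete; the example bit string here is arbitrary:

```python
n_bits = 7
print(2**n_bits - 1)        # 127, the largest value a 7-bit string can hold

bits = "1011101"            # an arbitrary 7-bit string
value = int(bits, 2)        # interpret it as a base-2 number
print(value)                # 93
print(format(value, "b"))   # back to the bit string: 1011101
```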

Another way to remember it is that a trillion contains **12 zeros**, or that it is a million million. This uses **the same principle** as other ways of remembering large numbers: each named group contributes a fixed count of zeros, so "million" contributes six and "million million" doubles that to twelve.

There are several methods used to calculate how many zeros are in a trillion. The most straightforward is to count by named groups: each step from thousand to million to billion to trillion adds three zeros, and four such groups give 3 × 4 = 12 zeros. Another method is to take the base-10 logarithm: log10 of 1,000,000,000,000 is exactly 12, and for any power of ten the logarithm equals the number of zeros.

An easier way to calculate this number is to note that a trillion is simply one million millions: a million has 6 zeros, so a million millions has 6 + 6 = 12 zeros after the 1.
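Both zero-counting approaches are easy to verify in Python:

```python
import math

trillion = 10**12

# Method 1: count the zero digits in the decimal representation.
print(str(trillion).count("0"))  # 12

# Method 2: for a power of ten, the base-10 logarithm equals the zero count.
print(math.log10(trillion))  # 12.0
```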