How do you write 2,000,000 as an integer?

An integer is a whole number (not a fraction) that can be positive, negative, or zero. Unlike floating-point numbers, integers cannot have decimal places.

So, 2 million as an integer is 2,000,000.

Integers can be negative {−1, −2, −3, −4, −5, ...}, positive {1, 2, 3, 4, 5, ...}, or zero {0}.

We can put that all together like this: Integers = { ..., -5, -4, -3, -2, -1, 0, 1, 2, 3, 4, 5, ... }

Examples: −16, −3, 0, 1 and 198 are all integers. But numbers like ½, 1.1 and 3.5 are not integers.

Integers are a commonly used data type in computer programming. For example, whenever a number is being incremented, such as within a "for loop" or "while loop," an integer is used. Integers are also used to determine an item's location within an array.
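As a small illustrative sketch (the list contents here are made up for the example), a Python loop shows both uses at once: the counter is an integer, and that same integer locates an item within a list.

```python
items = ["apple", "banana", "cherry"]

# The loop variable i is an integer, incremented on each pass of the "for loop".
for i in range(len(items)):
    # i also serves as the item's location (index) within the list.
    print(i, items[i])
```

Running this prints each index alongside its item, starting from index 0.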

When two integers are added, subtracted, or multiplied, the result is also an integer. However, when one integer is divided by another, the result may be an integer or a fraction.

For example, 6 divided by 3 equals 2, which is an integer, but 6 divided by 4 equals 1.5, which is not. A decimal result may be either rounded or truncated to produce an integer.
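The same arithmetic can be checked in Python, where rounding and truncation are built in:

```python
# Dividing two integers may or may not give an integer value.
print(6 / 3)         # 2.0 (a whole value, though Python returns it as a float)
print(6 / 4)         # 1.5 (not an integer)

# Rounding or truncating recovers an integer result:
print(round(6 / 4))  # 2 (rounds to the nearest integer)
print(int(6 / 4))    # 1 (truncates, dropping the decimal part)
print(6 // 4)        # 1 (floor division gives an integer directly)
```

Note that rounding and truncation can give different answers, as they do for 1.5 here.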

Thursday, September 14, 2017
Source: https://www.mathsisfun.com/whole-numbers.html
