The inch is a unit of length equal to 2.54 cm. It is mainly used in English-speaking countries, but it is used all over the world to measure the screens of electronic devices such as TVs, smartphones and PCs. Let’s see why, and how conversions from inches to centimeters and vice versa are done.
How many centimeters is an inch: calculation with the formula
The inch, which is one twelfth of a foot, equals 2.54 centimeters and is indicated with a double prime (10 inches is written as 10″). Consequently, to convert from inches to centimeters you multiply by the conversion factor 2.54, according to the formula
size in cm = size in inches × 2.54.
For example, to determine how many centimeters correspond to 10″, we calculate 10 × 2.54, obtaining 25.4 cm. Conversely, to convert from centimeters to inches we divide by 2.54, according to the formula
size in inches = size in cm / 2.54
So to find out how many inches correspond to 10 cm, it is enough to divide by 2.54: 10 cm = 10/2.54, that is, about 3.94 inches.
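The two formulas above can be sketched as a pair of helper functions (the function names are illustrative, not from any standard library):

```python
# Exact conversion using the factor 2.54
CM_PER_INCH = 2.54

def inches_to_cm(inches: float) -> float:
    """Convert inches to centimeters: multiply by 2.54."""
    return inches * CM_PER_INCH

def cm_to_inches(cm: float) -> float:
    """Convert centimeters to inches: divide by 2.54."""
    return cm / CM_PER_INCH

print(round(inches_to_cm(10), 2))  # → 25.4
print(round(cm_to_inches(10), 2))  # → 3.94
```

The results match the worked examples in the text: 10″ is 25.4 cm, and 10 cm is about 3.94″.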
To convert from inches to cm and vice versa we use the conversion factor 2.54, which is not a very convenient number if we have to do the calculations without a calculator. But if an approximate conversion is enough, there is a quick way to do it in our head. The trick is to approximate the conversion factor to 2.5, which is half of five, and do the following:
- to convert from inches to cm, we multiply by 5 and then halve: for example, to convert 30″ we calculate 30 × 5 = 150 and halve it, obtaining the approximate value of 75 cm
- to convert from cm to inches, we divide by 5 and then double: for example, to convert 30 cm we calculate 30 / 5 = 6 and double it, obtaining the approximate value of 12″
In the example we converted 30″ into approximately 75 cm, slightly less than the 76.2 cm that actually corresponds to 30″. It is not an exact result, but it is a more than sufficient approximation if we have to estimate the dimensions of a TV by hand.
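The mental-math trick can also be written out as code, keeping the two steps explicit (again, the function names are just for illustration):

```python
# Approximate conversion: treat 2.54 as 2.5, i.e. 5/2

def inches_to_cm_approx(inches: float) -> float:
    """Multiply by 5, then halve (approximates × 2.54)."""
    return (inches * 5) / 2

def cm_to_inches_approx(cm: float) -> float:
    """Divide by 5, then double (approximates / 2.54)."""
    return (cm / 5) * 2

print(inches_to_cm_approx(30))  # → 75.0, versus the exact 76.2 cm
print(cm_to_inches_approx(30))  # → 12.0, versus the exact ~11.81″
```

As the printed values show, the approximation slightly underestimates in one direction and overestimates in the other, but stays close enough for sizing a screen.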
Why inches are used to measure TV screens
But why are inches used all over the world to measure televisions, even though the International System of Units prescribes meters and centimeters for lengths? The reason is essentially historical: the invention of television took place mainly in the 1920s in the USA, with some contributions also from Great Britain, and for inventors such as the Scotsman John Logie Baird and the American Philo Farnsworth it was natural to use the inch as a unit of measurement. Since the creation and marketing of the first cathode ray tube TVs, the custom of measuring screens in inches took hold and has remained in force over the decades, extending to other electronic devices equipped with a screen, such as PCs and smartphones.
This custom is not very convenient for those of us used to thinking in meters and centimeters, but, as we have seen, we can always resort to a quick approximate conversion if we need to estimate the dimensions of a TV by hand.
