| Unit system | imperial/US units |
| Symbol | in or ″ (the double prime) |
| 1 in is equal to ... | |
| imperial/US units | 1/36 yd |
| metric (SI) units | 25.4 mm |
The inch (abbreviation: in or ″) is a unit of length in the (British) imperial and United States customary systems of measurement. It is equal to 1⁄36 yard or 1⁄12 of a foot. Derived from the Roman uncia ("twelfth"), the word inch is also sometimes used to translate similar units in other measurement systems, usually understood as deriving from the width of the human thumb. Standards for the exact length of an inch have varied in the past, but since the adoption of the international yard during the 1950s and 1960s it has been based on the metric system and defined as exactly 25.4 mm.
The English word "inch" (Old English: ynce) was an early borrowing from Latin uncia ("one-twelfth; Roman inch; Roman ounce") not present in other Germanic languages. The vowel change from Latin /u/ to Old English /y/ (which became Modern English /ɪ/) is known as umlaut. The consonant change from the Latin /k/ (spelled c) to English /tʃ/ is palatalisation. Both were features of Old English phonology; see Phonological history of Old English § Palatalization and Germanic umlaut § I-mutation in Old English for more information.
In many other European languages, the word for "inch" is the same as or derived from the word for "thumb", as a man's thumb is about an inch wide (and this was even sometimes used to define the inch). Examples include Afrikaans: duim; Catalan: polzada ("inch") and polze ("thumb"); Czech: palec ("thumb"); Danish and Norwegian: tomme ("inch") and tommel ("thumb"); Dutch: duim; French: pouce; Hungarian: hüvelyk; Italian: pollice; Portuguese: polegada ("inch") and polegar ("thumb"); Slovak: palec ("thumb"); Spanish: pulgada ("inch") and pulgar ("thumb"); Swedish: tum ("inch") and tumme ("thumb"); and Russian: дюйм (dyuym, borrowed from Dutch duim).
The inch is a commonly used customary unit of length in the United States, Canada, and the United Kingdom. It is also used in Japan for electronic parts, especially display screens. In most of continental Europe, the inch is also used informally as a measure for display screens. For the United Kingdom, guidance on public sector use states that, since 1 October 1995, without time limit, the inch (along with the foot) is to be used as a primary unit for road signs and related measurements of distance (with the possible exception of clearance heights and widths) and may continue to be used as a secondary or supplementary indication following a metric measurement for other purposes.
The international standard symbol for inch is in (see ISO 31-1, Annex A), but traditionally the inch is denoted by a double prime, which is often approximated by double quotes, and the foot by a prime, which is often approximated by an apostrophe. For example, three feet two inches can be written as 3′ 2″. (This is akin to how the first and second "cuts" of the hour and degree are likewise indicated by prime and double prime symbols.) Subdivisions of an inch are typically written using dyadic fractions with odd numerators; for example, two and three eighths of an inch would be written as 2 3/8″ and not as 2.375″ or as 2 6/16″. However, for engineering purposes fractions are commonly given to three or four decimal places, and have been for many years.
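The dyadic-fraction convention can be checked mechanically; a small sketch using Python's standard `fractions` module (the example value 2.375″ is taken from the text above):

```python
from fractions import Fraction

# 2.375 inches, split into a whole part plus a dyadic fraction.
value = Fraction(2375, 1000)       # 2.375 in
whole, frac = divmod(value, 1)     # whole inches, fractional remainder

# Fraction reduces 6/16 to lowest terms, yielding the odd numerator 3.
assert Fraction(6, 16) == Fraction(3, 8)

print(f"{whole} {frac.numerator}/{frac.denominator}\u2033")  # 2 3/8″
```

Reducing to lowest terms is what forces the numerator to be odd: any even numerator over a power of two can be halved again.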
1 international inch is equal to:
- 10,000 tenths[a]
- 1,000 thou[b] or mil[c]
- 100 points[d] or gries[e]
- 72 PostScript points
- 10,[f][e] 12,[g] 16, or 40[h] lines
- 6 computer picas[i]
- 3 barleycorns[j]
- 25.4 millimetres exactly (1 millimetre ≈ 0.03937008 inches)
- 0.999998 US Survey inches
- 1/3 or 0.333 palms
- 1/4 or 0.25 hands[k]
- 1/12 or 0.08333 feet
- 1/36 or 0.02777 yards
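Since every entry in the list above is a fixed multiple of the exact 25.4 mm definition, the table can be generated directly; a minimal sketch in Python (the selection of units and their dictionary names are mine):

```python
# Conversions derived from the exact definition 1 in = 25.4 mm.
MM_PER_INCH = 25.4

def inches_to(unit: str, inches: float = 1.0) -> float:
    """Convert a length in inches to one of the units listed above."""
    factors = {
        "tenth": 10_000,        # 1 in = 10,000 tenths
        "thou": 1_000,          # also called mil
        "postscript_point": 72,
        "pica": 6,              # computer pica
        "mm": MM_PER_INCH,
        "foot": 1 / 12,
        "yard": 1 / 36,
    }
    return inches * factors[unit]

print(inches_to("mm"))       # 25.4
print(inches_to("thou", 2))  # 2000
```

The foot and yard entries use floating-point reciprocals, so round-trips are accurate only to machine precision rather than exact.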
The earliest known reference to the inch in England is from the Laws of Æthelberht dating to the early 7th century, surviving in a single manuscript, the Textus Roffensis from 1120. Paragraph LXVII sets out the fine for wounds of various depths: one inch, one shilling, two inches, two shillings, etc.[l]
An Anglo-Saxon unit of length was the barleycorn. After 1066, 1 inch was equal to 3 barleycorns, which continued to be its legal definition for several centuries, with the barleycorn being the base unit. One of the earliest such definitions is that of 1324, where the legal definition of the inch was set out in a statute of Edward II of England, defining it as "three grains of barley, dry and round, placed end to end, lengthwise".
Similar definitions are recorded in both English and Welsh medieval law tracts. One, dating from the first half of the 10th century, is contained in the Laws of Hywel Dda which superseded those of Dyfnwal, an even earlier definition of the inch in Wales. Both definitions, as recorded in Ancient Laws and Institutes of Wales (vol i., pp. 184, 187, 189), are that "three lengths of a barleycorn is the inch".
King David I of Scotland in his Assize of Weights and Measures (c. 1150) is said to have defined the Scottish inch as the width of an average man's thumb at the base of the nail, even including the requirement to calculate the average of a small, a medium, and a large man's measures. However, the oldest surviving manuscripts date from the early 14th century and appear to have been altered with the inclusion of newer material.
In 1814, Charles Butler, a mathematics teacher at Cheam School, recorded the old legal definition of the inch to be "three grains of sound ripe barley being taken out the middle of the ear, well dried, and laid end to end in a row", and placed the barleycorn, not the inch, as the base unit of the English Long Measure system, from which all other units were derived. John Bouvier similarly recorded in his 1843 law dictionary that the barleycorn was the fundamental measure. Butler observed, however, that "[a]s the length of the barley-corn cannot be fixed, so the inch according to this method will be uncertain", noting that a standard inch measure was now (by his time) kept in the Exchequer chamber, Guildhall, and that was the legal definition of the inch. This was a point also made by George Long in his 1842 Penny Cyclopædia, observing that standard measures had since surpassed the barleycorn definition of the inch, and that to recover the inch measure from its original definition, in the event that the standard measure were destroyed, would involve the measurement of large numbers of barleycorns and taking their average lengths. He noted that this process would not perfectly recover the standard, since it might introduce errors of anywhere between one hundredth and one tenth of an inch in the definition of a yard.
Before the adoption of the international yard and pound, various definitions were in use. In the United Kingdom and most countries of the British Commonwealth, the inch was defined in terms of the Imperial Standard Yard. The United States adopted the conversion factor 1 metre = 39.37 inches by an act of Congress in 1866. In 1893, the Mendenhall Order based the physical realization of the inch on international prototype metres No. 21 and No. 27, which had been received from the CGPM, together with the previously adopted conversion factor.
In 1930, the British Standards Institution adopted an inch of exactly 25.4 mm. The American Standards Association followed suit in 1933. By 1935, industry in 16 countries had adopted the "industrial inch" as it came to be known.
In 1946, the Commonwealth Science Congress recommended a yard of exactly 0.9144 metres for adoption throughout the British Commonwealth. This was adopted by Canada in 1951; the United States on 1 July 1959; Australia in 1961, effective 1 January 1964; and the United Kingdom in 1963, effective on 1 January 1964. The new standards gave an inch of exactly 25.4 mm, 1.7 millionths of an inch longer than the old imperial inch and 2 millionths of an inch shorter than the old US inch.
US Survey inches
The United States retains the 1 metre = 39.37 inches definition for survey purposes, producing a two parts per million difference between the standard and US survey inches. This is approximately 1/8 inch per mile. In fact, 12.7 kilometres is exactly 500,000 standard inches and exactly 499,999 survey inches. This difference is significant when doing calculations in State Plane Coordinate Systems with coordinate values in the hundreds of thousands or millions of feet.
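The two definitions and the 12.7 km coincidence can be verified with exact rational arithmetic; a quick sketch using Python's `fractions` module:

```python
from fractions import Fraction

# Standard (international) inch: exactly 25.4 mm.
STD_INCH_M = Fraction(254, 10_000)     # metres per inch
# US survey inch: derived from the exact relation 1 m = 39.37 in.
SURVEY_INCH_M = Fraction(100, 3937)    # metres per inch

distance_m = Fraction(12_700)          # 12.7 km in metres
assert distance_m / STD_INCH_M == 500_000      # standard inches
assert distance_m / SURVEY_INCH_M == 499_999   # survey inches

# Relative difference: the survey inch is about 2 ppm longer.
ppm = (SURVEY_INCH_M / STD_INCH_M - 1) * 1_000_000
print(float(ppm))  # about 2 ppm
```

Because both definitions are exact ratios, `Fraction` gives the 500,000 vs 499,999 counts with no rounding error, which floating point would not guarantee at this magnitude.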
Before the adoption of the metric system, several European countries had customary units whose name translates into "inch". The French pouce measured 27.0 mm, at least when applied to describe the calibre of artillery pieces. The Amsterdam foot (voet) consisted of 11 Amsterdam inches (duim). The Amsterdam foot is about 8% shorter than an English foot.
The now obsolete Scottish inch (Scottish Gaelic: òirleach), 1/12 of a Scottish foot, was about 1.0016 imperial inches (about 25.4406 mm). It was used in the popular expression Gie 'im an inch, an he'll tak an ell, in English "Give him an inch and he'll take an ell", first published as "For when I gave you an inch, you tooke an ell" by John Heywood in 1546. (The ell, equal to 37 inches (about 940 mm), was in use in England until 1685.) Modern versions of the saying include "Give him an inch and he'll take a mile" and "Give him an inch and he'll take a yard".
- This page is based on the Wikipedia article Inch; it is used under the Creative Commons Attribution-ShareAlike 3.0 Unported License (CC-BY-SA). You may redistribute it, verbatim or modified, providing that you comply with the terms of the CC-BY-SA.