I recently had an exchange of correspondence with an acquaintance (a former RAF pilot) who tried to explain to me why most of the world of aviation still uses nautical miles and knots rather than kilometres and km/h. The explanation went like this.
“Now navigation. There are still lots of aircraft that are flown around the world that do not have sophisticated navigation aids and pilots need simple ways of mentally calculating navigational requirements. One of the most common is based on the fact that 1 radian (the angle at the centre of a circle that is subtended by an arc equal to the radius) is approximately 60 degrees. The navigational trick is known as the ‘one in sixty rule’.
Very simply put, if a pilot is 1 mile off track after 60 miles then the error is 1 degree, which gives a simple way of calculating the change needed to take out the error. As an example, if the distance to be flown is 120 miles and after 60 miles the pilot identifies that he is 1 mile off track then he needs to turn 2 degrees to make good his destination (one to fly parallel to his track and one to close the destination). Now this will work for any unit of measurement. One banana off after 60 bananas is still an error of 1 degree. The crunch it seems to me is that, as I understand it, the internationally agreed global positioning system is still based on latitude and longitude (all the GPS systems I have dealt with start with a very sophisticated lat/long model of the earth) and the angle subtended by one minute of arc at the earth’s surface on a latitudinal meridian is a nautical mile. Now any maritime chart or aviation chart/map is overprinted with the lat/long grid so it is very easy to see 1 minute of arc and therefore see what 1 nm [sic] looks like irrespective of the scale of the map. It makes using the one in sixty easier. One could do it in kms but I think you would need to know the scale and use a ruler to measure kms, so why make life difficult? As the Meerkats would say – Simple. Incidentally this also explains the dominance of using knots as a measurement of speed.”
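The rule as described can be put into a few lines of code. This is my own illustrative sketch (the function name and interface are mine, not my correspondent’s), and it is unit-agnostic, just as he says:

```python
def correction_deg(off_track, distance_run, distance_to_go):
    """One-in-sixty rule: approximate degrees to turn to cancel the
    track error and converge on the destination. Works in any unit
    of distance, as long as all three arguments use the same one."""
    track_error = 60 * off_track / distance_run       # ~1 degree per unit-in-60
    closing_angle = 60 * off_track / distance_to_go   # extra turn to close the destination
    return track_error + closing_angle

# The example from the letter: 120 miles to fly, 1 mile off after 60 miles,
# so 60 miles still to go.
print(correction_deg(1, 60, 60))  # -> 2.0
```

The two terms correspond exactly to the letter’s “one to fly parallel to his track and one to close the destination”.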
This was my reply:
“Thanks for the explanation, which I think I understand. [Actually, I didn’t fully]
In your example, the deviation from course (1 nautical mile or banana) divided by the distance travelled (60) is in fact the sine of the angle subtended, and taking the inverse sine gives 0.955 degrees – i.e. roughly one degree off course. Presumably, this is good enough for travelling short distances. (I couldn’t see the relevance of radians, but I note that one radian is 57.296 degrees, and this divided by 60 gives 0.955). So far, so good. However, as you say, this relationship is independent of measurement units.
Turning to the latitude and longitude grid, surely this only works in a due north-south direction, as the parallels of latitude are shorter as you approach the poles. Measuring from my Philips world atlas, I calculate that 5 degrees along the equator (i.e. 300 nautical miles) in Brazil represents 554 km, which gives 1846 m per nautical mile (the SI definition of a nautical mile is 1852 m), whereas, measuring horizontally from British OS maps, one minute of longitude at latitude 50 degrees north (The Lizard) represents 1190 m. At 60 degrees north (Shetland) it is 940 m. So even at the scale of the UK, there is a considerable difference – and this ignores the related problem of portraying a curved surface on a flat map! (I also thought of introducing the Heisenberg Uncertainty Principle, but that would be a digression).
I suppose that if you have a pair of dividers to hand you could use the vertical scale of minutes of latitude to set the dividers to measure in nautical miles, but assuming that the map or chart has a scale in km, it would be just as easy to set it to measure in kilometres. Or use a standard scale rule calibrated in metres at the appropriate scale (such as architects and planners have used for 40 years). (The 1939 OS national grid, which I believe is used by the army, is based on kilometre squares).
(Incidentally, I gather from browsing internet sites that GPS can use decimal degrees as an alternative to degrees, minutes, seconds – so it is not necessarily dependent on minutes – hence, nautical miles are not essential to GPS systems. As I understand it).
So my conclusion is that the claimed advantage of using nautical miles is fairly weak. It must be primarily a question of resistance to change and the historical domination of the Americans in the aviation industry. Of course, changing the habits of a lifetime is always inconvenient at first, but I would have thought the long-term advantages of a world-wide system, used and understood by all for all purposes, far outweigh the temporary inconvenience of a small minority having to adjust to change.
Incidentally, as NATO armies work in km, whereas air forces work in nautical miles (and feet for height?), what units do they use when they need to talk to each other about ranges, distances, heights etc – e.g. an OS map shows that a mountain is 782 m high, so what altitude do I need to fly at to clear it? (I seem to remember a Chinook helicopter flying into a hillside on the Mull of Kintyre – attributed to pilot error – could confusion over measurement units have had anything to do with it?).
Anyway, I have gone on too long. Hope this makes some sense.”
There was no reply from my correspondent – so I did a little more research.
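As a check on the figures in my reply, the length of one minute of longitude can be calculated on a simple spherical model of the Earth (again, my own sketch; a real ellipsoidal model such as WGS 84 would differ slightly):

```python
import math

NM = 1852.0  # metres: the SI definition of the nautical mile

def minute_of_longitude_m(lat_deg):
    """Approximate length in metres of one minute of longitude at a given
    latitude, on a sphere where one minute of latitude is exactly 1 nm."""
    return NM * math.cos(math.radians(lat_deg))

for lat in (0, 50, 60):
    print(lat, round(minute_of_longitude_m(lat)))
# equator: 1852 m; 50° N: 1190 m; 60° N: 926 m
```

The 1190 m at 50° N matches what I measured from the OS map at The Lizard; at 60° N the sphere gives 926 m against my measured 940 m, which is about what one would expect from ruler-and-atlas precision.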
It seems that the fundamental problem is that maps are flat, whereas the Earth is (roughly) spherical. So a co-ordinate map grid based on kilometre squares that is suitable for a relatively small land area – say, the UK – does not work when extended to a continent. (I am advised that navigators on ships crossing the Irish Sea have to make minor adjustments when they sail from the British National Grid area into the Irish grid area, as the latter has a different origin). For longer distances, navigators use latitude and longitude as a co-ordinate system in order to determine their position and their course.
However, what I still do not understand is why this should affect the units of distance used. There is no particular logic in dividing the Earth’s circumference into 360 degrees of longitude and then into 21 600 minutes (i.e. 360 x 60), and then using the distance that one minute represents at the equator (and only at the equator) as the basis for a unit of measurement. Wouldn’t it perhaps be more useful to divide the distance from the equator to the poles by a convenient number – say, 10 000 – and then base measurements on that? But, oh, I forgot: that’s exactly what the founders of the original metric system did.
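The arithmetic linking the two units is short enough to spell out (a sketch using the idealised historical definitions, not modern geodetic values):

```python
# The original metre: 1/10 000 000 of the quarter meridian (equator to pole),
# so the full meridian circumference is 40 000 km by construction.
quarter_meridian_m = 10_000_000
circumference_m = 4 * quarter_meridian_m

# The nautical mile: one minute of arc of that same circle,
# i.e. 360 degrees x 60 minutes = 21 600 minutes.
minutes_per_circle = 360 * 60
metres_per_minute = circumference_m / minutes_per_circle
print(metres_per_minute)  # about 1851.85 m, close to the 1852 m SI value
```

In other words, the nautical mile and the metre are both derived from the same arc of the same planet; the only difference is whether one divides it by 21 600 or by a round power of ten.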
In fact, until the Second World War, most aviation outside America and the British Empire actually did use the kilometre for distances (and, consistently, metres for height), and indeed, for domestic aviation, Russia still does. It was only the post-war dominance of the USA in IATA and ICAO (supported of course by the British) that imposed nautical miles on an otherwise metric world.
Or have I missed or misunderstood something? Can anybody help? Above all, is there any hope of getting the situation changed?