Besides being the first three and last three letters of the alphabet, why do we like abc and xyz so much?
In algebra, both abc and xyz are used, and occasionally pqr as well. Why those specifically? And for vectors, one of the notations is ij. Why is that? In programming, it is a convention to write loops with ijk as the indices. Why not something like cde (for count) or tuv (for tally)? Why index? And in coordinates, we use xyz. Why is that? Why not abc? What about ijk, pqr, etc.? Also, why are y and z flipped in some 3D programs, and why, in some 2D programs, is up negative y rather than positive?
Finally, why do these kinds of triplet letter combinations always use the Latin alphabet? Why aren’t there any conventions that use Cyrillic, Greek, Arabic, or Latin letters with cool hats? Is it because English became the dominant language? That can’t be the only reason, as we still use some Greek symbols for certain things, like pi, phi, alpha (for angles as well as the particle), beta, and gamma (rays). The Greeks figured out things like coordinates, algebra, etc., so why Latin letters? Is it because of the Romans?
A lot of it comes down to convention, and convention is often set by those who did it first or whose work dominated a field. The mathematical notation system we use today is just a convention, and not the only one that exists, but it is the one the world has decided to standardise on…
René Descartes is usually regarded as the originator of the current system. He used abc for constants and xyz for unknown variables, amongst other conventions.
Sequential letter sets are easily recognised, and therefore convenient, and they are generally accepted to have non-specific or less specific meaning. For example:
a² + b² = c²
That formula is a much simpler concept to get your head around using sequential letters than:
V² + G² = z²
Under the common system, using non-sequential letters also implies much more specific meaning for the individual letters, and that can introduce ambiguity and confusion.
When writing a proof there can be many, many statements made, and you would quickly run out of letters if you didn’t have a convention that abc are variables and can be reused.
We also use symbols from other alphabets, and alpha/beta/gamma is a commonly used trio. But mathematical notation now includes a huge range of defined constants and symbols, many of which have been ascribed specific uses (pi, for example). So you risk introducing ambiguity of meaning by moving away from the accepted conventions of current maths and using other sets.
Even e has a specific meaning and can be ambiguous if you need to stretch to five variables. When working with e, it’s not uncommon to use a different string of letters in the Latin alphabet to avoid confusion if you need to use variables.
And we don’t stop at three; abcd, etc., are also used.
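The same sequential-letter habit shows up in code: nested loops conventionally use i, j, k, in order of nesting depth, precisely because the names are instantly recognised as "just counters" with no other meaning. A minimal sketch (the function name `matrix_multiply` is my own illustration, not a standard API):

```python
def matrix_multiply(a, b):
    """Multiply two matrices given as lists of lists, using the
    conventional loop indices i, j, k in order of nesting."""
    rows, inner, cols = len(a), len(b), len(b[0])
    result = [[0] * cols for _ in range(rows)]
    for i in range(rows):           # i: outermost counter (row of a)
        for j in range(cols):       # j: next counter (column of b)
            for k in range(inner):  # k: innermost counter (shared dimension)
                result[i][j] += a[i][k] * b[k][j]
    return result

print(matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# → [[19, 22], [43, 50]]
```

Renaming i, j, k to, say, cde would work identically, but any reader scanning the loops would pause to wonder whether those letters carried some extra significance.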