I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It's about a 20-minute read, so thank you very much in advance if you find the time to read it.
I suggest considering this from a linguistic perspective rather than a technical perspective.
For years (decades, even), KB, MB, GB, etc. were broadly used to mean 2^10, 2^20, 2^30, etc. Throughout the 80s and 90s, the only place you would likely see base-10 units was in marketing materials, such as those for storage media and modems. Mac OS exclusively used base-2 definitions well into the 21st century. Windows, as noted in the article, still does. Many Unix/POSIX tools do, as well, and this is unlikely to change.
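To put numbers on how far apart the two readings actually drift, here's a quick sketch in plain Python (my own illustration, not something from the article):

```python
# Compare the base-2 and base-10 readings of each prefix.
prefixes = ["K", "M", "G", "T"]

for n, prefix in enumerate(prefixes, start=1):
    base2 = 2 ** (10 * n)   # the "1 KB = 1024 bytes" reading
    base10 = 10 ** (3 * n)  # the "1 kB = 1000 bytes" reading
    gap = (base2 / base10 - 1) * 100
    print(f"1 {prefix}B: {base2} vs {base10} bytes ({gap:.1f}% apart)")
```

The gap starts at a modest 2.4% for kilo but grows with every prefix: by the terabyte the two readings disagree by about 10%, which is roughly the difference between a drive's advertised capacity and what your OS reports.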
I will spare you my full rant on the evils of linguistic prescriptivism. Suffice it to say that I am a born-again descriptivist, fully recovered from my past affliction.
From a descriptivist perspective, the only accurate way to define kilobyte, megabyte, etc. is to say that there are two common usages. This is what you will see if you look up the words in any decent dictionary.
I don't recall ever seeing KiB/MiB/etc. in the 90s, although Wikipedia tells me they "were defined in 1999 by the International Electrotechnical Commission (IEC), in the IEC 60027-2 standard".
While I wholeheartedly agree with the goal of eliminating ambiguity, I am frustrated with the half-measure of introducing unambiguous terms on one side (KiB, MiB, etc.) while failing to do the same on the other. The introduction of new terms has no bearing on the common usage of old terms. The correct thing to have done would have been to introduce two new unambiguous terms, with the goal of retiring KB/MB/etc. from common usage entirely. If we had KiB and KeB, there'd be no ambiguity. KB will always be ambiguous because that's language, baby, regardless of any prescriptivist's opinion on the matter.
Sadly, even that would do nothing to solve the use of common single-letter abbreviations. For example, Linux's ls -l -h command will print sizes like 1K, 1M, 1G, referring to the base-2 definitions. Only if you specify the non-default --si flag will you get base-10 values (again with just the first letter!). Many other standard tools have no such option and will exclusively use base-2 numbers.
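If you want to see what that flag is actually switching, here's a rough Python imitation of the two formatting modes. It's a sketch of the idea, not GNU ls itself; the rounding details won't match ls exactly.

```python
def human(size, si=False):
    """Format a byte count roughly the way GNU ls does:
    powers of 1024 by default, powers of 1000 with --si."""
    base = 1000 if si else 1024
    # GNU coreutils prints a lowercase 'k' in --si mode,
    # and an uppercase 'K' for the 1024-based default.
    units = ["", "k" if si else "K", "M", "G", "T", "P"]
    for unit in units:
        if size < base:
            return f"{size:.1f}{unit}" if unit else f"{size}"
        size /= base
    return f"{size:.1f}E"

n = 1_500_000
print(human(n))           # 1.4M  (base-1024, like ls -l -h)
print(human(n, si=True))  # 1.5M  (base-1000, like ls -l -h --si)
```

The same byte count renders under two different unit systems with the same one-letter suffix, which is exactly the ambiguity being complained about.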
Here's the summary of the Wikipedia article you mentioned in your comment:
In the study of language, description or descriptive linguistics is the work of objectively analyzing and describing how language is actually used (or how it was used in the past) by a speech community. All academic research in linguistics is descriptive; like all other scientific disciplines, it seeks to describe reality, without the bias of preconceived ideas about how it ought to be. Modern descriptive linguistics is based on a structural approach to language, as exemplified in the work of Leonard Bloomfield and others. This type of linguistics utilizes different methods in order to describe a language, such as basic data collection and different types of elicitation methods.