That is a prescriptivist way of thinking about language, which is useful if you enjoy feeling righteous about correctness, but not so helpful for understanding how communication actually works. In reality-reality, "kilobyte" may mean either "1000 bytes" or "1024 bytes", depending on who is saying it, whom they are saying it to, and what they are saying it about.
You are free to intend only one meaning in your own communication, but you may sometimes find yourself being misunderstood: that, too, is reality.
It's not even really prescriptivist thinking… using "kilobyte" to mean either 1,000 B or 1,024 B is well-established usage, with the meaning depending on context (the 1,000 B context being mostly HDD manufacturers who want to inflate their drive sizes, and… the abomination that is the "1.44 MB" diskette, whose "MB" is actually 1,000 × 1,024 bytes, mixing both conventions). But a word can be context-dependent, even in prescriptivist settings.
E.g., M-W lists both, with the 1,024 B definition even listed first. Wiktionary lists the 1,024 B definition too, though tagged as "informal".
As a prescriptivist myself, I would love it if the world could standardize on kilo = 1000, kibi = 1024, but that will likely take some time… It would also take the binary prefixes being introduced to the wider public, who I don't think are generally aware of them, and some large companies deciding to use the terms - which they likely won't, since companies are apt to prefer low-grade perpetual confusion over the short-term confusion of a switch.
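For what it's worth, the gap the binary prefixes would resolve isn't constant - it compounds with each prefix step, which is part of why I'd like the distinction made explicit. A quick sketch of the arithmetic (plain Python, nothing assumed beyond the standard prefix values):

    # Decimal (SI) vs. binary (IEC) prefix values, and how far they diverge.
    prefixes = [("kilo/kibi", 10**3,  2**10),
                ("mega/mebi", 10**6,  2**20),
                ("giga/gibi", 10**9,  2**30),
                ("tera/tebi", 10**12, 2**40)]
    for name, dec, binary in prefixes:
        print(f"{name}: {binary / dec - 1:+.1%}")
    # kilo/kibi: +2.4%
    # mega/mebi: +4.9%
    # giga/gibi: +7.4%
    # tera/tebi: +10.0%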
Does anyone, other than HDD manufacturers who want to inflate their drive sizes, actually want a 1000-based kilobyte? What would such a unit be useful for? I suspect that a world which standardized on kibi = 1024 would be a world which abandoned the word "kilobyte" altogether.
> with the context mostly being HDD manufacturers who want to inflate their drive sizes
This is a myth. The first IBM hard drive, in 1956, stored 5,000,000 characters - before "byte" was even in common usage. Drives have always been base 10; it's not a conspiracy.
Drives are base 10, line rates are base 10, clock rates are base 10 - pretty much everything but RAM is base 10. Base 2 is the exception, not the rule.
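That decimal convention is also why a new "1 TB" drive appears to shrink once the OS reports it in binary units - nothing is missing, the units just changed. Quick sanity check (Python, just the arithmetic):

    # A "1 TB" drive (decimal, as sold) expressed in binary units
    # (as many OSes report it).
    size_bytes = 10**12
    print(size_bytes / 2**30)   # ~931.3 (often shown as "931 GB", really GiB)
    print(size_bytes / 2**40)   # ~0.909 TiB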
How can there be both a "usual meaning" and a "correct meaning" when you assert that there is only one meaning and that "there's no possible discussion over this fact"?
You can say that one meaning is more correct than the other, but that doesn't make the other meaning vanish from existence.
When precision is required, you either use kibibytes or define your kilobytes explicitly. Otherwise there is a real risk that the other party does not share your understanding of what a kilobyte should mean in that context. Then the numbers you use have at most one significant figure.
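In code, "define your kilobytes explicitly" can be as simple as naming the constants instead of sprinkling magic numbers. A minimal sketch (Python; the constant names are my own, not from any standard library):

    # Be explicit about which convention a figure uses; don't make the
    # reader guess.
    KB = 10**3    # decimal "kilobyte" (SI)
    KiB = 2**10   # binary "kibibyte" (IEC)

    quota = 64 * KB          # unambiguous: 64,000 bytes
    buffer_size = 64 * KiB   # unambiguous: 65,536 bytes

    # The two readings of a bare "64 KB" differ by 2.4%:
    print((buffer_size - quota) / quota)   # 0.024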
That's funny. If I used the "correct" meaning when precision was required, I'd be wrong every time I needed to use it. In computers, bytes are almost always measured in base-2 increments.
When dealing with microcontrollers and datasheets, and talking to other designers, precision is required - and there, e.g., 8KB unequivocally and unambiguously means 8192 bytes.
I kid good-naturedly. I'm always horrified at what autocorrect has done to my words after it's too late to edit or un-send them. I swear I write words goodly, for realtime!
The line between "literal" and "colloquial" becomes blurred when a word consisting of strongly-defined parts ("kilo") gets used in official, standardized contexts with a different meaning.
In fact, this is the only case I can think of where that has ever happened.
"colloquial" has no place in official contexts. I'll happily talk about kB and MB without considering the small difference between 1000 and 1024, but on a contract "kilo" will unequivocally mean 1000, unless explicitely defined as 1024 for the sake of that document.
Which is the reality. "kilobyte" means "1000 bytes". There's no possible discussion over this fact.
Many people have been using it wrong for decades, but its literal value did not change.