I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It's about a 20-minute read, so thank you very much in advance if you find the time to read it.
Unlike many commenters here, I enjoyed reading the article, especially the parts in the "I don’t want to use gibibyte!" chapter, where you explain that this (the pedantry) is important in technical and formal situations (such as documentation). Judging by some of the comments here, I think it would have helped to focus on this aspect a bit more.
I also liked the extra part explaining the reasoning for using the Nokia E60.
I don't quite agree with the recommendation to use base-10 SI units where neither KiB nor kB results in nice numbers. I don't see why base 10 should have any influence on computers, and I think it makes more sense to stick to a single unit, such as KiB.
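To make the "nice numbers" point concrete, here's a small sketch of my own (the 512 MiB figure is just an example, not from the article): the same byte count comes out clean in binary units and ugly in SI units.

```python
# Example of my own: one byte count, expressed in SI (base-10)
# and binary (base-2) megabyte-scale units.
size = 512 * 1024 * 1024  # 512 MiB of RAM, a typical power-of-two size

print(size / 1000**2)  # SI megabytes: 536.870912 -- not a "nice" number
print(size / 1024**2)  # mebibytes:    512.0      -- clean
```

This is exactly why I'd rather stay in one unit system than switch to whichever base happens to give round numbers.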
The reasons I hold this opinion probably come down to the following:
My computer has shown me values in KiB, GiB, etc. for years - I think it's a KDE default - so I'm already used to KiB being different from kB.
I dislike base 10 in general. I like the idea of using base 16 universally (because computers; base 12 would also be a reasonable choice in a less computer-dominated society). By that logic, 1024 is itself a silly number, and we should measure memory in multiples of 2^8 or 2^16...
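As a toy illustration of what I mean (my own convention, not anything standardized): measure memory in units of 2^16 and print sizes in hexadecimal, where powers of two are always round numbers.

```python
# Toy convention of my own: memory measured in 2^16-byte units,
# with sizes printed in base 16 rather than base 10.
UNIT = 2**16            # 65536 bytes per unit

size = 8 * 1024 * 1024  # 8 MiB, as an example
print(size // UNIT)     # 128 units of 2^16
print(hex(size))        # 0x800000 -- a round number in base 16
```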
P.S.: I agree with other commenters that your replies starting with "Pretty obvious that you didn’t read the article." or similar are probably not helping your case... though I understand that some comments here have been quite frustrating.