I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.
Feedback is very much welcome. Thank you.
Pushing 30 years in this industry myself, and I can confirm that literally not a single person I’ve worked with has ever used the **bi** terms (kibibyte, mebibyte, and so on). I also recall the switch, when drive manufacturers went from 1024-based to 1000-based sizes. I recall the feeble attempt by shill tech writers to claim it better represented the raw number of bits, since the format parameters applied to a drive change the space available for files anyway. I recall exactly zero people buying that excuse.
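For anyone who never actually ran the numbers, here is a quick back-of-the-envelope sketch of the gap that switch created (plain Python, my own illustration, not from the post): a drive sold as 1 TB holds 10^12 bytes, but an OS that divides by 1024³ reports roughly 931, often labeled plain "GB".

```python
# Why a drive sold as "1 TB" shows up as ~931 "GB" in the OS:
# the box uses decimal units (10^12 bytes), the OS divides by 1024^3.
decimal_bytes = 1 * 10**12                 # 1 TB as printed on the box
reported_gib = decimal_bytes / 1024**3     # what a binary-units OS displays
expected_gb = decimal_bytes / 1000**3      # what the buyer expects to see

print(f"On the box:   {expected_gb:.0f} GB")        # 1000 GB
print(f"OS reports:   {reported_gib:.2f} GiB")      # ~931.32
print(f"Apparent gap: {100 * (1 - reported_gib / expected_gb):.1f}%")  # ~6.9%
```

Note that the ~7% gap above is pure unit arithmetic; formatting overhead from the filesystem is a separate effect on top of it, which is part of why the excuse never added up.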