I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.
Feedback is very much welcome. Thank you.
It’s because powers of 2 make more sense to the computer.
This is such a strange post and comment section to me. Computers work because of binary, which is 2-state, which is why it’s powers of 2, which is why we have kibi, mebi, gibi, etc.
Which nobody uses in the industry, because we all know that storage uses base-2 prefixes.
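For concreteness, here is a quick sketch (mine, not from the thread) of how far the two prefix conventions drift apart at drive scale, using the standard values 1 GB = 10^9 bytes and 1 GiB = 2^30 bytes:

```python
# How a decimal-prefixed drive size looks in binary-prefixed units.
marketed_gb = 500                    # drive sold as "500 GB" (decimal prefix)
size_bytes = marketed_gb * 1000**3   # 1 GB = 10**9 bytes
size_gib = size_bytes / 1024**3      # 1 GiB = 2**30 bytes
print(f"{size_gib:.1f} GiB")         # prints 465.7 GiB
```

This ~7% gap at the giga scale is exactly the confusion the binary prefixes were introduced to resolve.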
It’s actually a decimal vs. binary thing. 1000 and 1024 take the same number of bytes to store, but 1024 is 2^10, a round number in binary, so it makes more sense to a computer. Nothing to do with metric, as computers don’t use that. Also not really to do with units.
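A small sketch (my illustration, not from the thread) of why 1024 is the "round" number in binary: 1024 == 2**10 has a single bit set, so dividing by it is just a 10-bit right shift, while 1000 has no such shortcut.

```python
n = 5_000_000

# 1024 = 2**10: one set bit, so division is a bit shift.
kib = n >> 10         # same result as n // 1024
kb = n // 1000        # 1000 needs a real division

print(bin(1024))      # 0b10000000000  (single set bit)
print(bin(1000))      # 0b1111101000   (several set bits)
print(kib, kb)        # 4882 5000
```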