Nope. I just did the math with a calculator:
18,600,000,000 / ( 1024 * 1024 * 1024) ≃ 17.3
So we can say 18.6 GB = 17.3 GiB
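The arithmetic above can be sketched in a few lines of Python (a minimal sketch; the function names are just illustrative):

```python
# Convert a raw byte count into decimal GB (10^9 bytes)
# and binary GiB (2^30 bytes).
def to_gb(n_bytes):
    return n_bytes / 10**9

def to_gib(n_bytes):
    return n_bytes / 2**30

n = 18_600_000_000
print(f"{to_gb(n):.1f} GB")    # 18.6 GB
print(f"{to_gib(n):.1f} GiB")  # 17.3 GiB
```

Same byte count, two different divisors: that 7% gap between 10^9 and 2^30 is the whole argument.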
That's because those fancy terms "Kibibyte (KiB), Mebibyte (MiB) and Gibibyte (GiB)"
DIDN'T EXIST (not even in IT)
BEFORE THE LATE NINETIES!
So older folks like me keep using what we were taught in the eighties: 1 GB = 1 GIGABYTE = 2^30 Bytes = 1024 * 1024 * 1024 Bytes.
----
The damn technical committees can’t just go around changing the definition of a unit of measurement and expect the whole world to fall in line immediately.
It’s obvious that someone who studied in the ’80s, when a Gigabyte was defined as 2^30 bytes
and that weird/pretentious name “gibibyte” didn’t even exist, and who hasn’t subscribed to the ISO or IEEE fanzine… will keep using the old definition.
ESPECIALLY BECAUSE WINDOWS STILL USES kB, MB, AND GB — NOT “KiB, MiB, GiB” —
to mean 2^10, 2^20, 2^30… JUST AS WAS STANDARD IN IT IN THE ’80s AND EARLY ’90s.
---
Here's a clear historical timeline of the change in the definition of
gigabyte (GB) and the creation of the neologism
gibibyte (GiB).