
ZenoMod

Well-Known Member
Nov 12, 2022
1,690
2,130
386
Does anyone know why the 0.11.1 file on gofile weighs more than on mega? It's 18.6 GB on gofile and 17.35 GB on mega, so we're talking about a 1.3 GB difference. Is the mega one a bit more compressed, or what?

I'm not sure, but probably because one considers 1GB = 10^9 bytes, while the other considers 1GB = 2^30 bytes.

When I was in high school, in the '80s, there was no confusion at all: multiples for bytes were defined by multiplying N times by 1024 (2^10), and not by 1000 (10^3) like for all the other units of measure.

We were positive: 1 km = 1000 m, 1 kg = 1000 g, etc., BUT 1 kB = 1024 B.

Then the fucking HDD vendors started selling the first 4 or 16 GB drives with a small asterisk saying "1 GB = 1,000,000,000 Bytes".

It all started as questionable commercial advertising, and through the power of money it led to the redefinition of 1 GB as indeed 10^9 Bytes, while for the "traditional gigabyte" they invented the "GiB" abbreviation (with a fucking lowercase i following the G, or the M for mega, the T for tera, etc.).

So youngsters nowadays say 1GB = 10^9 Bytes, 1GiB = 2^30 Bytes.
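Just to make the two conventions concrete, here's a minimal Python sketch; it assumes gofile's "18.6 GB" means exactly 18.6 * 10^9 bytes, which it probably isn't down to the last byte:

Python:
# Minimal sketch: the same byte count expressed with the decimal prefix (GB)
# and the binary prefix (GiB).

def to_gb(num_bytes: int) -> float:
    """Decimal gigabytes: 1 GB = 10**9 bytes."""
    return num_bytes / 10**9

def to_gib(num_bytes: int) -> float:
    """Binary gibibytes: 1 GiB = 2**30 bytes."""
    return num_bytes / 2**30

size = 18_600_000_000  # assumed exact size of the gofile download
print(f"{to_gb(size):.2f} GB")    # 18.60 GB  (what gofile shows)
print(f"{to_gib(size):.2f} GiB")  # ~17.32 GiB (roughly what mega shows as "17.35 GB")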
 

icewolf81

Member
Jan 24, 2024
135
93
57
You could be correct, but 1.3 GB is a big difference for that.

The IT terms are kibibyte (KiB), mebibyte (MiB), and gibibyte (GiB) for the binary (power-of-two) sizes.

But outside of IT, most people usually (not quite correctly) use KB, MB, and GB.
 
  • Wow
Reactions: legend_shon

ZenoMod

Well-Known Member
Nov 12, 2022
1,690
2,130
386
You could be correct, but 1.3 GB is a big difference for that.
Nope. I just did the math with a calculator:

18,600,000,000 / ( 1024 * 1024 * 1024) ≃ 17.3

So we can say 18.6 GB = 17.3 GiB
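And in the other direction (a quick sketch; mega's displayed 17.35 is assumed to be a rounded GiB value):

Python:
# Cross-check: treat mega's displayed 17.35 "GB" as 17.35 GiB and convert it
# back to decimal gigabytes; it should land near gofile's 18.6 GB figure.
mega_gib = 17.35                          # rounded value shown by mega (assumption)
approx_bytes = mega_gib * 2**30           # ~18,629,000,000 bytes
print(f"{approx_bytes / 10**9:.2f} GB")   # ~18.63 GB, close to gofile's 18.6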

The IT terms are kibibyte (KiB), mebibyte (MiB), and gibibyte (GiB) for the binary (power-of-two) sizes.

But outside of IT, most people usually (not quite correctly) use KB, MB, and GB.
That's because those fancy terms "kibibyte (KiB), mebibyte (MiB) and gibibyte (GiB)" DIDN'T EXIST (not even in IT) BEFORE THE LATE NINETIES! :HideThePain:

So older folks like me keep using what we were taught in the eighties: 1 GB = 1 GIGABYTE = 2^30 Bytes = 1024 * 1024 * 1024 Bytes.

----

The damn technical committees can’t just go around changing the definition of a unit of measurement and expect the whole world to fall in line immediately.

It’s obvious that someone who studied in the ’80s, when a Gigabyte was defined as 2^30 bytes and that weird/pretentious name “gibibyte” didn’t even exist, and who hasn’t subscribed to the ISO or IEEE fanzine… will keep using the old definition.

ESPECIALLY BECAUSE WINDOWS STILL USES kB, MB, AND GB — NOT “KiB, MiB, GiB” — to mean 2^10, 2^20, 2^30… JUST AS WAS STANDARD IN IT IN THE ’80s AND EARLY ’90s.

---

Here's a clear historical timeline of the change in the definition of the gigabyte (GB) and the creation of the neologism gibibyte (GiB):

1757815115531.png
 

Roger-a-Dale

Well-Known Member
May 9, 2024
1,970
2,876
332
There is a joke that is perfect for moments like these.

Ahem.

There are 10 kinds of people in the world. Those who understand binary and those who don't.
 

Eeliejun

Newbie
Jul 19, 2017
84
206
201
I am really late to this party, but I just finished EP11 and did not expect who was sending the letters. They were not on my radar at all. Also, fuck Derek and his finger. Did not expect that at all at the end.
 
  • Like
Reactions: legend_shon