Started computer science in grade school with only an hour of actual computer time a week. A LOT of theory and history. Charles Babbage, Ada, ENIAC, etc.
This stuff was drilled into our heads. Same with bit, byte and, halfway between bit and byte, a nibble. It's a thing. 4 bits is a nibble.
Funny enough, I couldn't code to save my life now.
Nibbles are still a thing in embedded programming and in ultra low bandwidth comms like LoRa. For example, you can pack two BCD digits into a byte, one in the high nibble and one in the low nibble. The hex representation of the byte then reads directly as the two digits, which is convenient.
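A minimal sketch of that packed-BCD trick in C (the function names are just illustrative):

```c
#include <stdio.h>
#include <stdint.h>

/* Pack two decimal digits into one byte: tens digit in the high
   nibble, ones digit in the low nibble. Input must be 0-99. */
static uint8_t bcd_pack(uint8_t value)
{
    return (uint8_t)(((value / 10) << 4) | (value % 10));
}

static uint8_t bcd_unpack(uint8_t bcd)
{
    return (uint8_t)(((bcd >> 4) * 10) + (bcd & 0x0F));
}

int main(void)
{
    uint8_t packed = bcd_pack(42);
    /* Prints "packed: 0x42" -- the hex form reads as the decimal digits. */
    printf("packed: 0x%02X, unpacked: %u\n", packed, bcd_unpack(packed));
    return 0;
}
```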
Datasheets for sensors will sometimes reference nibbles as well, often for status bits on protocols like Onewire where every bit counts, e.g. the low nibble contains a state value (0-15) and the high nibble contains individual alarm flags.
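Decoding such a status byte is just masking and shifting. This sketch assumes a hypothetical layout (the flag names and bit positions are made up, not from any real datasheet):

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical status byte: low nibble = state value 0-15,
   high nibble = four independent alarm flag bits. */
#define STATE_MASK      0x0Fu
#define ALARM_OVERTEMP  (1u << 4)
#define ALARM_UNDERVOLT (1u << 5)
#define ALARM_CRC_ERR   (1u << 6)
#define ALARM_TIMEOUT   (1u << 7)

int main(void)
{
    uint8_t status = 0x53;   /* pretend this byte came off the bus */

    printf("state: %u\n", status & STATE_MASK);   /* low nibble: 3 */

    if (status & ALARM_OVERTEMP)  printf("over-temperature alarm\n");
    if (status & ALARM_UNDERVOLT) printf("under-voltage alarm\n");
    if (status & ALARM_CRC_ERR)   printf("CRC error alarm\n");
    if (status & ALARM_TIMEOUT)   printf("timeout alarm\n");
    return 0;
}
```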
QBasic came with NIBBLES.BAS, a snake game using text-mode characters as "pixels". Specifically, it faked an 80x50 "pixel" grid on the standard 80x25 text screen, where each 8-bit (= 1 byte) text character represented two stacked monochrome pixels using ▄, ▀, █, or a space.
I assume the name derived from the fact that, in a way, one pixel was "using half a byte", i.e. a nibble.
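The half-block trick is easy to reproduce. Here's a small sketch in C with a made-up 8x4 pixel buffer; it uses the Unicode half blocks, whereas the DOS original would have used the CP437 equivalents (codes 220, 223, 219):

```c
#include <stdio.h>
#include <stdbool.h>

/* Map two vertically stacked "pixels" to one text character. */
static const char *cell(bool top, bool bottom)
{
    if (top && bottom) return "\u2588";   /* full block  */
    if (top)           return "\u2580";   /* upper half  */
    if (bottom)        return "\u2584";   /* lower half  */
    return " ";                           /* both off    */
}

int main(void)
{
    /* Hypothetical 8x4 pixel framebuffer, drawn as 8x2 text cells. */
    bool fb[4][8] = {
        {0,1,1,0,0,1,1,0},
        {1,0,0,1,1,0,0,1},
        {1,1,1,1,1,1,1,1},
        {0,0,1,1,1,1,0,0},
    };
    for (int row = 0; row < 4; row += 2) {   /* two pixel rows per text row */
        for (int col = 0; col < 8; col++)
            fputs(cell(fb[row][col], fb[row + 1][col]), stdout);
        putchar('\n');
    }
    return 0;
}
```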
Such good memories of learning to code as a kid in QBasic, I remember NIBBLES.BAS.
I was totally spoiled, as my dad had the professional paid version, which had an incredible IDE for the time and things like user-defined types and structs that I later found out weren't usually part of BASIC. It also had a ton of fancy graphics modes, double buffering, and even a sprite library. I loved playing around making crappy games.