If it helps to see things a bit more visually, I made this quick infographic.
This shows four hypothetical global variables (in an 8-bit system, like SCI0) with their values in them. The Decimal, Hexadecimal, and Binary columns each show the exact same value; it's just how that value is represented in each of those number formats.
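If you want to see that outside the infographic, here's a throwaway C snippet (not SCI code, just an illustration) that prints one 8-bit value in all three formats:

```c
#include <stdio.h>

int main(void)
{
    unsigned char value = 0xB4;   /* one hypothetical 8-bit global: 180 in decimal */

    /* Decimal and hexadecimal straight from printf */
    printf("Decimal: %u\n", value);
    printf("Hex:     %02X\n", value);

    /* Binary printed bit by bit, highest bit first */
    printf("Binary:  ");
    for (int bit = 7; bit >= 0; bit--)
        printf("%d", (value >> bit) & 1);
    printf("\n");

    return 0;
}
```

Same number every time (180, B4, 10110100), just written three different ways.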
I think it's very easy to get the concept of a single variable. At its simplest, you give it a name and you can put a number in it that you can refer to later, change, or whatever.
The concept of a "flag" in SCI, or in programming in general, is just that: a concept. It has no special construct in and of itself. It just means a variable that is only ever binary. On/off, yes/no (or more commonly true/false, i.e. 1/0). Kinda like a mailbox flag... it's either up or it's not.
You could use a full variable for your 1/0 value, but because each variable is really made up of a bunch of bits, and memory space is oh so very precious, why not use each of those bits individually? That gives you 8 times as many yes/no variables (on an 8-bit system like SCI0, or 16 times as many yes/no's on a 16-bit system like SCI1).
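To make that concrete (this is plain C, not SCI script, and the flag names are just ones I made up for the example), here's one 8-bit variable doing the work of eight separate yes/no variables:

```c
#include <stdio.h>

/* One 8-bit "global" doing the job of eight separate yes/no variables.
   Each bit position is one flag; these names are invented for the example. */
#define FLAG_TALKED_TO_KING   (1 << 0)
#define FLAG_HAS_KEY          (1 << 1)
#define FLAG_DOOR_UNLOCKED    (1 << 2)
#define FLAG_SAW_CUTSCENE     (1 << 3)
/* ...bits 4 through 7 are still free for four more flags... */

int main(void)
{
    unsigned char gameFlags = 0;              /* all eight flags start "off" */

    gameFlags |= FLAG_HAS_KEY;                /* raise one flag */
    gameFlags |= FLAG_DOOR_UNLOCKED;          /* raise another */

    if (gameFlags & FLAG_HAS_KEY)             /* test a flag */
        printf("We have the key.\n");

    gameFlags &= ~FLAG_DOOR_UNLOCKED;         /* lower a flag again */

    printf("gameFlags is now 0x%02X\n", gameFlags);  /* prints 0x02 */
    return 0;
}
```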
The bitwise functions BTst, BSet, and BClr do all the legwork of figuring out which specific bit in which specific variable to manipulate, but essentially all they're doing is reading or changing one small piece of one variable out of a handful of variables set aside for the flags.
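I don't want to pass this off as the actual template-game code, but the math those procedures do boils down to something like this C sketch (assuming 8-bit variables; the array and function names here are mine, not SCI's):

```c
#include <stdio.h>

/* A bank of 8-bit "globals" set aside for flags: 10 bytes = 80 flags.
   The array name and size are just for this sketch. */
static unsigned char flagVars[10];

/* Which variable, and which bit inside it, a given flag number lands in. */
#define FLAG_VAR(n)  ((n) / 8)          /* 8 flags per variable on an 8-bit system */
#define FLAG_BIT(n)  (1 << ((n) % 8))   /* mask for the bit inside that variable   */

int  FlagTest(int n)  { return (flagVars[FLAG_VAR(n)] & FLAG_BIT(n)) != 0; }
void FlagSet(int n)   { flagVars[FLAG_VAR(n)] |=  FLAG_BIT(n); }
void FlagClear(int n) { flagVars[FLAG_VAR(n)] &= ~FLAG_BIT(n); }

int main(void)
{
    FlagSet(19);                                    /* flag 19: variable 2, bit 3 */
    printf("flag 19: %d\n", FlagTest(19));          /* 1 */
    printf("flagVars[2] = 0x%02X\n", flagVars[2]);  /* 0x08 */

    FlagClear(19);
    printf("flag 19: %d\n", FlagTest(19));          /* 0 */
    return 0;
}
```

Flag 19, for example, lands in the third variable (index 2) at bit 3. That's exactly the kind of bookkeeping you'd never want to redo by hand every time, which is why those helper procedures exist.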