Hi, bro. Let me try to explain the significance of bits in the digital realm.
"Bits" is a concatenation of 2 words--Binary Digit. Meaning, unlike the usual decimal counting system that consists of ten numbers (0 to 9), the binary counting system uses only 2 numbers--0 and 1.
If I have, for example, a 1-bit converter, I can only convert an analog signal into 2 different digital values: 0 and 1. It's like having a simple switch that is either "on" or "off". For representing video, that wouldn't be very useful in this day and age.
If, on the other hand, I have a 2-bit converter, I can now represent 4 different analog values digitally: 00, 01, 10, 11. That's twice the values of a 1-bit system. With a 3-bit converter, I can represent 8 unique values: 000, 001, 010, 011, 100, 101, 110, 111. Again, that's twice the values of the 2-bit system and 4 times the values of the 1-bit system.
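To make the doubling concrete, here's a tiny Python sketch (plain Python, no libraries; the helper name list_patterns is just mine) that prints every pattern an n-bit converter can output:

```python
def list_patterns(n_bits):
    # Every integer from 0 to 2**n_bits - 1, written out in binary,
    # padded with leading zeros to n_bits characters.
    patterns = [format(value, f"0{n_bits}b") for value in range(2 ** n_bits)]
    print(f"{n_bits}-bit: {len(patterns)} values ->", ", ".join(patterns))

for n_bits in (1, 2, 3):
    list_patterns(n_bits)
```

Run it and you'll see the count double each time you add a bit: 2, then 4, then 8.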
As you may have noticed, the formula for the number of unique values is 2 raised to the power of n (2^n), where n is the number of bits. So technically speaking, the more bits you have, the more analog values you can represent digitally, ergo the better the quality.
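And here's a minimal sketch of what a converter does with that formula, assuming (my assumption, just to keep the numbers simple) that the analog input is normalized to the 0-to-1 range; quantize is a made-up helper, not a real library call:

```python
def quantize(analog, n_bits):
    levels = 2 ** n_bits                           # unique values the converter can output
    return min(int(analog * levels), levels - 1)   # clamp 1.0 into the top level

sample = 0.637  # some analog voltage, normalized to the 0..1 range
for n_bits in (1, 3, 8):
    code = quantize(sample, n_bits)
    # Map the digital code back to the 0..1 range to see how close we got.
    print(f"{n_bits}-bit -> code {code} of {2 ** n_bits}, "
          f"recovers roughly {code / (2 ** n_bits - 1):.3f}")
```

Notice how the recovered value creeps closer to the original 0.637 as the bit count goes up--that's the quality improvement in a nutshell.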
Unfortunately, there's a host of other factors that determine quality, but all else being equal, the more bits, the better.
'Hope this helps.