You can verify this yourself by experimenting with some image files on a PC using an image-editing tool.
Take a picture with a digital camera set to capture in TIF format at maximum resolution, and copy it to a PC. A TIF file is an uncompressed bitmap image. Keeping the same resolution, save one copy as a 20% compressed JPG and another as a 90% compressed JPG. The former will have a larger file size than the latter, meaning it holds more bits of information - the still-image equivalent of bitrate in video. Bitrate is simply the number of bits of information flowing to your display per second.
Similarly, our beloved DVD uses compressed video - roughly 24 to 30 frames per second, each frame compressed much like your JPG files. The more bits of information per frame, the more detail and color information each image in the MPEG-2 video stream retains.
Just FYI:
Video at about 1.5 Mbps is VCD quality; 4-6 Mbps is DVD quality (at 480i); 7-9 Mbps is Superbit DVD quality; 15-18 Mbps is HDTV broadcast quality (typically at 720p); 35-45+ Mbps is HD DVD and Blu-ray quality (at 1080p).
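A quick sanity check on the DVD figure: divide a single-layer disc's capacity (in bits) by a typical runtime and you get an average bitrate right inside the 4-6 Mbps range above. This is a back-of-the-envelope sketch - real discs also carry audio tracks and overhead, and the 2-hour runtime is just an assumed example:

```python
# Back-of-the-envelope check: average bitrate if a 2-hour movie
# fills an entire single-layer DVD (ignores audio and overhead).

DISC_BYTES = 4.7e9        # single-layer DVD capacity, ~4.7 GB
RUNTIME_S = 2 * 60 * 60   # assumed 2-hour runtime, in seconds

bits = DISC_BYTES * 8               # disc capacity in bits
avg_bitrate = bits / RUNTIME_S      # bits per second
print(f"{avg_bitrate / 1e6:.1f} Mbps")  # -> 5.2 Mbps, inside the 4-6 Mbps DVD range
```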
That's proof positive that, in general, the higher the bitrate, the better the quality. The same goes for audio.
But the efficiency of the compression algorithm also determines the bitrate needed.
On Blu-ray, some titles go as high as 56 Mbps, because in their early releases BD studios used the older MPEG-2 compression from standard DVD. HD DVD titles can go even lower than 30 Mbps because they use the more efficient VC-1 compression, with the same gorgeous HD quality.
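To see why codec efficiency matters so much, multiply bitrate by runtime to get the storage a title eats up. A rough sketch (the 2-hour runtime is an assumed example, and audio and overhead are ignored):

```python
# Rough storage cost of a 2-hour film at two video bitrates,
# ignoring audio tracks and container overhead.

RUNTIME_S = 2 * 60 * 60  # assumed 2-hour runtime, in seconds

def gigabytes_needed(bitrate_mbps: float) -> float:
    """Storage in GB for a stream at the given average bitrate."""
    return bitrate_mbps * 1e6 * RUNTIME_S / 8 / 1e9

print(f"MPEG-2 at 56 Mbps: {gigabytes_needed(56):.1f} GB")  # -> 50.4 GB
print(f"VC-1 at 30 Mbps:   {gigabytes_needed(30):.1f} GB")  # -> 27.0 GB
```

At the same perceived quality, the more efficient codec nearly halves the space the video track needs on the disc.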