To start this topic, we should know the basic definition of a sensor: a sensor is a device that converts a real-world quantity, such as temperature or light, into an electrical signal. These signals can be either analog or digital. A signal is an electrical quantity, such as a voltage, that changes over time.
Just for teaching purposes, we can assume we are using a 5-volt supply. This is an arbitrary value with nothing special about it; it simply means the voltage cannot go higher than 5 V. An analog signal can take any value between 0 and 5, so 0.00000002, 4.998, 3.51, 1.56, and 0.5 are all valid values. If we are reading a temperature, it is not exactly 32 degrees but in reality 32-point-something. So analog signals are precise, but unfortunately computers cannot process arbitrary values, only discrete values, called digital signals.
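To make the idea concrete, here is a small sketch of how an analog-to-digital converter (ADC) maps a continuous voltage onto discrete steps. The 5 V reference matches the example above; the 10-bit resolution (1024 levels) is an assumption chosen for illustration, since real ADCs vary.

```python
V_REF = 5.0               # reference voltage from the example above
RESOLUTION = 10           # assumed ADC resolution in bits
LEVELS = 2 ** RESOLUTION  # 1024 discrete steps

def adc_read(voltage):
    """Map an analog voltage (0..V_REF) to a discrete count (0..1023)."""
    voltage = min(max(voltage, 0.0), V_REF)   # clamp to the valid range
    return min(int(voltage / V_REF * LEVELS), LEVELS - 1)

# Any analog value, no matter how precise, lands on one of 1024 steps:
for v in (0.00000002, 1.56, 3.51, 4.998):
    print(v, "->", adc_read(v))
```

Notice that 0.00000002 and 0.0 produce the same count: the tiny difference is lost in quantization, which is exactly the trade-off between analog precision and digital representation described above.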
A graph of an analog signal (black) versus a digital signal (yellow) is attached.