Circuits draw current. Power supplies do not "push" current. A power source (wall wart, battery, whatever) will only be able to supply a certain amount of current before overloading, so circuits that draw more than the rated current will see problems.
For wall warts, you may have a fuse or PTC blow if you draw too much.
For batteries, you will see the voltage sag, and the battery may overheat.
For well-designed power supplies, you will see the voltage drop until the circuit only draws the design-limited current (this may be very close to 0 Volts for a short.)
Separately, voltage measures how hard/hot you're driving a circuit. While a specific circuit will likely be designed for a specific voltage, smaller variations will often work fine. Larger variations may either cause lower/higher than expected performance (motors, heaters, etc.) or destruction of components not designed for the voltage range (capacitors, etc.) Also, a higher voltage will cause a resistive circuit to draw more current. A 100 Ohm circuit at 3 Volts will draw (3/100) == 30 mA. At 6 Volts, it will draw (6/100) == 60 mA. Because power scales with the square of the current (and, for a fixed resistance, the square of the voltage,) the 6 Volt version will dissipate 4x the power, and likely overheat whatever was designed for 3 Volts.
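To make the square-law scaling concrete, here's the 100 Ohm example worked out in a few lines of Python (purely illustrative values):

```python
# Current and power in a fixed 100 Ohm resistive load at two voltages.
R = 100.0  # Ohms

def load(volts, ohms):
    """Return (current in A, power in W) for a resistive load."""
    i = volts / ohms   # I = U / R
    p = volts * i      # P = U * I  (equivalently I^2 * R, or U^2 / R)
    return i, p

i3, p3 = load(3.0, R)  # 0.03 A (30 mA), 0.09 W
i6, p6 = load(6.0, R)  # 0.06 A (60 mA), 0.36 W
print(p6 / p3)         # doubling the voltage quadruples the power
```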
So, battery runtime. If the battery provides the appropriate voltage for a circuit, then larger capacity (Ah or Wh) will mean longer run times, all else being equal. Some batteries also have significant self-discharge, which may need to be factored in if you're talking about weeks or months of runtime.
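As a back-of-the-envelope check, runtime is roughly capacity divided by average draw (the battery and load figures here are made up for illustration):

```python
# Rough runtime estimate: capacity divided by average current draw.
capacity_mah = 2000.0  # hypothetical battery capacity, mAh
draw_ma = 50.0         # hypothetical average circuit current, mA

hours = capacity_mah / draw_ma  # ignores voltage sag and self-discharge
print(hours)
```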
If the power source provides a voltage that is not suitable for the circuit, then you need to regulate the voltage. For a 12V battery driving a 1.8V LED, you will need to drop at least 10V. The easiest way to do this is to look up the current usage of the LED (100 mA, say) and then calculate the appropriate resistance to drop 10V at 100 mA (U = I * R means R = U / I means R = 10V / 100 mA == 100 Ohms.) Now, 10V times 100 mA equals 1 Watt of power, so you will need a fatter than normal resistor for this to not burn up. Normal resistors are usually rated for 1/4W, or even 1/8W for smaller components.
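The same dropper-resistor math as a small script, using the exact 10.2 V drop rather than the rounded 10 V:

```python
# Sizing the dropper resistor for the 12 V battery / 1.8 V LED example.
v_supply = 12.0  # battery voltage, V
v_led = 1.8      # LED forward voltage, V
i_led = 0.100    # LED current, A (100 mA)

v_drop = v_supply - v_led  # 10.2 V across the resistor (~10 V)
r = v_drop / i_led         # R = U / I  -> 102 Ohms, ~100 Ohms nominal
p = v_drop * i_led         # ~1 W burned in the resistor
print(r, p)
```

In practice you'd pick the nearest standard resistor value, and a common rule of thumb is to run a resistor at no more than about half its power rating, so you'd want a 2 W or 3 W part here.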
If the voltage source will vary, or the load will vary, the resistor will need to vary based on the input voltage. If a battery starts out at 14V, and drops to 11V over discharge, you will need to drop between 12V and 9V. The easiest way to do this is with a linear voltage regulator. The voltage regulator will keep a set output voltage, and adjust its resistance to "drop" the rest of the voltage it sees, based on the current drawn. An adjustable linear regulator like a LM350 can easily be set to output 1.8V over a wide range of input voltages.
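The heat a linear regulator has to shed is just the dropped voltage times the load current. A minimal sketch, assuming a 100 mA load across the battery's discharge range:

```python
# Power dissipated in a linear regulator: P = (Vin - Vout) * Iload.
v_out = 1.8     # regulated output, V
i_load = 0.100  # load current, A (assumed)

for v_in in (14.0, 12.0, 11.0):
    p_reg = (v_in - v_out) * i_load
    print(f"Vin = {v_in:4.1f} V -> regulator dissipates {p_reg:.2f} W")
```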
However, the resistor and linear regulator routes are very inefficient. The fact that you get a lot of heat should tell you that! For a 1.8V LED running from a 14V battery, about 87% of the energy will be wasted heating up your resistor/regulator, meaning you have a 13% efficiency (1.8V / 14V). That's not good for runtime.
Thus, for a case like this, you typically want a switching buck regulator (DC-DC converter.) These circuits use inductors, capacitors, and switches to generate one voltage from another, with much less loss -- 90% is typical, and > 98% is achievable in some sophisticated designs. So, for the best runtime, you want a battery with a high charge capacity (Wh or Ah) and a voltage regulator with high efficiency.
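To see what that means for runtime, compare a linear regulator against a 90%-efficient buck for the same LED load (the battery capacity is an assumed figure):

```python
# Runtime from a 14 V battery: linear regulator vs. switching buck.
capacity_wh = 28.0       # hypothetical 2 Ah battery at 14 V
p_load = 1.8 * 0.100     # 1.8 V LED at 100 mA = 0.18 W delivered

eff_linear = 1.8 / 14.0  # linear efficiency is just Vout/Vin, ~13%
eff_buck = 0.90          # typical for a switching regulator

for name, eff in (("linear", eff_linear), ("buck", eff_buck)):
    p_battery = p_load / eff  # what the battery actually supplies
    print(f"{name}: {capacity_wh / p_battery:.0f} h")
```

Same battery, same LED: the buck version runs about seven times longer (roughly 140 h vs. 20 h).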