Your question as written cannot be answered directly; you seem to have some misconceptions about how transformers work.
"transformers converts voltages to a higher or lower value but in that process energy or current is lost when converted to a higher voltage, if that is the case is current generated when the voltage is stepped down?"
A transformer's primary current sets up a magnetic field in the core (iron for 50-60 Hz).
As the AC current in the primary rises and falls (passing through the mains zero crossings), the magnetic field in the core changes, and this changing field induces a voltage in the secondary winding, which drives a current through the load.
This is why a transformer does not work with DC: it is the changing magnetic field that transfers energy to the secondary.
The primary-to-secondary turns ratio, together with the primary voltage, sets the secondary voltage.
The ratio can step the output voltage down or up. If the transformer were "IDEAL", all of the input energy would appear at the output. In practice, however, there are losses (winding resistance and eddy currents in the core) that reduce the efficiency below 100% and produce heat. This is true whether the transformer steps up or steps down.
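The relationships above can be sketched in a few lines of Python. The component values (230 V mains, a 20:1 turns ratio, a 50 W load, 95% efficiency) are illustrative assumptions, not figures from the question:

```python
# Ideal-transformer sketch: Vs = Vp * (Ns / Np), and power in == power out.
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Secondary voltage of an ideal transformer from the turns ratio."""
    return v_primary * (n_secondary / n_primary)

vs = secondary_voltage(230.0, 2000, 100)   # 20:1 step-down -> 11.5 V

# Ideal case: stepping voltage down trades voltage for current.
p_out = 50.0                  # watts delivered to the load (assumed)
i_secondary = p_out / vs      # higher current on the low-voltage side
i_primary = p_out / 230.0     # lower current on the high-voltage side

# A real transformer (say 95% efficient) must draw more input power;
# the difference is dissipated as heat in the windings and core.
efficiency = 0.95
p_in_real = p_out / efficiency
```

Note that the same code applies whether the ratio steps up or down; only the `n_primary`/`n_secondary` values change.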
"also if the voltage in a DC current were to increase would that also mean more current is being drawn from the battery(ohms law) and therefore kills the battery faster?also if that were the case couldn't the battery potentially heat up?"
Here you seem to be talking about some kind of battery-powered supply.
As stated above, a transformer does not work with DC. There are, however, DC-DC converters that use a transformer: they include a switching circuit that chops the DC on and off into the primary, creating and collapsing the magnetic field in the core so that energy is transferred.
Let's treat this DC-DC converter as a "black box" whose output voltage we can change.
Let's also assume, to start, that the converter is 100% efficient.
If the load resistance is constant and you increase the applied voltage, the current increases per Ohm's law, and so does the power (P = I * E). If this increased voltage comes from the DC-DC converter above with the same input (battery) voltage, then the power, and therefore the current, drawn from the battery also increases, and the battery is discharged sooner.
Battery time to discharge = battery capacity (Amp-Hr) / current draw (Amps)
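A short sketch of that arithmetic, assuming the 100%-efficient black-box converter from above; the 12 V battery, 10 ohm load, and 2 Ah capacity are made-up illustrative values:

```python
# How raising the converter's output voltage into a fixed load
# increases battery current draw and shortens run time.
def load_current(v_out, r_load):
    return v_out / r_load                        # Ohm's law: I = V / R

def battery_current(v_out, r_load, v_batt, efficiency=1.0):
    p_out = v_out * load_current(v_out, r_load)  # P = I * E at the load
    return p_out / (v_batt * efficiency)         # converter conserves power

def run_time_hours(capacity_ah, i_draw):
    return capacity_ah / i_draw                  # capacity (Ah) / draw (A)

v_batt, r_load, cap_ah = 12.0, 10.0, 2.0

i_low = battery_current(5.0, r_load, v_batt)     # converter output at 5 V
i_high = battery_current(10.0, r_load, v_batt)   # output raised to 10 V

# Doubling the output voltage quadruples output power (V^2 / R),
# so the battery current quadruples and run time drops to a quarter.
```

Lowering the efficiency below 1.0 in `battery_current` only makes the battery drain faster still, since the converter's losses are also drawn from the battery.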
Will the battery heat up? That depends on other factors, such as the current draw versus the battery's size, chemistry, and construction. Any current draw will raise the battery's temperature somewhat; the question is whether it rises above the rated operating temperature.
For this, check the battery's specs and datasheet. There should be a specification for the temperature rise per amp drawn.