An air conditioner (AC) generally has an on-off switch. You turn on the AC and it cools. It takes several thousand watts to run.
You turn it on, and the temperature in the room (or house) drops, and keeps dropping until you decide to turn it off.
But since it is not always convenient for you to stand around until the temperature is comfortable, you can let it shut off automatically. You could put a timer on it and say that it should turn off after 3 hours of cooling, for example. Or you could attach a thermometer, so that it shuts off once it reaches a certain temperature.
And that is exactly what most thermostats are. They are basically a switch that turns the AC off once a certain temperature is reached. So the AC consumes just as much power regardless of what temperature it is set to, right up until the thermostat turns it off.
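If it helps to see that on/off behavior spelled out, here's a minimal sketch in Python. The function name and the 1-degree "hysteresis" band are just illustrative assumptions, not how any particular thermostat is built:

```python
def next_ac_state(room_temp_f, setpoint_f, ac_is_on, hysteresis_f=1.0):
    # The AC is either fully on or fully off; the setpoint only
    # decides WHEN it switches, not how hard it cools.
    if ac_is_on and room_temp_f <= setpoint_f:
        return False  # reached the target temperature: shut off
    if not ac_is_on and room_temp_f >= setpoint_f + hysteresis_f:
        return True   # room warmed back up past the band: turn on
    return ac_is_on   # otherwise keep the current state

print(next_ac_state(75, 70, ac_is_on=True))   # True: still too warm
print(next_ac_state(70, 70, ac_is_on=True))   # False: target reached
```

The small hysteresis band is there so the unit doesn't rapidly flip on and off right at the setpoint.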
Because it takes longer to cool a room down to a lower temperature than to a higher one, the AC is on for a longer amount of time, and thus uses more energy.
For example, if you have a 4000-watt AC, and it takes 3 hours to cool your house down to 70 F but 3 hours and 15 minutes to cool it down to 68 F, then you are paying for an extra 15 minutes of energy. At 10 cents per kilowatt-hour, 15 minutes at 4000 watts is 1 kilowatt-hour, which will cost you 10 cents.
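You can check that arithmetic yourself (the wattage, runtimes, and rate are just the example numbers above):

```python
power_kw = 4000 / 1000    # 4000 W AC, expressed in kilowatts
extra_hours = 15 / 60     # the extra 15 minutes of runtime
rate_per_kwh = 0.10       # 10 cents per kilowatt-hour

extra_kwh = power_kw * extra_hours
print(f"extra energy: {extra_kwh} kWh")                  # 1.0 kWh
print(f"extra cost:   ${extra_kwh * rate_per_kwh:.2f}")  # $0.10
```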
Whichever temperature you set it to, the AC will use the same amount of electricity cooling the house down from 100 to 70. If it is set to 70, at that point it will shut off and use NO electricity.
However, if you have it set to 68, then it will continue to run until it reaches 68, and *only then* will it shut off.
The difference in energy usage is how much energy it takes to cool the house down the extra 2 degrees. On hotter days this takes more energy than on cooler days, because the house is heating up faster and thus requires the AC unit to cycle on more often.
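Here's a back-of-the-envelope way to see both effects at once, using a made-up heat-leak model (the leak_rate and cool_rate numbers are invented for illustration, not measured from any real house):

```python
def runtime_fraction(outdoor_f, setpoint_f, leak_rate=0.08, cool_rate=3.0):
    # Heat leaks in proportionally to the indoor/outdoor difference
    # (leak_rate, in F per hour per degree of difference), and the AC
    # removes heat at a fixed rate (cool_rate, in F per hour).
    heat_gain = leak_rate * (outdoor_f - setpoint_f)  # F/hour leaking in
    return min(1.0, heat_gain / cool_rate)  # fraction of time AC must run

for outdoor in (85, 100):
    for setpoint in (70, 68):
        frac = runtime_fraction(outdoor, setpoint)
        print(f"outdoor {outdoor} F, set to {setpoint} F: "
              f"AC runs {frac:.0%} of the time")
```

With these toy numbers, the lower setpoint always means a bit more runtime, and a hotter day pushes the runtime up a lot more than the 2-degree setpoint change does.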