There is some confusion about precisely how a setback thermostat saves energy. In fact, because of these misunderstandings, I have heard a number of people proclaim that a setback doesn't save energy at all. There are two common arguments:
1. Although you save energy as the house is initially cooling during the setback period, the furnace has to work overtime to make up this loss once the setback period is over. This "overtime" counteracts the initial savings for no net savings.
2. If the house is set for, say, 68F, when it cools a degree to 67F the furnace will turn on. It takes just as much energy to bring the air up one degree, whether it's from 67 to 68 or from 59 to 60.
So what's the deal?
Both arguments are true but they're misleading (actually, the second one is true only if we ignore a certain aspect). Heat loss from your house is a function of the internal/external temperature difference. Most people understand this intuitively; it's obvious, for example, from the fact that the furnace turns on more frequently when it's 0F outside than when it's 50F outside. In other words, the true heat loss isn't of the form X joules (or BTUs, therms, etc.) per hour, but X joules per hour per degree of difference. If you plot the house's temperature as it cools toward the outside temperature, you get a decaying curve, something like an exponential decay, rather than a nice straight line. Of course, if you're only looking at a very narrow temperature range (like your thermostat constantly tripping at 67F to raise the house one degree), you can approximate the initial bit as a straight line. The important thing to remember is that this is only an approximation. The initial rate of energy loss for a house at 70F to a 30F environment is greater than the initial rate for a house at 60F in the same environment. If this were not true, that is, if the loss rate were constant, your furnace would use no more energy to keep your house at 68F on a 50F day than on a -100F day.
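To make the loss-per-hour-per-degree idea concrete, here is a minimal Python sketch. The loss coefficient is an arbitrary illustrative number, not a measurement of any real house.

    # Newton's-law-of-cooling style loss: rate is proportional to the indoor/outdoor difference.
    # LOSS_COEFF is an assumed, purely illustrative value.
    LOSS_COEFF = 500.0  # BTU per hour per degree F of difference (assumed)

    def heat_loss_rate(indoor_f, outdoor_f):
        """Instantaneous heat loss from the house, in BTU per hour."""
        return LOSS_COEFF * (indoor_f - outdoor_f)

    outdoor = 30.0
    print(heat_loss_rate(70.0, outdoor))  # 20000.0 BTU/hr at 70F inside
    print(heat_loss_rate(60.0, outdoor))  # 15000.0 BTU/hr at 60F inside -- 25% less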
So, while it is true that your furnace has to make up the energy you saved during the initial setback cooling, saving energy that way is not the point of the process. Further, while it is true that warming a mass of air one degree takes just as much energy whether you're starting from 60F or 70F, that holds only if you ignore the differential to the outside. This is, admittedly, a second-order effect, since the furnace isn't on for that long. Where you really save is during the setback period. When your house is at 60F instead of 70F, the loss rate is lower, and therefore the furnace doesn't need to come on as often. Once the interior temperature has stabilized, the furnace may only need to fire 10 times over the course of a few hours instead of 12. Because it's all based on input/output differentials, as far as your furnace is concerned, it works just as hard whether it's 70F in and 40F out, 80F in and 50F out, or 60F in and 30F out. So, energy-wise, dropping your thermostat 10 degrees is like keeping it where it is and raising the outside temperature 10 degrees.
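As a rough back-of-the-envelope comparison, here is a sketch over an 8-hour night, using the same assumed loss coefficient as above, ignoring the cool-down and warm-up transients, and assuming the furnace simply replaces whatever heat leaks out.

    LOSS_COEFF = 500.0      # BTU per hour per degree F (assumed, illustrative)
    OUTDOOR_F = 30.0
    HOURS = 8               # length of the setback period

    def furnace_energy(indoor_f):
        """BTU the furnace must supply to hold indoor_f for the whole period."""
        return LOSS_COEFF * (indoor_f - OUTDOOR_F) * HOURS

    no_setback = furnace_energy(70.0)    # 160,000 BTU
    with_setback = furnace_energy(60.0)  # 120,000 BTU
    print(no_setback - with_setback)     # 40,000 BTU saved over those 8 hours, i.e. 25%

The absolute numbers are made up, but the 25% figure is just the ratio of the two temperature differences (30F vs. 40F), which is the whole point.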
In our house, the setback is 58F; it kicks in at 9 PM. As we're not around for very long in the morning during the week, the temperature isn't programmed to rise until about 5 PM, at which time it goes to 66F.
Yes, we wear sweatshirts a lot.
"Both arguments are true but they're misleading"
Actually, the first one is just plain false. It would have been "true but misleading" if you left off the last four words. Those last words make it just false.
"actually, the second one is true only if we ignore a certain aspect"
Actually, I think the second one is just plain true. It is misleading, of course, because it ignores that certain aspect, as you point out.
And, of course, for those of us living where air conditioning is the big energy eater, the savings from a lower differential are compounded: it takes more energy to pump a calorie from inside to outside when the outside temp is higher, and less when the inside temp is higher.
Thus, ideally, we would have huge interior thermal masses which we would pump down in the early morning when outdoor temps are lowest and indoor temps have been rising since the previous day ...
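A hedged sketch of that compounding effect: an ideal (Carnot) air conditioner's coefficient of performance depends only on the two absolute temperatures, so moving the same amount of heat costs more work when the indoor/outdoor gap is wider. Real units fall far short of Carnot, but the trend is the same. The temperatures below are illustrative.

    def carnot_cop_cooling(indoor_f, outdoor_f):
        """Ideal cooling COP: heat removed per unit of work, using absolute temperatures."""
        t_in_k = (indoor_f - 32) * 5 / 9 + 273.15
        t_out_k = (outdoor_f - 32) * 5 / 9 + 273.15
        return t_in_k / (t_out_k - t_in_k)

    print(carnot_cop_cooling(75.0, 95.0))  # ~27: 20F lift
    print(carnot_cop_cooling(80.0, 95.0))  # ~36: 15F lift -- same heat moved for less work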
DC - that is something that is beginning to happen commercially. The data center I work for is building a new facility that will use a new cooling system from IBM. During the night it stores cool air in a large mass (I believe it's a gel of some kind), then during the day it draws on that stored cold to help lessen the load on the CRAC units and other HVAC systems. The overall result is something along the lines of a 20% reduction in the power required to cool the server farm.
I must be simple-minded. I always assumed, and never even thought about it, that if the house was kept colder that less fuel was burned than if it was kept warmer. Seems like people are doing a lot of justification to keep that thermostat up.
In addition to #2:
It takes more energy to increase the same mass of air from 69 to 70F than it does from 59 to 60F because the specific heat of air increases with its temperature.
I am sure that the specific heat of air varies greatly between 59 & 60 degrees F:))