Why does this matter, you ask? Well, when the furnace is running before the blower starts up, it's using gas that doesn't heat the house. The furnace seems to be constantly cycling on and off, and thus spends a fair bit of time wasting gas. Upon further investigation, it appeared that our thermostat tries to keep fairly tight control over the interior temperature. What I mean is that if the temp is set to 71, it kicks on as soon as the reading drops to 70 and cuts off right at 72. I don't know whether the thing rounds, or what its actual precision is, but I'll guess that it's kicking on at 70.49 and cutting off at 71.50. I've been wondering if there is some way to change the tolerances so that it runs over a slightly wider range, say on at 69.49 and off at 72.50, a three-degree range instead of one. The furnace would then run longer each time it came on, but it would also come on less often, so a higher proportion of the gas-burning time would actually be heating the house.
Besides the economic issue, this has some practical relevance as well. When watching a movie together after Keenan has gone to sleep, we have to adjust the volume depending on whether the blower is running. It would be more convenient if those volume adjustments happened less often. If it's more convenient and saves money, well...it's a no-brainer at that point.
So why a model? Shouldn't we just use different settings and compare the gas bill at the end of the month? Most of the variables affecting heat loss from the house (insulation, windows, surface area, etc.) are fixed from day to day. However, the weather (temperature, wind speed, cloud cover, precipitation, etc.) is highly variable, and it has a huge impact on the rate of heat loss from the house. Since we're unlikely to have two identical days, let alone months, it would not be very accurate to compare gas bills from different months or even meter readings from different days. Lastly, the meter moves so slowly that you can't really compare usage from one hour to the next.
The model was built in Excel using a set point of 71 degrees Fahrenheit, a two-minute pre-warming time, natural cooling at 10 minutes per degree, and heating at 3 minutes per degree. The spread from on to off was either one or three degrees. The main output of interest is the cumulative time the furnace is on. The result for six hours of runtime is shown in graphical form below:
This appears to show that the one-degree range uses 24% more gas than the three-degree range. Altering the range shows that the one-degree range uses 17% more gas than a two-degree range and 29% more than a four-degree range. I assume that the larger the range, the less the model can be trusted, because the underlying assumptions become less valid. For example, it is highly unlikely that a 20-degree range really would produce a 41% savings, because the furnace would either run for a week to get the house up to 81 degrees or go for a week without running as the house cools down to 61 when the outside temp is 55. Certainly this would not be a very comfortable house to live in, and it would probably not be good for the heating system (or the hardwood floors).
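For anyone who wants to play with the numbers, the steady-state arithmetic behind these percentages can be sketched in a few lines. This is my own reconstruction in Python rather than the original Excel sheet, and the parameter names are mine:

```python
# Steady-state furnace duty-cycle sketch (my reconstruction, not the
# original spreadsheet). Rates from the post: cooling takes 10 min per
# degree, heating 3 min per degree, pre-warming burns gas for 2 min
# without adding heat.
COOL_RATE = 10.0  # minutes of natural cooling per degree F
HEAT_RATE = 3.0   # minutes of heating per degree F

def on_fraction(spread, prewarm=2.0):
    """Fraction of each on/off cycle that the furnace burns gas.

    One cycle: cool `spread` degrees, pre-warm for `prewarm` minutes,
    then heat `spread` degrees back up to the cutoff.
    """
    gas_on = prewarm + HEAT_RATE * spread
    return gas_on / (COOL_RATE * spread + gas_on)

def excess_pct(narrow, wide, prewarm=2.0):
    """How much more gas the narrow spread uses than the wide one, in %."""
    return 100 * (on_fraction(narrow, prewarm) / on_fraction(wide, prewarm) - 1)

for wide in (2, 3, 4):
    print(f"1- vs {wide}-degree spread: {excess_pct(1, wide):.0f}% more gas")
```

Running this reproduces the 17%, 24%, and 29% figures above, which suggests the spreadsheet is doing essentially this duty-cycle calculation.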
The length of pre-warming affects the magnitude of the effect (as would be expected). With all other variables held constant, pre-warming times of 1, 2, and 4 minutes result in the one-degree range using 14%, 24%, and 36% more gas, respectively, than the three-degree range. This is a pretty obvious result: the less pre-warming, the less wasted gas.
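The same duty-cycle arithmetic reproduces this sensitivity to pre-warming time. Again, this is my own sketch of the calculation with names of my choosing, not the spreadsheet itself:

```python
# Sensitivity of excess gas use to pre-warming time (my sketch; rates
# from the post: cooling 10 min/degree, heating 3 min/degree).
COOL_RATE = 10.0
HEAT_RATE = 3.0

def on_fraction(spread, prewarm):
    # Fraction of each cycle the furnace burns gas.
    gas_on = prewarm + HEAT_RATE * spread
    return gas_on / (COOL_RATE * spread + gas_on)

def excess_pct(narrow, wide, prewarm):
    # Extra gas used by the narrow spread relative to the wide one, in %.
    return 100 * (on_fraction(narrow, prewarm) / on_fraction(wide, prewarm) - 1)

for prewarm in (1, 2, 4):
    print(f"{prewarm} min pre-warm: one-degree range uses "
          f"{excess_pct(1, 3, prewarm):.0f}% more gas")
```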
Now, this model obviously has some other shortcomings. If you tried to use it to compare gas usage with the set temp at 61 versus 71, all other parameters the same, it would tell you they use the same amount. That is obviously wrong, because you can't change the mean temperature and assume the rates of heating and cooling stay the same. At 61 the house will cool off more slowly and heat up more quickly, thus using less gas. In fact, that is one of the huge unknowns here, because I'm using the same rates of heating and cooling despite the wider spread. In reality, the rate of cooling should decrease as the temperature falls and the rate of heating should decrease as the temperature rises. I assumed that, for the wider temperature range, the faster heat loss at the higher temperatures would be exactly offset by slower cooling at the lower temperatures, along with the corollary of that for heating. This may well be wrong. Furthermore, the house continues to lose heat while the furnace is on, during both the pre-warming stage and the heating stage. I assumed no temperature decrease during pre-warming, and the heating rate during the heating phase is a net rate with no separate variables for heat lost and heat added. The image below shows what true temperature vs. time curves would look more like, given the different rates of heating and cooling at different temperatures.
Lastly, since I've spent enough time putting this together and Keenan is calling: if the pre-warming time is set to 0 in the model, the total gas consumption is exactly the same in the long run regardless of the range. I suspect that this is not really true; one of the methods must be inherently more efficient, but I'm not sure how the heating and cooling rates should vary to demonstrate which. It may be that the more efficient method depends on the home and HVAC system in question.
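The zero-pre-warm result falls out of the arithmetic: with no pre-warming, the spread cancels out of the duty cycle, which works out to HEAT_RATE / (COOL_RATE + HEAT_RATE) for every range. A quick check, again using my own reconstruction of the model:

```python
# With pre-warming set to 0, the spread cancels out of the duty cycle:
# on_fraction = (3 * spread) / (10 * spread + 3 * spread) = 3/13 always.
# (My sketch of the model, not the original spreadsheet.)
COOL_RATE = 10.0
HEAT_RATE = 3.0

def on_fraction(spread, prewarm):
    gas_on = prewarm + HEAT_RATE * spread
    return gas_on / (COOL_RATE * spread + gas_on)

for spread in (1, 3, 20):
    print(f"spread {spread}: furnace on {on_fraction(spread, 0):.4f} of the time")
```

Every spread gives the same fraction (3/13, about 0.23), so within this model's assumptions the long-run gas use really is identical, and any real-world difference has to come from the temperature-dependent rates the model ignores.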
Anyway, there doesn't appear to be a way to change the tolerance of the thermostat, so the only way to apply this in practice would be to manually raise the set point just after the furnace turns on and lower it just after the furnace turns off. Certainly, that is not terribly convenient. Furthermore, this really is just a model. Maybe I'll ask Philip to collect some data and see how it compares with the model. I'm coming, Keenan!