Determining home energy usage from temperature

New: Here's an updated visualization, made with Tableau Public.

I was curious about how our home energy usage varied with temperature, so I made some graphs with R. The first graph plots the average high temperature in Austin against our energy usage in kWh/day (see below for where I got the data), with a fitted linear regression line. The slope is 1.68, indicating that for every degree hotter it gets, we use an extra 1.68 kWh per day.
Fitted data showing the linear regression
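
A minimal sketch of producing a fit like this in base R; the data frame usage and its columns avg_high and kwh_per_day are placeholder names for illustration:

    # Assumed data frame 'usage', one row per billing period:
    #   avg_high    - average daily high temperature for the period (deg F)
    #   kwh_per_day - average energy use per day (kWh) from the bill
    fit <- lm(kwh_per_day ~ avg_high, data = usage)

    plot(usage$avg_high, usage$kwh_per_day,
         xlab = "Average high temperature (F)",
         ylab = "Energy use (kWh/day)")
    abline(fit)            # overlay the fitted regression line

    coef(fit)["avg_high"]  # slope: extra kWh/day per degree F (1.68 here)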

However, the line is not a great fit for the data (R-squared is 0.7778), so I decided to look at a plot with each point labeled by its date (as "year-month"):
Data with dates instead of points
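
Roughly how a plot with date labels instead of points can be drawn, continuing the sketch above (month_label is another placeholder column):

    # Same scatter, but draw each point as its "year-month" label
    # (month_label is an assumed column, e.g. "2008-7")
    plot(usage$avg_high, usage$kwh_per_day, type = "n",
         xlab = "Average high temperature (F)",
         ylab = "Energy use (kWh/day)")
    text(usage$avg_high, usage$kwh_per_day, labels = usage$month_label, cex = 0.8)
    abline(fit)            # same overall regression line as before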

Aha! It looks like there's something else going on - almost all the 2009 entries are above the line, and all the 2008 entries are below the line. If we split these up into two regressions, we get:
Data split into two better-fitting lines

The blue line (for 2008) has slope 1.31 with R-squared 0.8979, and the red line (for 2009) has slope 2.086 with R-squared 0.9344.
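
Splitting the fit by year, in the same sketch form (placeholder names as above):

    # One regression per year ('year' is an assumed column)
    fit_2008 <- lm(kwh_per_day ~ avg_high, data = subset(usage, year == 2008))
    fit_2009 <- lm(kwh_per_day ~ avg_high, data = subset(usage, year == 2009))

    plot(usage$avg_high, usage$kwh_per_day,
         col = ifelse(usage$year == 2008, "blue", "red"),
         xlab = "Average high temperature (F)",
         ylab = "Energy use (kWh/day)")
    abline(fit_2008, col = "blue")   # 2008 line
    abline(fit_2009, col = "red")    # 2009 line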

Updated graph with new months:
Data split into two better-fitting lines

About this project: The temperature data came from a few places: first, this infochimps data set, augmented with the preliminary reports from the National Climatic Data Center and the monthly reports from Camp Mabry.

For the energy data, I didn't have all of our old Austin Energy bills, but I knew how much each one was for, so I originally planned to estimate usage from the dollar amounts. That turned out to be somewhat inaccurate, but I was able to sign up for an online account and pull down all the old data, which was extremely helpful since it included the kWh/day averages (billing cycles seem to range between 28 and 35(!) days, so per-day averages make the months comparable).

After processing that, I loaded it into R and did the following analyses/plots:
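
Something along these lines, continuing the earlier sketches; the file names and column layout here are invented for illustration:

    # Hypothetical input files assembled from the sources above
    temps  <- read.csv("austin_avg_highs.csv")  # year, month, avg_high
    energy <- read.csv("austin_energy.csv")     # year, month, kwh_per_day

    # One row per month, with a "year-month" label for the labeled plot
    usage <- merge(temps, energy, by = c("year", "month"))
    usage$month_label <- paste(usage$year, usage$month, sep = "-")

    # The regressions and plots are the ones sketched earlier
    fit <- lm(kwh_per_day ~ avg_high, data = usage)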

The slope and R-squared values quoted above came straight from the output of these regressions.
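
Pulling those numbers out of the fitted models would look roughly like this, using the placeholder fit objects from the sketches above (the values in the comments are the ones quoted earlier):

    summary(fit_2008)             # full regression table for 2008
    summary(fit_2009)             # and for 2009

    # Just the pieces quoted above
    coef(fit_2008)["avg_high"]    # slope, about 1.31
    summary(fit_2008)$r.squared   # about 0.8979
    coef(fit_2009)["avg_high"]    # slope, about 2.086
    summary(fit_2009)$r.squared   # about 0.9344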
