Making pretty pictures

Data is great, but visualizations are better. In this step we'll manipulate our stored history so that we can make really nice graphs!

First we'll start by giving our sensors names, so that it's easier to keep track of which is which. Then we'll look at our graph options and data formats. Finally, we'll reformat our data to make it ready for graphing.

Configuring the sensor names

It's no fun to have data marked as "sensor #1", so I added a 'config' page where the App Engine code looks at which sensor numbers have sent data to the database and then lets you name them. Of course, you need to have the sensor on and sending data -first- before this will work.

The configure screen looks something like this:

This code uses GET when it should really use POST. I'm kinda old and don't like debugging with POST so...yeah. (If you want POST anyway, there's a quick sketch after the code below.)
class Configure(webapp.RequestHandler):
  def get(self):
    # make the user log in if no user name is supplied
    if self.request.get('user'):
      account = users.User(self.request.get('user'))
    else:
      if not users.get_current_user():
        self.redirect(users.create_login_url(self.request.uri))
        return
      account = users.get_current_user()


    self.response.out.write('<html><body>Set up your sensornode names here:<p>')

    # find all the sensors up to #10
    sensorset = []
    for i in range(10):
      c = db.GqlQuery("SELECT * FROM Powerusage WHERE author = :1 and sensornum = :2", users.get_current_user(), i)
      if c.get():
        sensorset.append(i)

    self.response.out.write('<form action="/config" method="get">')
    for sensor in sensorset:
      name = ""
      currnamequery = db.GqlQuery("SELECT * FROM Sensorname WHERE author = :1 AND sensornum = :2", users.get_current_user(), sensor)
      currname = currnamequery.get()
    
      # first see if we're setting it!
      if self.request.get('sensornum'+str(sensor)):
        name = self.request.get('sensornum'+str(sensor))
        if not currname:
          currname = Sensorname()  # create a new entry
          currname.sensornum = sensor
          currname.author = users.get_current_user()
        currname.sensorname = name
        currname.put()
      else:
        # we're not setting it, so fetch the current entry
        if currname:
          name = currname.sensorname

      self.response.out.write('Sensor #'+str(sensor)+': <input type="text" name="sensornum'+str(sensor)+'" value="'+name+'"><p>')
      
    self.response.out.write("""<div><input type="submit" value="Change names"></div>
      </form>
      </body>
      </html>""")
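
Since I brought up POST: if you ever do want to switch this over, the change is tiny. Here's a hypothetical sketch (ConfigurePost is a made-up name, not part of the real app) showing the idea; you'd also change the form tag to <form action="/config" method="post">.

from google.appengine.ext import webapp

# hypothetical sketch only -- not the real Wattcher handler
class ConfigurePost(webapp.RequestHandler):
  def post(self):
    # webapp reads POST form fields with the same self.request.get() call,
    # so the body of Configure.get() above would work here mostly unchanged
    newname = self.request.get('sensornum0')
    self.response.out.write('Sensor #0 is now named: ' + newname)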
Now we can have more useful data in the history dump:
Now we can see that Phil is mostly to blame for our power bill!

Google Visualizer

So we have data and we'd like to see our power usage history. Graphing data is a lot of work, and I'm lazy. So I look online and find that Google -also- has a visualization API! This means I don't have to write a bunch of graphical code, and can just plug into their system. Sweet!

OK checking out the gallery of available visualizations, I'm fond of this one, the Annotated Time Line:

Note how you can easily see the graphs, scroll around, zoom in and out and each plot is labeled. Perfect for plotting power data!

To see the above visualizer actively running on a website outside the Learning System, check out the first example here.

Data formatting

There's a few restrictions on how we get the data to the visualization API, and our best option is JSON data. As far as I can tell, JSON is what happened when everyone said "wow, XML is really bulky and wasteful". Anyhow, there's like 4 layers of framework and interpretive data structures, but in the end there's a pretty easy-to-use library written by the Google Visualizations team that let me 'just do it' with a single call, by putting the data into a Python 'dictionary' in a certain format.
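
Here's the whole gviz_api dance boiled down to a tiny standalone sketch (the column names and numbers are made up; it just shows the "dictionary in a certain format" idea):

import datetime
import gviz_api

# describe the columns: column id -> (type, label)
description = {"date":   ("datetime", "Date"),
               "watts1": ("number", "Limor")}

# the data is a list of dictionaries, one per row
data = [{"date": datetime.datetime(2009, 2, 25, 21, 20), "watts1": 64.8},
        {"date": datetime.datetime(2009, 2, 25, 21, 25), "watts1": 67.3}]

data_table = gviz_api.DataTable(description)
data_table.LoadData(data)

# ToJSonResponse() wraps the table in a google.visualization.Query.setResponse()
# call, which is the format the client-side Query object expects
print data_table.ToJSonResponse(columns_order=("date", "watts1"), order_by="date")

That's really all the big function below is doing; the rest is just building up the description and the data rows from the datastore.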

Let's go through the code in sections, since the function is quite long.



class JSONout(webapp.RequestHandler):
  def get(self):

    # make the user log in if no user name is supplied
    if self.request.get('user'):
      account = users.User(self.request.get('user'))
    else:
      if not users.get_current_user():
        self.redirect(users.create_login_url(self.request.uri))
        return
      account = users.get_current_user()
         
    # assume we want 24 hours of data
    historytimebegin = 24 
    if self.request.get('bhours'):
      historytimebegin = int(self.request.get('bhours'))


    # assume we want data starting from 0 hours ago
    historytimeend = 0 
    if self.request.get('ehours'):
      historytimeend = int(self.request.get('ehours'))

    # data format for JSON happiness 
    datastore = []
    columnnames = ["date"]
    columnset = set(columnnames)
    description ={"date": ("datetime", "Date")}
      
    # the names of each sensor, if configured
    sensornames = [ None ] * 10
First up we get the user whose data we're going to look up. Then we have two variables defining how much data to grab: "bhours" (begin hours ago) and "ehours" (end hours ago). So if you wanted the last 5 hours, bhours would be 5 and ehours would be 0. If you wanted 5 hours of data from one day ago, bhours would be 29 and ehours would be 24. datastore is where we will corral all the data. columnnames and description are the 'names' of each column: we always have a date column, then another column for each sensor stream. We also have a separate array to cache the sensor names.
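
To make the bhours/ehours math concrete, here's a throwaway sketch (the function name is made up, it's not in the app) of how they turn into an absolute time window for the datastore query:

import datetime

def query_window(bhours, ehours):
  now = datetime.datetime.now()
  begin = now - datetime.timedelta(hours=bhours)   # oldest data we want
  end = now - datetime.timedelta(hours=ehours)     # newest data we want
  return begin, end

print query_window(5, 0)    # the last 5 hours
print query_window(29, 24)  # 5 hours of data, starting one day ago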

On to the next section! Here is where we actually grab data from the database. App Engine has an annoying restriction: you can only get 1000 datapoints at once, so what I do is go through it 12 hours at a time. The final datastore still has all the points, but it's easier on the database, I guess. One thing that's perhaps confusing is that each column has a name and a description. The name is short, say "watts3" for sensor #3, but the description might be "Limor's workbench". I don't even remember writing this code, so maybe you can figure it out on your own?
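
The chunking arithmetic is easier to see on its own, so here's a throwaway sketch (not part of the app) that just prints which chunks the real loop would query for a 30-hour request:

# walk from 30 hours ago up to now, 12 hours at a time
bhours, ehours = 30, 0
while bhours > ehours:
  if (bhours - ehours) > 12:
    chunk_begin, chunk_end = bhours, bhours - 12
    bhours -= 12
  else:
    chunk_begin, chunk_end = bhours, ehours
    bhours = 0
  print "query from %d to %d hours ago" % (chunk_begin, chunk_end)
# prints 30 to 18, then 18 to 6, then 6 to 0

OK, now the real thing: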

    # we can't grab more than 1000 datapoints, thanks to the free-app-engine restriction
    # that's about three sensors' worth in one day
    # so we will restrict ourselves to grabbing 12 hours of data at a time, about seven sensors' worth
    
    while (historytimebegin > historytimeend):
      if (historytimebegin - historytimeend) > 12:
        timebegin = datetime.timedelta(hours = -historytimebegin)
        timeend = datetime.timedelta(hours = -(historytimebegin-12))
        historytimebegin -= 12
      else:
        timebegin = datetime.timedelta(hours = -historytimebegin)
        historytimebegin = 0
        timeend = datetime.timedelta(hours = -historytimeend)

      # grab all the sensor data for that time chunk
      powerusages = db.GqlQuery("SELECT * FROM Powerusage WHERE date > :1 AND date < :2 AND author = :3 ORDER BY date", datetime.datetime.now()+timebegin, datetime.datetime.now()+timeend, account)

      # sort them into the proper format and add sensor names from that DB if not done yet
      for powerused in powerusages:
        coln = "watts" + str(powerused.sensornum)
        entry = {"date": powerused.date.replace(tzinfo=utc).astimezone(est), coln: powerused.watt}
        if not (coln in columnset):
          columnnames.append(coln)
          columnset = set(columnnames)
          # find the sensor name, if we can (grow the name cache if the sensor number is out of range)
          while len(sensornames) <= powerused.sensornum:
            sensornames.append(None)
          if not sensornames[powerused.sensornum]:
            currnamequery = db.GqlQuery("SELECT * FROM Sensorname WHERE author = :1 AND sensornum = :2", account, powerused.sensornum)
            name = currnamequery.get()
            
            if not name:
              sensornames[powerused.sensornum] = "sensor #"+str(powerused.sensornum)
            else:
              sensornames[powerused.sensornum] = name.sensorname

          description[coln] = ("number", sensornames[powerused.sensornum])
          #self.response.out.write(sensornames)

        # add one entry at a time
        datastore.append(entry)
Finally, at the end of all the looping, we call the magic function that turns the dictionary into JSON, wraps it in the proper Google Visualization package, and spits it out!
    # OK all the data is ready to go, print it out in JSON format!
    data_table = gviz_api.DataTable(description)
    data_table.LoadData(datastore)
    self.response.headers['Content-Type'] = 'text/plain'
    self.response.out.write(data_table.ToJSonResponse(columns_order=(columnnames),
                                    order_by="date")) 
If you were to visit http://wattcher.appspot.com/visquery.json?user=<your account email>&bhours=1 it would output something like this:
google.visualization.Query.setResponse({'version':'0.5', 'reqId':'0',
 'status':'OK', 'table': {cols: [{id:'date',label:'Date',type:
 'datetime'},{id:'watts1',label:'Limor',type:'number'},{id:'watts5',
 label:'Workbench',type:'number'},{id:'watts2',label:'Adafruit',type:
 'number'},{id:'watts4',label:'Phil2',type:'number'}],rows: [{c:[{v:new 
Date(2009,1,25,21,20,2)},{v:64.8332291619},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,20,3)},,{v:230.122099757},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,20,3)},,,{v:65.4923925044},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,20,4)},,,,{v:48.6947643311}]},{c:[{v:new 
Date(2009,1,25,21,25,3)},,{v:228.409810208},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,25,3)},{v:67.3574917331},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,25,3)},,,{v:66.0046383897},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,25,4)},,,,{v:47.3892235642}]},{c:[{v:new 
Date(2009,1,25,21,30,2)},{v:84.9379517795},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,30,3)},,,,{v:99.7553490071}]},{c:[{v:new 
Date(2009,1,25,21,30,5)},,{v:229.73642288},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,30,6)},,,{v:66.6556291818},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,35,2)},,,{v:67.3146052998},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,35,3)},{v:96.2322216676},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,35,3)},,{v:226.678267688},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,35,4)},,,,{v:158.428422765}]},{c:[{v:new 
Date(2009,1,25,21,40,3)},,{v:232.644574879},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,40,4)},,,,{v:153.666193493}]},{c:[{v:new 
Date(2009,1,25,21,40,6)},,,{v:66.7874343225},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,40,12)},{v:95.0019590395},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,40,21)},{v:95.0144043571},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,40,23)},,,{v:66.8060307611},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,45,2)},,,{v:66.9814723201},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,45,3)},,{v:226.036818816},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,45,3)},{v:99.2775581827},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,45,4)},,,,{v:154.261889366}]},{c:[{v:new 
Date(2009,1,25,21,50,4)},{v:102.104642018},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,50,4)},,,,{v:155.441084531}]},{c:[{v:new 
Date(2009,1,25,21,50,5)},,,{v:67.0087146687},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,50,5)},,{v:230.678636915},,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,55,3)},{v:103.493297176},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,21,55,3)},,,,{v:151.309223916}]},{c:[{v:new 
Date(2009,1,25,21,55,4)},,,{v:66.9174858741},{v:null}]},{c:[{v:new 
Date(2009,1,25,21,55,4)},,{v:227.765325835},,{v:null}]},{c:[{v:new 
Date(2009,1,25,22,0,3)},,,{v:67.0004310254},{v:null}]},{c:[{v:new 
Date(2009,1,25,22,0,3)},,,,{v:150.389989112}]},{c:[{v:new 
Date(2009,1,25,22,0,3)},,{v:230.892049553},,{v:null}]},{c:[{v:new 
Date(2009,1,25,22,0,4)},{v:92.2432771363},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,22,15,3)},{v:97.5910440774},,,{v:null}]},{c:[{v:new 
Date(2009,1,25,22,15,3)},,,,{v:143.722595861}]},{c:[{v:new 
Date(2009,1,25,22,15,4)},,,{v:64.4898008851},{v:null}]},{c:[{v:new 
Date(2009,1,25,22,15,4)},,{v:222.357617868},,{v:null}]}]}});
Anyways, you can kinda see the data. Also note that it's actually a function call (that's what the client-side Query object expects); this stuff is really kinky!

Now go to the Google Visualizations Playground and enter that URL into the sandbox.

And you can see the visualization itself pop out! (this is just a screen shot so go do it yerself if you want to mess around)

OK, go mess around, adding and changing bhours and ehours.

Wrapping up the visualization

OK, we're nearly done. Now we just need to grab the code from the sandbox and make it a subpage in our App Engine app...like so:

class Visualize(webapp.RequestHandler):
  def get(self):

    # make the user log in if no user name is supplied
    if self.request.get('user'):
      account = users.User(self.request.get('user'))
    else:
      if not users.get_current_user():
        self.redirect(users.create_login_url(self.request.uri))
        return
      account = users.get_current_user()

         
    historytimebegin = 24 # assume 24 hours
    if self.request.get('bhours'):
      historytimebegin = int(self.request.get('bhours'))


    historytimeend = 0 # assume 0 hours ago
    if self.request.get('ehours'):
      historytimeend = int(self.request.get('ehours'))

    # get the first part, headers, out
    self.response.out.write('''
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <meta http-equiv="content-type" content="text/html; charset=utf-8" />
  <title>Google Visualization API Sample</title>
  <script type="text/javascript" src="http://www.google.com/jsapi"></script>
  <script type="text/javascript">
    google.load("visualization", "1", {packages: ["annotatedtimeline"]});

    function drawVisualizations() {
    ''')

    # create our visualization
    self.response.out.write('''new google.visualization.Query("http://wattcher.appspot.com/visquery.json?user='''+
                            account.email()+'''&bhours='''+str(historytimebegin)+'''").send(
          function(response) {
            new google.visualization.AnnotatedTimeLine(
                document.getElementById("visualization")).
                draw(response.getDataTable(), {"displayAnnotations": true});
          });
          ''')
                     
    self.response.out.write('''}
    
    google.setOnLoadCallback(drawVisualizations);
  </script>
</head>
<body style="font-family: Arial;border: 0 none;">
<div id="visualization" style="width: 800px; height: 250px;"></div>
</body> 
</html>''')
The first part is pretty straightforward: get the user name or make them log in. Then we assume the user wants the last day of data, unless bhours and ehours are passed in. Then we literally just print out the code we copied from Google's Visualization sandbox. Done!

Viz Viz Viz

The only thing I couldn't figure out is how to get 3 visualizations going on at once (last hour, day and week) with the above code. It just kinda broke. So for the triple view I had to use iframes :(

class VisualizeAll(webapp.RequestHandler):
  def get(self):

    # make the user log in if no user name is supplied
    if self.request.get('user'):
      account = users.User(self.request.get('user'))
    else:
      if not users.get_current_user():
        self.redirect(users.create_login_url(self.request.uri))
        return
      account = users.get_current_user()

    self.response.out.write('''
<h2>Power usage over the last hour:</h2>
<iframe src="visualize?user=<your email>&bhours=1" frameborder="0" width="100%" height="300px">
  <p>Your browser does not support iframes.</p>
</iframe>

<h2>Power usage over the last day:</h2>
<iframe src="visualize?user=<your email>&bhours=24" frameborder="0" width="100%" height="300px">
  <p>Your browser does not support iframes.</p>
</iframe>

<h2>Power usage over the last week:</h2>
<iframe src="visualize?user=<your email>&bhours=168" frameborder="0" width="300%" height="500px">
  <p>Your browser does not support iframes.</p>
</iframe>

      ''')
Anyhow, it works just fine.

Timecodes!


The final thing, which won't be reviewed here, is how I got the dates and times to be EST instead of UTC. As far as I can tell, it's kind of broken and mysterious. Check the code if you want to figure it out.
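
The short version is that it hinges on a pair of tzinfo objects (the utc and est used in the JSON code above). Here's a minimal sketch of one way to build them, assuming a fixed UTC-5 offset with no daylight-saving handling; the real code may do it differently, so check the source:

import datetime

class FixedOffset(datetime.tzinfo):
  """A tzinfo with a constant offset from UTC (sketch only, no DST)."""
  def __init__(self, hours, name):
    self.__offset = datetime.timedelta(hours=hours)
    self.__name = name
  def utcoffset(self, dt):
    return self.__offset
  def tzname(self, dt):
    return self.__name
  def dst(self, dt):
    return datetime.timedelta(0)

utc = FixedOffset(0, "UTC")
est = FixedOffset(-5, "EST")

# datastore timestamps come back as naive UTC, so tag them before converting
stamp = datetime.datetime(2009, 2, 25, 21, 20).replace(tzinfo=utc).astimezone(est)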
