My new Mac setup

16 September 2013

I recently updated my entry-level mid-2011 13" MacBook Pro to a nice new mid-2013 13" MacBook Air. I went with the 256GB SSD version and bumped the RAM up to 8GB and, so far, it's been a great machine. As with all new setups though, this needed a bit of customising as I settled in and I thought I'd make some notes on what I put on, mainly for my own reference.

So, these are the details of the second setup of the Air - I say second, as the first thing I did was to try to set up the same triple boot as I had on the Pro (OSX, Windows 7 and Ubuntu). Boot Camp makes installing Windows easy, but there are still huge problems in completing the Ubuntu install on the new Air hardware and I ended up having to do a full disk erase and Mountain Lion recovery by the end of the first day! I'll have to be content with just OSX and Windows for now.

At any rate, here's the software that made up the essentials for the new OSX install:

  • Chrome: still the best browser for me, although I worry about the RAM usage over time
  • Git: makes it easier to download and manage many other things, including...
  • Dotfiles: keeping my dotfiles on GitHub makes it easy to move settings between machines. These aren't kept in the best shape, but there's enough in there to get me set up and comfortable on a new Mac, PC or Linux box. These files include settings, plugins and themes for zsh, vim and other utilities
  • Vim: my editor of choice, so I use MacVim on OSX and plain old Vim on the command line
  • Xcode command line tools and Homebrew: the basics of the toolkit
  • Dropbox: keeps a lot of my files, including projects, photos, settings and so on, in sync between machines
  • SuperDuper: for backing up to an external disk that's also partitioned with a Time Machine instance
  • Launchbar: my preferred launcher app, keeping my fingers on the keyboard
  • Caffeine: keeps the screen on, useful when keeping an eye on the progress of all the downloads and installations going on
  • Growl: I don't really use it to its potential, but it goes on to manage notifications
  • PopClip: I'd be lost without this one; I cycle through extensions fairly often, but it's worth it for the clipboard actions alone
  • 1Password: again, I don't use it to potential, but it is the way I manage product keys and some passwords
  • Sparrow: I worry for its future, but I still love it for email
  • Moom: after switching from Windows a couple of years ago, I still have no idea what the green button does. Now it does what I want
  • VLC: the omni-media player
  • Office: unfortunately still necessary these days
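The dotfiles step in that list is the one that does most of the heavy lifting on a new machine. As a rough sketch of how a bootstrap like that can look - the repo URL and file names below are placeholders, not my actual layout:

```shell
#!/bin/sh
# Rough sketch of a dotfiles bootstrap. The repo URL and the file names
# are placeholders; swap in your own.
DOTFILES_REPO="https://github.com/yourname/dotfiles.git"
DOTFILES_DIR="$HOME/dotfiles"

# Clone once; skip if the directory is already there (errors ignored,
# since the URL above is only a placeholder).
[ -d "$DOTFILES_DIR" ] || git clone "$DOTFILES_REPO" "$DOTFILES_DIR" 2>/dev/null || true

# Symlink each file into $HOME as a hidden file.
for f in zshrc vimrc gitconfig; do
  if [ -f "$DOTFILES_DIR/$f" ]; then
    ln -sf "$DOTFILES_DIR/$f" "$HOME/.$f"
  fi
done
```

With everything symlinked rather than copied, a `git pull` in the dotfiles directory updates every machine's settings in one go.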

Some nice tweaks that follow soon after:

  • Bartender: tidy up an unruly menubar
  • F.lux: make it all much easier on the eyes
  • TotalFinder: I quite like what this adds to Finder, although it will be interesting to see what gets duplicated in Mavericks
  • TotalTerminal: I've tried using other terminal emulators, but I still like using TotalTerminal's visor mode, mapped to Control-~
  • iTunes: logging in and setting up iTunes match, although Rdio gets most of my music listening currently
  • Plus a nice background and at least two Spaces - one on the left for Sparrow/Twitter and one on the right for working. The dock gets sent to the bottom left of screen and autohides for maximum screen real estate

And then the rest of the apps that I use every so often:

That's actually a pretty long list, but it really doesn't feel like it when everything is set up and running! Dropbox and hosting some dotfiles on GitHub really make transferring settings a breeze and it doesn't take long from a fresh install to feel comfortable again.

Monitoring a Raspberry Pi on Status Board

12 April 2013

Panic's latest iPad app, Status Board, offers a really attractive dashboard to monitor key statistics, email, social media and other goings on. I really don't have that much use for it, but I downloaded it to play with anyway.

One thing that I thought I could monitor was the CPU load on my Raspberry Pi server that I keep running from home. It's not a production server, just a sandbox to play in, but I thought it would be the kind of thing that Status Board should be used for. Seeing that Status Board works well with StatHat, and that StatHat offer a feature-reduced free tier, I thought I would see if I could tie all of these together to get something happening.

Here's what I got to work. I'm sure there are better ways to do this and I'm sure something like Nagios would be a much better alternative, but this was what I chose to play with.

Getting stats from the Raspberry Pi

My Pi runs Arch and is a very barebones setup. It's easy to get the current CPU load average via the /proc/loadavg file. This file gives the load averages for the last one-, five- and 15-minute windows, as well as some process info. I wanted to log the one-minute average, so only needed the first four characters of this file to be dumped out. Starting a short script, the line

loadavg=`head -c 4 /proc/loadavg`

was enough to grab the current one-minute average and place it in a variable called 'loadavg'.
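One caveat with head -c 4: it assumes the load stays under 10.00, since a longer value gets truncated after four characters. Grabbing the first whitespace-separated field is a touch more robust - a sketch, assuming the usual Linux /proc/loadavg layout:

```shell
#!/bin/sh
# Pull the one-minute load average as the first field of /proc/loadavg,
# so values of 10.00 and above are not cut off at four characters.
loadavg=$(awk '{print $1}' /proc/loadavg)
echo "$loadavg"
```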

Sending the stats to StatHat

StatHat make it really easy to send them data to log. There are a whole stack of ways to do this programmatically, but I opted to use the cURL method so I could add it to the same script. Once you have a StatHat account ready, sending the data is as easy as adding a single line to finish the script:

curl -d "stat=Pi Load&email=YOUR@EMAIL.COM&value=$loadavg" http://api.stathat.com/ez

This takes the figure that we pulled from /proc/loadavg and sends it to StatHat to add to the 'Pi Load' log.
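A small robustness note: the stat name contains a space and the email an @, which strictly should be URL-encoded in a POST body. curl can take care of that itself - a sketch of the same call using --data-urlencode (YOUR@EMAIL.COM is still the placeholder to replace):

```shell
#!/bin/sh
# Same StatHat EZ call, but with each field URL-encoded by curl itself
# rather than sent raw. YOUR@EMAIL.COM is the placeholder from above.
loadavg=$(awk '{print $1}' /proc/loadavg)
curl --silent --max-time 10 \
  --data-urlencode "stat=Pi Load" \
  --data-urlencode "email=YOUR@EMAIL.COM" \
  --data-urlencode "value=$loadavg" \
  http://api.stathat.com/ez || echo "StatHat request failed"
```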

Doing this more than once

Having the commands to pull the data and send it is great, and putting them in a script makes it even easier, but it's no good unless this is being done regularly and automatically. A cron job to do this every five minutes takes care of this task. I'm certainly not a cron expert, but here's what works for me. Running crontab -e will open your editor to enter the cron tasks. The syntax is better explained at this site than I could manage, but, effectively, this line will run your script every five minutes and discard any output (otherwise cron may email it to you, a great way to fill your inbox):

*/5 * * * * ~/path/to/your/script.sh > /dev/null 2>&1

The script I run is literally just the two lines that I mentioned above, here's the whole thing:

#!/bin/zsh
loadavg=`head -c 4 /proc/loadavg`
curl -d "stat=Pi Load&email=YOUR@EMAIL.COM&value=$loadavg" http://api.stathat.com/ez

Displaying the log on Status Board

StatHat is one of the two services that are already integrating with Status Board. This makes it super easy to drop the log into Status Board and have it come out as a nice graph. Just visit the log on the StatHat site, choose the timeframe you want to plot and look for the 'Status Board Graph URL' button at the bottom of the page. This can email the URL to your iPad. Once there, copy the URL, open Status Board and add a new graph, pasting in the URL you received.

That's about it

With the graph added, you're all done - the cron job will pull the load data every five minutes and send it to StatHat, where it will be pulled from Status Board to display. It may look something like this:

Pi Load on Status Board

I quite like StatHat - they offer a really good site and adding data is dead easy. Unfortunately, their one price point of US$99/month means that it won't be worth signing up to unlock all the features of the site, so I'll remain on their free plan, limited to ten stats (probably enough) with no automatic alerts or small-interval tracking (a bit disappointing). Hopefully, more services will be integrating with Status Board soon and the options will open up a bit.

Publishing with Jekyll

08 December 2012

After using WordPress for years for both blogs and content sites, I was looking for a new way to host my own blog, as well as an opportunity to experiment a bit more with CSS and hosting options. A number of Ruby tutorials use writing a CMS as their end product, but I wasn't too keen on doing that and instead came across Jekyll as a way to create a static site from a set of Markdown files and layout templates. After spending a while learning how the different pieces of the site come together, how the Liquid tags work and the best way to deploy the site from the various local and remote systems I've set up, I've finally switched this site over from a standard WordPress install to a deployed Jekyll static site.

There's still a fair bit left to implement - a lot of the styling needs to be done, RSS needs to be fixed up, more pages need to be added and there's probably a lot of stuff that should/shouldn't be in the 'head' tag. However, the basic site seems to work fine and the deploy scripts seem to work from each place I might want to write from. The full site, with the exception of a drafts folder, is kept on GitHub. I'm not using GitHub Pages to host it at this stage, but I have set up a page that I might use in future. Instead, the site uses Glynn to push it to the shared host that it lives on, or rsync to push to a test server on a VPS.
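The rsync half of that deploy is simple enough to sketch. The paths below are placeholders rather than the real config (and Glynn replaces this step for the FTP-only shared host); local temp directories stand in for the server here so the sketch can run anywhere:

```shell
#!/bin/sh
# Sketch of the rsync deploy step: mirror Jekyll's generated _site
# directory to the web root, deleting anything removed locally.
# The destination would normally be user@host:/path/to/webroot;
# local temp dirs stand in here so this is safe to run.
SITE=$(mktemp -d)    # stands in for Jekyll's _site output
WEBROOT=$(mktemp -d) # stands in for the server's web root
echo "<html></html>" > "$SITE/index.html"

rsync -a --delete "$SITE"/ "$WEBROOT"/
ls "$WEBROOT"
```

The trailing slashes matter to rsync: "$SITE"/ copies the directory's contents rather than the directory itself, and --delete keeps the remote side an exact mirror.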

I've found Jekyll to be a great way to experiment with putting together a site, with a good balance between control over layout/configuration and automated 'magic' to make the site all come together and be ready to upload. I look forward to doing more work with it and will look at using it for other projects in future, too.

Rails: Trusting the magic

23 August 2012

I'm working my way through a couple of Ruby on Rails tutorials at the moment, looking at why this framework has gotten so much interest. In particular, Michael Hartl's tutorial has been excellent. So far, things are going well and I have a nice little app coming together. However, the thing that gets me with such frameworks and helpers is the level of abstraction from what you are writing at any given time.

I don't know if it's just the way I work or not, but I'm most of the way through the tutorial series and I'm still struggling with the amount of 'magic' that Rails is doing on my behalf. While I can see the advantage of simply installing a few dozen gems and not writing a full framework from scratch, I can't help but be a little puzzled when the notion of database access or password hashing is hand-waved away by the framework. Writing tests, while a great way to develop, is likewise confusing and I'm still finding that the attempt to use a native-sounding syntax is more confusing than helpful. Differentiation between bracketing types {([||])} is another puzzling aspect that I'm having trouble finding consistency with, something that I'm sure is to do with the way that the lower-level Ruby code is abstracted away.

So yes, I can see the benefits in using such a framework and I can see the time saved in bootstrapping an app, but there's only so much opacity that I feel comfortable with in a framework and, at this very early stage, Rails seems to be way over the limit. Hopefully, continued practice will make me feel more at home and I will be able to relax into it a bit more, trusting the magic to hold up in the end.