Top 10 Songs of 2013: #9 Au Revoir Simone - Crazy  

Au Revoir Simone - “Crazy”

This is a grower. The first time I heard it, I think it was on a podcast (maybe KEXP?). It starts off so simple, with that little riff and drum beat. And then the song is just all hook. Seriously, it’s just all gorgeous hook. It’s a hook you hear for the first time, and you’re singing along before the song’s over. You hear it once, and it hangs out in your brain. You go listen to it again, and you think, “hmm, this is pretty fun.” The tenth time? You realize you’ve lost and Au Revoir Simone have won.

I think that’s the power of pop music. “Crazy” is literally all hook. I didn’t count, but if there are more than 10 lines of the song that aren’t the chorus, I’d be shocked. There’s something magical about being able to create a song like that. Everyone gets it, everyone’s welcome. You don’t need to decode the lyrics. It’s all sitting right there for you, making you feel like part of the cool kids.

(Especially when this ends up backing a commercial for Girls on HBO and you’re rocking out the lyrics and impressing all your friends.)

Top 10 Songs of 2013 #10: Sean Nelson - Kicking Me Out of the Band  

Sean Nelson - “Kicking Me Out of the Band”

Singer, actor, writer, tweeter, frontman of (the currently on hiatus) Harvey Danger, Sean Nelson is a lot of things. He’s a supremely talented songwriter, bringing in rhyming and phrasing that you don’t typically hear in pop songs. He also brings to the table a tremendous gift for storytelling, and “Kicking Me Out of the Band” is a wonderful example, telling a sly story of a wasted youth in England: starting a band with your pal, getting bigger, getting into drugs, having the press proclaim you “the next big thing”, and eventually getting kicked out of the band.

It’s a very simple song, with a pretty, quiet intro that flows into a sort of New Wave-y synth beat, with Nelson’s vocals sitting on top, telling the story. And it’s such a smart, cutting story.

The NME said we were “quintessential
power pop meets rock meets folk
meets punk meets alt-country,
but with a healthy sense of metal”

Top 10 Favorite Songs of 2013 - Honorable Mention  

Alright, I’m a couple of days late. I’ll catch up before the end of the year. I’m a day ahead of when I started last year, so I’ve got that going for me.

This is the seventh year of this list. If you care to take a look at the previous entries, you can find them here:

2012 | 2011 | 2010 | 2009 | 2008 | 2007

The rules are very simple:

  • The song came out on an album in 2013
  • I can only pick one song per artist

I occasionally, but rarely, break those rules.

This was an odd year for me, musically. With Spotify and iTunes Radio and satellite radio and whatever else-radio, I heard a lot of music. So much, in fact, that I did a really poor job of carving out what I liked. Unless something really caught my ear, or was from an artist I already liked, it usually ended up on a random playlist or starred on Spotify, hoping to see the light of day.

In putting together the list, I also seemingly fell into a couple of trends this year—as often seems to be the case—with many songs on this list falling into one or two categories.

Right out of the gate, you’ll note the first trend: female vocalists. The first is featured in a group who’ve made my top 10 before, with their debut album.

Cults - “I Can Hardly Make You Mine”

This song is very Cults, but after a whole bunch of caffeine. Still sounding a bit lo-fi, “I Can Hardly Make You Mine” is a super shiny 60s pop song that does what a good song does: makes you want to listen to it again. Ask me again in a week, and this may have hit the top 10.

Sylvan Esso - “Hey Mami”

Until a few hours ago, this was in my top 10. It only slipped out because I realized how mesmerized I had been by hearing and seeing it live. They were opening for Minor Alps, and I had never heard of Sylvan Esso. Then two folks walk out, a woman with a mic and a guy behind a couple of Moogs. And then she starts singing, laying down the backing vocals, then breaking out into the chorus while slowly dancing. As the beat picks up, she dances a bit more actively. In any other situation, this might come off forced or fake. It wasn’t. It was so genuine as to be incredibly infectious, and it quickly won the crowd over.

The woman is Amelia Meath, who has a tremendous voice that just doesn’t need much instrumentation behind it at all. But I think the instrumentation might be just a bit too sparse for the song to be a complete earworm. (Check out the beat on “Play It Right”, which I think works better.)

Night Beds - “Ramona”

Here’s a very straightforward alt-country(ish) song lifted into the stratosphere by Winston Yellen’s phenomenal voice. In the hands (or voice …) of almost any other band, this is a catchy, but not overly noteworthy, song. But in the last verse, when he breaks into his falsetto, the song just sort of takes on a new life.

Phoenix - “Entertainment”

This is such an incredibly great, poppy song. It easily slides into Phoenix’s pantheon of amazing pop songs (like “1901”, “Lisztomania”, and “Long Distance Call”). But it’s not a genre-shaking pop song. That’s a really high bar to set, but Phoenix set it for themselves. Seriously, both “1901” and “Lisztomania” were songs that completely redefined power/indie pop (and the album Wolfgang Amadeus Phoenix is the album that launched a million remixes).

It’s a great song, in a band with a long history of great songs. And I think that’s why it doesn’t stand out as much.

"The Failed Techno-Libertarian Agenda"  

"The push toward Bitcoin comes largely from the libertarian portion of the technology community who believe that regulation stands in the way of both progress and profit. Unfortunately, this alarmingly magical thinking has little basis in economic reality. The gradual dismantling of much of the US and international financial regulatory safety net is now regarded as a major catalyst for the Great Recession. The ‘financial or political constraints’ many of the underbanked find themselves in are the result of unchecked predatory capitalism, not a symptom of a terminal lack of software."

(Via Alex Payne.)

While he does have skin in the game, this entire post is a great takedown of the fundamentally flawed thinking that Bitcoin (or similarly “unfettered” currencies) will somehow solve the world’s financial and wealth inequity problems.

Instead, Bitcoin is simply another opportunity for the able to squeeze out more money for themselves, while the unable (due to resources, options, station in life) are left behind. Payne says it better than I can. Go read it.

Community. Season 5.  

January 2nd. I’m ready for the smartest show on TV to be back.

When Caching Bites Back  

We have an application on our site that was rewritten a few years back by a developer who is no longer with the company. He attempted to do some “smart” caching to make it fast, but I think he had a fundamental misunderstanding of how caching, or at least memcached, works.

Memcached is a really nifty, stable, memory-based key-value store. The most common use for it is caching the results of expensive operations. Let’s say you have some data you pull from a database that doesn’t change frequently. You’d cache it in memcached for some period of time so that you don’t have to hit the database on every request.
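Here’s a minimal sketch of that pattern in Ruby, using the dalli gem (a common memcached client); the key, the TTL, and the expensive lookup are all hypothetical:

require 'dalli'

cache = Dalli::Client.new('localhost:11211')

# Hypothetical expensive operation -- a stand-in for a slow database query
def expensive_lookup(id)
  sleep 2
  { id: id, name: "user #{id}" }
end

# fetch checks memcached first; on a miss, it runs the block, stores the
# result under the key (here for 10 minutes), and returns it
user = cache.fetch("user:42", 600) { expensive_lookup(42) }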

A couple of things to note about memcached. Most folks run it on a number of boxes on the network, so you still have to go across the network to get the data. [1] Memcached also, by default, has a 1MB limit on the objects/data you store in it. [2] Store lots of stuff in it, keep it in smaller objects (that you don’t mind throwing across the network), and you’ll see a pretty nice performance boost.

Unless … someone decides to not cache little things. And instead caches a big thing.

We started to notice some degradation in performance over the past few months. It finally got bad enough that I had to take a look. It only took a little bit of debugging to determine that the way the caching was implemented wasn’t just failing to help us: it was actively hurting us. Rather than caching entries individually, it was loading up an entire set of the data and trying to cache one massive chunk. Which, since it was larger than the 1MB limit, would fail.

You’d end up with something like this:

  • Hey, do I have this item in the cache?
  • Nope, let’s generate the giant object so we can cache it
  • Send it to the server to cache it
  • Nope, it’s too big, can’t cache it
  • Oh well, onto the next item … do I have it in the cache?
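In code, the broken version looked something like this (a sketch with hypothetical names, reusing the cache client from above, not the actual application code):

giant = cache.get('all_entries')
if giant.nil?
  # Rebuild the entire multi-megabyte collection from the database ...
  giant = load_all_entries_from_db
  # ... then ship it across the network to memcached, which rejects
  # anything over its size limit (depending on the client, that's an
  # error or a silent no-op). Either way, nothing ever gets cached, so
  # every request pays the full rebuild cost plus the network cost.
  cache.set('all_entries', giant)
end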

Turns out, this wasn’t just impacting performance. It was hammering our network.

[Screenshot: network traffic graph, December 12, 2013]

The top of that graph is about 400Mb/s. The drop-off is when we rolled out the change to fix the caching (to cache individual elements rather than the entire object). It was, nearly instantaneously, a 250Mb/s drop in network traffic.
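The fix, sketched the same way (again with hypothetical names), caches each element under its own key, so every value stays comfortably under the limit:

entries = entry_ids.map do |id|
  # Each entry is a small object under its own key: a cheap network
  # round-trip on a hit, a single-row rebuild on a miss
  cache.fetch("entry:#{id}", 600) { load_entry_from_db(id) }
end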

The lesson here? Know how to use your cache layer.


  1. You can run it locally. It’s super fast if you do. But, if you run it locally, you can’t share the data across servers. It all depends on your use case.

     ↩

  2. That 1MB limit is changeable (memcached’s -I flag sets the maximum item size).

     ↩

More Detail on Dropbox Backup  

As mentioned in yesterday’s post, I’ve taken to backing up the important guts of my server via Dropbox. It turned out to be very easy, and gives me the added benefit of not having to do any sort of versioning: I get that for free with Dropbox. Plus, it seamlessly integrates with my existing backup routines.

So, how do I do it? It’s honestly very simple.

First, I generate some backups. I run these out of cron every morning.

mysqldump -u root wordpress | gzip -c > /path/to/backup/blogdb.sql.gz
tar czf /path/to/backup/apache2.tgz /etc/apache2/*
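In crontab form, those look something like this (the times here are made up; the fields are minute, hour, day of month, month, day of week, then the command):

30 5 * * * mysqldump -u root wordpress | gzip -c > /path/to/backup/blogdb.sql.gz
35 5 * * * tar czf /path/to/backup/apache2.tgz /etc/apache2/*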

I do this for my web databases, any important config files, crontabs, etc. (The actual sites are already backed up, since I develop them locally.)

Once they’re dropped off in the backup location, it’s just a matter of having the script come along and copy them to Dropbox. I chose to write it in Ruby. Honestly, my code is 90% of what you find in Dropbox’s example. Here it is, in all of its glory:

require 'dropbox_sdk'

# Dropbox app key, secret, token
APP_KEY = 'get your own'
APP_SECRET = 'get your own'
ACCESS_TOKEN = 'get your own'
BACKUP_DIR = '/path/to/backup/'

# Open our Dropbox Client
client = DropboxClient.new(ACCESS_TOKEN)

# Our hardcoded list of files
files = ['list of files to backup']

# For each file, let's upload it
files.each do |filename|
  file = File.new(BACKUP_DIR + filename)

  # Send the file to Dropbox -- the last param tells it to overwrite the previous version
  response = client.put_file('/' + filename, file, true)
end

That’s it. I don’t really do any error checking (I should, and probably will some day, but I don’t today). I should probably store the key/secret/token somewhere else, but since my app was created with access to only one Dropbox folder, and I can revoke access at any time, it’s not too much of a risk. Eventually, when I get really ambitious, I’ll make the files list dynamic rather than static (there’s a sketch of that below). But for now, it’s less than 20 lines of code to back up important files.
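If I ever do make the list dynamic, it would probably look something like this sketch, which just picks up whatever files are sitting in the backup directory:

# Instead of the hardcoded list, back up every file in BACKUP_DIR
files = Dir.glob(File.join(BACKUP_DIR, '*'))
           .select { |path| File.file?(path) }
           .map { |path| File.basename(path) }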

That’s good enough for me.

Remote Server Backup via Dropbox  

This site runs on a VPS sitting out in a data center. I try hard to keep the server reasonably backed up, similar to how I keep my desktop backed up, but it’s something I’ve not been nearly as diligent about.

Previously, here was how I kept my VPS backed up:

  1. Paid daily server snapshot (i.e. I could restore my server to exactly what it looked like yesterday)
  2. Database dumps of my important mysql databases that I would remember to copy off to another server every couple of weeks (i.e. not particularly reliable)
  3. A backup of my important web stuff over FTP and git when I remembered to do it (i.e. not particularly reliable)

So, in the grand scheme of things, not very reliable.

All I really wanted was a simple backup where I could easily get the important files and configs off of the server onto my local machine, so that it would get included with my normal, robust, multi-location backup strategy. The normal method to do that would be rsync, but that means running my machine all the time. Not a big deal for my desktop machine, but that’s also not the machine that gets updated and backed up to Amazon Glacier daily.
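For reference, the rsync approach would be something like this, pulled from the local machine (the host and paths are hypothetical):

rsync -avz me@myserver:/path/to/backup/ ~/server-backups/

But that only runs while the pulling machine is awake, which is the whole problem.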

Really, what I wanted was to use Dropbox on my server, without having to run Dropbox on the server.

It turns out, that’s pretty easy [1].

There are a few different solutions out there [2], but I wanted a bit of control over it. Dropbox’s Ruby API is really, really easy to use. My total script is about 22 lines (I’ll go into that in a future post). Very simply:

  • I’ve created a Dropbox app that is tied to a single directory in my Dropbox account
  • I’ve got a number of crons that backup data into a location on my server
  • My little 22 line Dropbox script runs once a day and uploads that backed up data into my Dropbox account

So, right there, my data is backed up to Dropbox daily, which is a big win over what I had before. Even if my computer dies or is turned off for a month, I’m still backing up my data. But, even better, when my various computers are turned on, the backed-up data gets synced down to those machines. So, I’m getting multiple copies of my backup data, and that data is getting included in the places where those machines are backed up (Time Machine, Glacier, etc.).

I’m not sure why I didn’t think of this before. It took all of about 60 minutes to get the entire thing working, including building the little Dropbox backup app that runs on the server.


  1. If you can write a little bit of code.

     ↩

  2. If you care to Google.

     ↩

Updated NBA Points Created Page  

For the 7 people who care, I’ve finally updated my NBA Points Created page to include the 2013–2014 season. Early in the season, the raw numbers clearly favor big men, but that effect should even out over the course of the season. The per-40 numbers are definitely a better barometer of overall value (I think), and at least pass the sniff test.

I’ve not really updated any of the coefficients in the last few years, and it could probably use a slight tweak to reflect changes in the pace of the game, but as a quick-and-dirty barometer, it’s at least interesting to look at.

The LinkedIn Conundrum  

Every few weeks, I log into LinkedIn to accept/reject connections and clear out the stupid notices about people who’ve said I can do something well.

(I don’t think the people are stupid. I think the notices are stupid and spammy. Which, I think, you could say about pretty much 99% of LinkedIn’s email communications.)

I clear all the notifications, flags, messages, and requests for human blood from LinkedIn, and then I notice my profile is out of date. As I go to update it, I encounter The LinkedIn Conundrum.

“What’s The LinkedIn Conundrum?”, you ask? Good question.

The LinkedIn Conundrum (or at least, what I see as the conundrum …) is the desire to update your profile so that it at least reflects reality, paired with the fear of doing so, because you don’t want your colleagues to think you’re shining up your resume to look for a job.[1] Some people clearly don’t have this fear, and they update their profiles one hundred times a day, outlining every new thing they’ve done in their job (“skills: getting coffee four times a day without anyone noticing I’m away from my desk”). Then there are others, and I fall into this category, who struggle over whether to update their profile at all, fearing that someone will assume they’re looking for a job. So instead, my profile stays there, frozen in carbonite, forever out of date.


  1. The corollary to The LinkedIn Conundrum is the same fear whenever you accept a connection from a recruiter on LinkedIn, which clearly makes people think you’re talking to a recruiter, when in reality, you’re just accepting a connection from a recruiter who reached out to you, or one you’re using for sourcing. Of course, none of this would be a problem if it were socially acceptable to not connect with everyone on LinkedIn, but that doesn’t seem to be the societal norm. Though I do, on occasion, not accept a connection, if only to exert some feeble power over the LinkedIn borg.

     ↩