Life after OS X

For years I’ve been grumbling about OS X – Apple’s loss-leading product that’s the best thing they’ve ever made for users – turning into iOS – Apple’s loss-leading product that’s the best thing they’ve ever made for their business – and how bad that is for users. I’ve also been toying with the idea of making a Hackintosh for a long time.

So when, 7 weeks ago, I came back from holiday to start work on a new project, only to find that my 2.5-year-old MBP had an unrecoverably broken HDD, I decided to buy a different laptop. I needed something to code on in a hurry and didn’t want to spend a lot of money, so after a very small amount of research I opted for Acer’s cheap-as-chips C7 Chromebook.

It’s got a dual-core 1.1GHz chip, came with 2GB of RAM that I’ve bumped to 10GB (it has user-upgradeable parts) and has a 320GB HDD. With my RAM upgrade it cost £250 all in. It’s lighter than a MacBook Air and smaller too, with a decent keyboard.

After playing with (and enjoying) Chrome OS briefly I went on to Crouton and then ultimately to a ‘full’ Ubuntu install via Chrubuntu – it’s now a full Ubuntu machine with dual-boot into Chrome OS.

And you know what? I haven’t looked back. OK, well I am cheating a bit. I still have a Mac Mini in the living room that I use almost every day. And I’ve only really used this machine for browsing and coding (which is a dream – coding on the same platform you’re deploying to is great!). And I have occasionally missed Keynote, Photoshop and Illustrator, though the LibreOffice presentation thing isn’t too bad.

But overall, Ubuntu (with Unity in my case) is easy to learn, easy to set up, easy to use, fast, and basically great. Actually, so is Chrome OS, though it doesn’t support a lot of the dev tools I need. Most of the utilities you’re used to using on your computer work well on Ubuntu (Skype, Dropbox, etc.), so there’s really no need to hold back. And at £200 for a portable, relatively hardwearing machine, you really should…

Books I’ve read in the last year

One of the things I’ve been doing instead of posting here is reading. SciFi mainly. A lot of it. Because I’m now pretty much exclusively reading Kindle books I can show you, in approximately reverse order, all the books I’ve read in the last 12 months. Here they are:

Title – Author – Date
Great North Road – Hamilton, Peter F. – 9 Oct 2013
The Departure (Owner Novel 1) – Asher, Neal – 2 Oct 2013
Gods of Risk (Expanse) – Corey, James S. A. – 10 Oct 2013
Abaddon’s Gate: Book Three of the Expanse series – Corey, James S. A. – 30 Sep 2013
Caliban’s War: Book Two of the Expanse series – Corey, James S. A. – 22 Sep 2013
Leviathan Wakes: Book One of the Expanse series – Corey, James S. A. – 6 Sep 2013
Italian Phrasebook (Collins Gem) (Italian Edition) – Collins UK – 14 Aug 2013
World War Z – Brooks, Max – 13 Aug 2013
Allan Quatermain – Haggard, H. Rider (Henry Rider) – 13 Aug 2013
Tuf Voyaging – Martin, George R.R. – 3 Aug 2013
Homeland – Cory Doctorow – 25 July 2013
Just a Geek – Wil Wheaton – 10 July 2013
Machine of Death: A Collection of Stories About People Who Know How They Will Die – Ryan North – 10 July 2013
Shards of Honor (Vorkosigan Saga) – Lois McMaster Bujold – 10 July 2013
Little Brother – Doctorow, Cory – 10 July 2013
Century Rain: Totally Space Opera – Reynolds, Alastair – 8 July 2013
The Golden Torc: Saga of the Exiles: Book Two. Traed Paperback (Saga of the Exiles 2) – May, Julian – 29 Jun 2013
Mammoth Books presents Sleepover – Reynolds, Alastair – 24 Jun 2013
The Many-Coloured Land: Saga of the Exiles: Book One. Trade Paperback (Saga of the Exiles 1) – May, Julian – 12 Jun 2013
House of Suns (GOLLANCZ S.F.) – Reynolds, Alastair – 6 Jun 2013
Future Perfect: The Case For Progress In A Networked Age – Johnson, Steven – 28 May 2013
Galactic North – Reynolds, Alastair – 27 April 2013
Excession – Banks, Iain M. – 22 April 2013
The Wasp Factory – Banks, Iain – 9 April 2013
Complicity – Banks, Iain – 9 April 2013
Chasm City – Reynolds, Alastair – 3 April 2013
Absolution Gap – Reynolds, Alastair – 27 Mar 2013
Diamond Dogs, Turquoise Days – Reynolds, Alastair – 27 Mar 2013
Redemption Ark – Reynolds, Alastair – 12 Mar 2013
The Prefect (GOLLANCZ S.F.) – Reynolds, Alastair – 26 Feb 2013
Exiles (The Progenitor Trilogy, Book One) – Worth, Dan – 16 Feb 2013
The Lean Startup – Ries, Eric – 12 Feb 2013
The Ascendant Stars: Book Three of Humanity’s Fire – Cobley, Michael – 10 Feb 2013
The Orphaned Worlds: Book Two of Humanity’s Fire – Cobley, Michael – 5 Feb 2013
Blood, Bones and Butter: The inadvertent education of a reluctant chef – Hamilton, Gabrielle – 5 Feb 2013
Transmission: Ragnarok: Book Two – Meaney, John – 9 Jan 2013
Revelation Space – Reynolds, Alastair – 24 Dec 2012
Seeds of Earth: Book One of Humanity’s Fire – Cobley, Michael – 24 Dec 2012
Absorption: Ragnarok v. 1 – Meaney, John – 24 Dec 2012
The Left Hand Of Darkness – Le Guin, Ursula K. – 24 Dec 2012
The Tatja Grimm’s World – Vinge, Vernor – 22 Dec 2012
The Collected Stories of Vernor Vinge – Vinge, Vernor – 4 Dec 2012
The Three Stigmata of Palmer Eldritch – Dick, Philip K. – 30 Nov 2012
How to Create a Mind: The Secret of Human Thought Revealed – Kurzweil, Ray – 21 Nov 2012
Tik-Tok (Gollancz S.F.) – Sladek, John – 16 Nov 2012
Rapture of the Nerds – Doctorow, Cory; Stross, Charles – 7 Nov 2012
The New York Trilogy: “City of Glass”, “Ghosts” and “Locked Room” – Auster, Paul – 21 Oct 2012
Ubik (GOLLANCZ S.F.) – Dick, Philip K. – 5 Oct 2012
Rainbows End – Vinge, Vernor – 4 Sep 2012
Smiley’s People (Coronet Books) – le Carré, John – 26 Aug 2012

…that’s 49 books, though admittedly I’m cheating a little and going back a bit too far.

I can’t find an API for this stuff or I’d tell you how many words that was, and how many per day on average, when I read most (time of year and time of day) and least. But what can we see here?

  • Of the 49 books, 2 are dramatised autobiographies, 2 are non-fiction and the rest are firmly fictional. So over 90% of what I read is fiction.
  • Of the fiction books, only 7 are not SciFi; all but two of those are by authors who also write SciFi. So 40 / 49, or just over 80% of everything I read (outside of blogs, articles, etc.) is SciFi.
  • 21 of the 49 are by 5 authors (if you count Iain Banks’ two names as a single author, which I do). I tend to find someone I like and follow them around. Incidentally, those two Iain Banks books were bought because I’ve already gone through all his SciFi material and found out about his prognosis – and wanted to send some royalties his way.

What else did you spot?

What, you want reviews? Well, read some Alastair Reynolds. Or James Corey. Or ask a specific question :)

Auto-filter your sent Gmail emails

I’m a bit of an email nerd.

Let me clarify. I spend a lot of time in my email client (Gmail in the browser), and I send and receive a lot of emails to and from a lot of different people about a lot of different things. I’ve also learned that unless I implement processes, I forget things (I always keep my phone in my left pocket and my wallet and keys in my right). I also love to automate processes whenever possible (that’s part of what I love about coding!).

I recently realised that the obvious twist to put on my Gmail labels/filters setup was a ‘Follow Up’ label that I’d apply to mails that would need my attention if they didn’t get dealt with (you sometimes want to wait a while before you nag someone for a reply, especially if you don’t know them…).

After choosing a nice bright colour and pulling the labelled threads into a new section on my Multiple Inbox I went ahead and labelled the messages that had made me think about this in the first place, while thinking about rules that would accurately capture all the emails I’d forget to chase down.

It’s relatively easy to remember to label an ongoing discussion, and it’s pretty easy to unlabel something you don’t need a label on, but labelling a message as you send it is hard (or impossible) to do, and it’s also the one you’re least likely to remember later.

I realised I’d need a filter to auto-label these messages, and I came up with the following matching criteria:

  • A message I’d sent…
  • that hadn’t been replied to…
  • that was older than a couple of days, but newer than last week

Imagine my frustration as I realised that I wouldn’t be able to automate my process because — of course, dammit! — you can’t apply a filter to messages as they’re sent. I’ve tried using Boomerang, but it didn’t stick, mainly because there’s no automatic part to it. I had to make this work.

Well of course there’s a way. As I’ve been realising increasingly of late, it’s using Google Apps Scripts. In a nutshell, you write a script to search for certain messages in your inbox, iterate over them checking anything you couldn’t search for and applying your label – and you then trigger the search on a timer. Amazing.

Here’s the how-to:

1. Specify your initial search. I couldn’t figure out a way to get messages without replies, so I left that out for now. You’ll also need to search only for messages on a single day (you’ll see why in a moment), and make sure to exclude chat messages – I ended up with:

in:sent older_than:2d newer_than:3d -in:chats

2. Open up a new script. If you’ve never done this before, follow the beginning of my previous post, or just google it (it’s easy).

3. Copy and paste the following code into your script:

function followUpUnreplied() {
  // Sent messages from two to three days ago, excluding chats.
  var threads = GmailApp.search('in:sent older_than:2d newer_than:3d -in:chats');
  for (var i = 0; i < threads.length; i++) {
    if (threads[i].getMessageCount() < 2) {
      // Only one message in the thread, so it looks unreplied-to: label it.
      threads[i].addLabel(GmailApp.getUserLabelByName("Follow Up"));
    }
  }
}

What you’re doing here is running your search, then looping through the results (each of which is a “thread” – a conversation). You check each thread for the number of messages and if there’s only 1 you apply the label to it – simples!

4. Next, you need to set your script to run automatically every day (which is why we only searched for messages from a single day in the past). In the script editor window, go to Resources > Current project’s triggers, which will open an overlay with an empty list on the editor page.

Select ‘Add a new trigger’, choose your function name (‘followUpUnreplied’ in this case), choose ‘Time-driven’ and ‘Day timer’, and select a time that suits you.
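
If you prefer doing this step in code rather than through the menus, the same daily trigger can be created with a one-off setup function. Here’s a minimal sketch (the hour is just an example – pick whatever suits you) that you run once from the script editor:

function createDailyTrigger() {
  // One-off setup (sketch): schedule followUpUnreplied to run once a day.
  ScriptApp.newTrigger('followUpUnreplied')
      .timeBased()
      .everyDays(1)
      .atHour(7) // roughly 7am in the script's time zone
      .create();
}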

5. Lastly, run the script manually from the Run menu; the first time you run it, it’ll ask for authorisation to access your account, which you’ll need to grant manually. After that, you should be good to go!

Now you just have to remember to act on all those labels :)

Setting up the Lisp REPL in Sublime Text 2 on OS X

If the title of this post is all gibberish to you, you have a stark choice: turn back now, or read the next three paragraphs before deciding (and in all likelihood turning back then). If you know what the title is talking about you may as well skip the next three paragraphs :)

Lisp is a programming language. It’s pretty old as these things go – it was invented in 1958 – and I first learned it at university, where it was in widespread use on the AI course. It broke my mind then, but I was recently reading an interesting essay by Paul Graham, which took me by surprise with its eloquent argument that Lisp is perhaps the most powerful programming language of all. I decided to relearn it by reading this online book (after some research).

One of the first things you do in the book is set up the REPL (read-eval-print loop) – basically a console where you can interactively run your code, a bit like the JS console in the dev tools section of your browser. I hadn’t realised, but this awesomeness is actually available for tons of languages. It’s awesome because you can experiment and fiddle and debug things while you’re working on them, without saving and reloading and compiling and all that other annoying stuff; it helps you get things working quicker.

Sublime Text 2 is simply a code editor that’s fast, powerful, extensible, customisable and hugely popular these days. The whole reason I’m writing this post is that I wanted to code my Lisp in ST2 rather than emacs, which the book suggests, because I have a religious aversion to emacs :)

Lisp REPL

OK, down to brass tacks: to get Lisp and the REPL into ST2 I had to first install Lisp on the machine. I used SBCL, which seems to be widely used and respected. I downloaded the package and compiled it using the install shell script inside the downloaded folder, which made it a one-step process once I’d installed Xcode’s command line tools (the machine I’m using didn’t have make – you’ll find the tools here, after logging in and searching. Thanks, Apple).

With SBCL installed you do actually have a REPL now, but it’s not the handiest thing when you have to copy/paste your whole file(s) from Sublime to Terminal every time you want to try anything out.

Sublime Text 2

So onto Sublime. If you’ve not used it before, the first thing you should probably do is install Package Control, which basically makes it a snap to install pretty much any extra feature you like. At this point, you can (optionally) go down a wormhole of styling, fonts and all sorts of options to make your Sublime experience more pleasant.

SublimeREPL

After that diversion, go ahead and install SublimeREPL into your editor using Package Control. SublimeREPL gives you a REPL as a ‘file’ inside ST2 itself, along with keyboard shortcuts to move lines, selections or whole files into the console tab – and when you’re typing in there you get Sublime’s full autocomplete and other functionality – which is great! It has a host of supported languages – perhaps most notably Python, allowing you to load up the virtualenv of your choice – woop!

SublimeREPL Lisp support

OK, so now we have everything we need, but Sublime doesn’t know about the Lisp installation on the machine, and neither does SublimeREPL. The first thing, therefore, is to edit your Lisp package in Sublime to show it where to find SBCL (I figured this out thanks to this helpful post). To quote pcmind:

In ST2 go to Preferences/Browse Packages, go to Lisp, add a new file Lisp.sublime-settings with:

{
    "cmd": ["sbcl", "--script", "$file"],
    "working_dir": "${project_path:${folder}}",
    "selector": "source.lisp",
    "osx":
    {
        "cmd": ["/opt/local/bin/sbcl", "--script", "$file"]
    },
    // example in windows with CLISP
    "windows":
    {
        "cmd": ["clisp", "$file"]
    },
    // example in windows with SBCL
    // "windows":
    // {
    //  "cmd": ["sbcl", "--script", "$file"]
    // }

    "extensions": ["lisp", "scm", "ss", "cl"]
}

Now Sublime knows about SBCL, but SublimeREPL doesn’t yet. For this you need to edit the Packages/Default/Default.sublime-commands and Packages/SublimeREPL/config/Lisp/Main.sublime-menu files to tell it to support Lisp, show it on the menu, and connect to the Lisp installation you’ve just told Sublime about. I found out about these files via Jeff Thompson‘s github repo, which has his settings in it.

In Default.sublime-commands, add:

{
    "caption": "SublimeREPL: Lisp",
    "command": "run_existing_window_command", "args":
    {
        "id": "repl_lisp",
        "file": "config/Lisp/Main.sublime-menu"
    }
}

and in Main.sublime-menu add:

[
     {
        "id": "tools",
        "children":
        [{
            "caption": "SublimeREPL",
            "mnemonic": "r",
            "id": "SublimeREPL",
            "children":
            [
                {"command": "repl_open", 
                 "caption": "Lisp",
                 "id": "repl_lisp",
                 "mnemonic": "q",
                 "args": {
                    "type": "subprocess",
                    "encoding": "utf8",
                    "cmd": ["sbcl", "-i"],
                    "cwd": "$file_path",
                    "external_id": "lisp",
                    "syntax": "Packages/Lisp/Lisp.tmLanguage"
                    }
                }
            ]   
        }]
    }
]

You can check out the (very new) settings repo I’ve made, where I’ve put these files in the right place for my OS X installation of ST2.

Testing it out

So now you should be all set up. Try it out – make a new file in ST2, add the following to it:

(defun hello-world () (format t "hello world"))

and save it with the “.lisp” extension; you should first of all notice that ST2 recognises the file and gives you code highlighting.

Now open the SublimeREPL Lisp window using Tools > SublimeREPL > Lisp. You can put it in a new pane within ST2 by going to View > Layout > Rows: 2.

Put your focus back into your Lisp file, and use the menu Tools > SublimeREPL > Eval in REPL > File, or the keyboard shortcut ^,,f (which I had to look up – it’s ‘Control’ and the comma key, followed by ‘F’, on OS X). You should see the REPL respond with output something like the following, as it accepts the function definition:

*
HELLO-WORLD
*

Note: the * character is your command prompt. Now put your cursor at the end of the REPL and call your function by typing the following (and watch as Sublime’s auto-complete kicks in!)

(hello-world)

and you should see something like:

* (HELLO-WORLD)
hello world
NIL
*

- in which case you’re up and running – now go play!

(Oh, one last note: if you do run into an error and the REPL enters debug mode, type ‘0’ (zero) to exit and go back to your prompt.)

Using Google Apps Scripts as a back end

I recently built a quick internal web app for mobiles using Sencha Touch, and I needed a back end system for very simple content management. As this was a lightweight project without a need for ongoing support and with no expectation of high use, I didn’t want to build a whole back end, or pay for an out of the box solution that would feel like overkill – and not having control of the server was a factor too.

I decided to use Google Spreadsheets for the ad hoc back end, knowing they had an API that supports JSON but not having explored it before. Sencha Touch comes with handy AJAX and JSONP handlers that I wanted to use as well, but for that to work out of the box the JSON needs to be clean and in a useful format. Unfortunately, what the Spreadsheets API gives you is very complete and generic – it doesn’t, for example, associate column headers with every JSON object it returns – so you need to parse the data yourself to make the format clean enough for Sencha to consume.

Here’s a sample of the sort of data I wanted to administer:

Here’s the Spreadsheet API’s default output for that sheet…

{
   "version":"1.0",
   "encoding":"UTF-8",
   "feed":{
      "xmlns":"http://www.w3.org/2005/Atom",
      "xmlns$openSearch":"http://a9.com/-/spec/opensearchrss/1.0/",
      "xmlns$gsx":"http://schemas.google.com/spreadsheets/2006/extended",
      "id":{
         "$t":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic"
      },
      "updated":{
         "$t":"2013-05-16T11:41:31.516Z"
      },
      "category":[
         {
            "scheme":"http://schemas.google.com/spreadsheets/2006",
            "term":"http://schemas.google.com/spreadsheets/2006#list"
         }
      ],
      "title":{
         "type":"text",
         "$t":"Sheet1"
      },
      "link":[
         {
            "rel":"alternate",
            "type":"text/html",
            "href":"https://spreadsheets.google.com/pub?key\u003d0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E"
         },
         {
            "rel":"http://schemas.google.com/g/2005#feed",
            "type":"application/atom+xml",
            "href":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic"
         },
         {
            "rel":"self",
            "type":"application/atom+xml",
            "href":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic?alt\u003djson"
         }
      ],
      "author":[
         {
            "name":{
               "$t":"me"
            },
            "email":{
               "$t":"me@marcelkornblum.com"
            }
         }
      ],
      "openSearch$totalResults":{
         "$t":"2"
      },
      "openSearch$startIndex":{
         "$t":"1"
      },
      "entry":[
         {
            "id":{
               "$t":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic/cokwr"
            },
            "updated":{
               "$t":"2013-05-16T11:41:31.516Z"
            },
            "category":[
               {
                  "scheme":"http://schemas.google.com/spreadsheets/2006",
                  "term":"http://schemas.google.com/spreadsheets/2006#list"
               }
            ],
            "title":{
               "type":"text",
               "$t":"Marcel Kornblum"
            },
            "content":{
               "type":"text",
               "$t":"emailaddress: me@mydomain.com, customerid: 0000001"
            },
            "link":[
               {
                  "rel":"self",
                  "type":"application/atom+xml",
                  "href":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic/cokwr"
               }
            ]
         },
         {
            "id":{
               "$t":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic/cpzh4"
            },
            "updated":{
               "$t":"2013-05-16T11:41:31.516Z"
            },
            "category":[
               {
                  "scheme":"http://schemas.google.com/spreadsheets/2006",
                  "term":"http://schemas.google.com/spreadsheets/2006#list"
               }
            ],
            "title":{
               "type":"text",
               "$t":"Someone Else"
            },
            "content":{
               "type":"text",
               "$t":"emailaddress: someone@else.com, customerid: 0000002"
            },
            "link":[
               {
                  "rel":"self",
                  "type":"application/atom+xml",
                  "href":"https://spreadsheets.google.com/feeds/list/0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E/1/public/basic/cpzh4"
               }
            ]
         }
      ]
   }
}

…which you can see is pretty complex and raw. Here’s the JSONP response I was hoping for:

callbackFunction({
  data: [
    {
      name: "Marcel Kornblum",
      emailAddress: "me@mydomain.com",
      customerid: "0000001"
    },
    {
      name: "Someone Else",
      emailAddress: "someone@else.com",
      customerid: "0000002"
    }
  ]
})
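
For what it’s worth, you could get from the raw feed to roughly that shape entirely on the client. Here’s a rough sketch of that approach (not what I ended up doing), assuming the ‘basic’ feed structure shown above, where entry[].title.$t holds the first column and entry[].content.$t holds ‘header: value’ pairs:

// Sketch only: flatten the default "basic" list feed into an array of row objects.
function flattenBasicFeed(response) {
  return (response.feed.entry || []).map(function (entry) {
    // The first column comes through as the entry title...
    var row = { name: entry.title.$t };
    // ...and the remaining columns as "header: value" pairs in the content field.
    entry.content.$t.split(', ').forEach(function (pair) {
      var idx = pair.indexOf(': ');
      row[pair.slice(0, idx)] = pair.slice(idx + 2);
    });
    return row;
  });
}

Splitting on commas like that is obviously fragile as soon as a cell contains one, which is part of why I went looking for something better on the server side.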

After a bit of digging around I found Google Apps Scripts: I’d heard of these before but never figured out how to use them or what they were for. In a nutshell, it’s a very powerful and flexible system that plugs into your Google Docs and that can be deployed in a huge number of ways, from allowing you to create custom functions that are exposed within specific spreadsheets (so users can put their data through your specific algorithm), all the way to creating public facing web forms and sites. One of the things it allows is JSON and JSONP responses to public facing URLs, supporting GET variables and having easy Docs API handlers.

The documentation is extensive, but it took me a long time to figure out whether or not this would be possible, which is why I’m writing this post – basically a how-to for getting spreadsheet and calendar data into a nicely formatted and sensible JSONP structure that’s web-accessible, and that can then be used to power your prototypes and small web apps.

So let’s get started.

Obviously you need a spreadsheet to start with – here’s the one I’m using for this tutorial. First of all, you should publish it to the web; this will allow you to access the data from the Scripts API without having to implement authentication (although you can turn that on if you’d prefer).

Google Spreadsheet Publish to Web menu option

Next, you need to define a Data Range. This is basically a named way of accessing a set of cells you define as important, and we’ll use it in the Script. The easiest way to set one up is to highlight all the content cells (don’t include the headers) and add as many extra rows as you think you might need. Then use the menu to get to Data > Named and Protected Ranges, and you’ll see a sidebar appear, as in the screenshots below.

Google Spreadsheet Data Range sidebar
Google Spreadsheet Data Range menu option

Once you’ve got that set up, put some dummy data in so we can see when it’s working, and it’ll be time to start scripting.

The first time you make a Google Apps Script you need to set it up in your Drive account, as follows:

Go to Google Drive and select Create > Connect more apps as shown below. You’ll see an overlay with a huge number of options for apps you can connect to your account – you can search for ‘script’ and it should be your top result.

Google Drive Create window
Google Drive Apps overlay

Once you’ve connected the Script app, the next time you hit ‘Create’ you’ll get an option to make a script. Go ahead and make one – I followed this tutorial, which goes through connecting to the spreadsheet, getting the data out of the ranges, associating it with header cells and formatting the JSON objects.

In order to make your script accessible on a URL you need to save a version of it and then deploy it as a web app, although you can access the unversioned script for testing through the tiny link on the web app confirmation overlay, as seen here.

Google App Script Deploy Web App

To make your script return JSONP you need to wrap the JSON inside a function call; the function name needs to be variable in order to work with Sencha’s system out of the box.

Luckily, Google Apps Scripts can accept GET variables, which makes it easy to capture the callback function name you want and then simply concatenate it around the JSON output, like so:

function doGet(e) {
  // 'cb' is the callback name passed in the query string; 'output' is the
  // object built from the spreadsheet data elsewhere in the script (full code below).
  var cb = e.parameter.cb;
  var outputStr = cb + '(' + JSON.stringify(output) + ');';
  return ContentService.createTextOutput(outputStr).setMimeType(ContentService.MimeType.JSON);
}

The only other things I ended up changing were allowing the spreadsheet ID, worksheet name and data range to all be passed in as variables (because I wanted to use different pieces of managed data in my app without duplicating the script), and a tweak letting empty cells show up as object properties with empty strings instead of being skipped.

Here’s my final code (and here as a script) – it’s almost identical to the guide code I followed as you’ll see :)

function doGet(e) {
  var cb = e.parameter.cb;
  var sheetno = e.parameter.sn;
  var sheetid = e.parameter.s;
  var datarange = e.parameter.dr;
  var ss = SpreadsheetApp.openById(sheetid);
  var sheet = ss.getSheetByName(sheetno);

  // Get the named range of cells that store the data.
  var eDataRange = ss.getRangeByName(datarange);

  // For every row of data in the range, generate an object.
  var resultObjects = getRowsData(sheet, eDataRange);

  var output = {'data': resultObjects};
  var outputStr = cb + '(' + JSON.stringify(output) + ');';
  return ContentService.createTextOutput(outputStr).setMimeType(ContentService.MimeType.JSON);

  // get DB obj
  //var db = ScriptDb.getMyDb();
  //var saveResults = db.saveBatch(resultObjects, false);
  //Logger.log(resultObjects);
}


// getRowsData iterates row by row in the input range and returns an array of objects.
// Each object contains all the data for a given row, indexed by its normalized column name.
// Arguments:
//   - sheet: the sheet object that contains the data to be processed
//   - range: the exact range of cells where the data is stored
//   - columnHeadersRowIndex: specifies the row number where the column names are stored.
//       This argument is optional and it defaults to the row immediately above range;
// Returns an Array of objects.
function getRowsData(sheet, range, columnHeadersRowIndex) {
  columnHeadersRowIndex = columnHeadersRowIndex || range.getRowIndex() - 1;
  var numColumns = range.getLastColumn() - range.getColumn() + 1;
  var headersRange = sheet.getRange(columnHeadersRowIndex, range.getColumn(), 1, numColumns);
  var headers = headersRange.getValues()[0];
  return getObjects(range.getValues(), normalizeHeaders(headers));
}

// getColumnsData iterates column by column in the input range and returns an array of objects.
// Each object contains all the data for a given column, indexed by its normalized row name.
// Arguments:
//   - sheet: the sheet object that contains the data to be processed
//   - range: the exact range of cells where the data is stored
//   - rowHeadersColumnIndex: specifies the column number where the row names are stored.
//       This argument is optional and it defaults to the column immediately left of the range;
// Returns an Array of objects.
function getColumnsData(sheet, range, rowHeadersColumnIndex) {
  rowHeadersColumnIndex = rowHeadersColumnIndex || range.getColumnIndex() - 1;
  var headersTmp = sheet.getRange(range.getRow(), rowHeadersColumnIndex, range.getNumRows(), 1).getValues();
  var headers = normalizeHeaders(arrayTranspose(headersTmp)[0]);
  return getObjects(arrayTranspose(range.getValues()), headers);
}


// For every row of data in data, generates an object that contains the data. Names of
// object fields are defined in keys.
// Arguments:
//   - data: JavaScript 2d array
//   - keys: Array of Strings that define the property names for the objects to create
function getObjects(data, keys) {
  var objects = [];
  for (var i = 0; i < data.length; ++i) {
    var object = {};
    var hasData = false;
    for (var j = 0; j < data[i].length; ++j) {
      var cellData = data[i][j];
      
      /*
      
      EDITED TO HAVE EMPTY STRINGS FOR EMPTY CELLS
      
      if (isCellEmpty(cellData)) {
        continue;
      }
      */
      
      if (isCellEmpty(cellData)) {
        cellData = '';
      }
      else
      {
        hasData = true;
      }
      object[keys[j]] = cellData;
      
    }
    if (hasData) {
      objects.push(object);
    }
  }
  return objects;
}

// Returns an Array of normalized Strings.
// Arguments:
//   - headers: Array of Strings to normalize
function normalizeHeaders(headers) {
  var keys = [];
  for (var i = 0; i < headers.length; ++i) {
    var key = normalizeHeader(headers[i]);
    if (key.length > 0) {
      keys.push(key);
    }
  }
  return keys;
}

// Normalizes a string by removing all non-alphanumeric characters and using mixed case
// to separate words. The output will always start with a lower case letter.
// This function is designed to produce JavaScript object property names.
// Arguments:
//   - header: string to normalize
// Examples:
//   "First Name" -> "firstName"
//   "Market Cap (millions) -> "marketCapMillions
//   "1 number at the beginning is ignored" -> "numberAtTheBeginningIsIgnored"
function normalizeHeader(header) {
  var key = "";
  var upperCase = false;
  for (var i = 0; i < header.length; ++i) {
    var letter = header[i];
    if (letter == " " && key.length > 0) {
      upperCase = true;
      continue;
    }
    if (!isAlnum(letter)) {
      continue;
    }
    if (key.length == 0 && isDigit(letter)) {
      continue; // first character must be a letter
    }
    if (upperCase) {
      upperCase = false;
      key += letter.toUpperCase();
    } else {
      key += letter.toLowerCase();
    }
  }
  return key;
}

// Returns true if the cell where cellData was read from is empty.
// Arguments:
//   - cellData: string
function isCellEmpty(cellData) {
  return typeof(cellData) == "string" && cellData == "";
}

// Returns true if the character char is alphanumeric, false otherwise.
function isAlnum(char) {
  return char >= 'A' && char <= 'Z' ||
    char >= 'a' && char <= 'z' ||
    isDigit(char);
}

// Returns true if the character char is a digit, false otherwise.
function isDigit(char) {
  return char >= '0' && char <= '9';
}

// Given a JavaScript 2d Array, this function returns the transposed table.
// Arguments:
//   - data: JavaScript 2d Array
// Returns a JavaScript 2d Array
// Example: arrayTranspose([[1,2,3],[4,5,6]]) returns [[1,4],[2,5],[3,6]].
function arrayTranspose(data) {
  if (data.length == 0 || data[0].length == 0) {
    return null;
  }

  var ret = [];
  for (var i = 0; i < data[0].length; ++i) {
    ret.push([]);
  }

  for (var i = 0; i < data.length; ++i) {
    for (var j = 0; j < data[i].length; ++j) {
      ret[j][i] = data[i][j];
    }
  }

  return ret;
}

Here you can see the JSONP output of the script acting on the spreadsheet – and doesn’t it look lovely!

https://script.google.com/macros/s/AKfycby8snUVINBHncRzPumnjMGgbg0r23gpoDzpzJ4HzK85t_bDFo4/exec?s=0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E&sn=Sheet1&dr=testRange&cb=callbackFunction

Of course, this probably isn’t the best thing to use in a production environment, as there may well be usage restrictions and it’s definitely not built to be bulletproof, but for knocking together a quick prototype and letting staff edit content in an environment they’re used to, it seems like a decent enough solution to me.
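
Incidentally, if you’re not using Sencha, consuming the script with JSONP is just a case of adding a script tag whose URL includes a cb parameter naming a global function. A minimal sketch (with the script URL and spreadsheet ID elided):

// Minimal plain-JS JSONP sketch: the cb parameter tells the Apps Script
// which function name to wrap the JSON in.
function callbackFunction(response) {
  console.log(response.data); // the array of row objects
}

var tag = document.createElement('script');
tag.src = 'https://script.google.com/macros/s/.../exec' +
    '?s=YOUR_SPREADSHEET_ID&sn=Sheet1&dr=testRange&cb=callbackFunction';
document.body.appendChild(tag);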

If you’re interested, this is how I called it from the Sencha Store JSONP Proxy:

Ext.define('MyApp.store.Demo', {
	extend: 'Ext.data.Store',
	config: {
		storeId: "demoStore",
		model: "MyApp.model.Demo",
		autoLoad: true,
		proxy: {
			type: 'jsonp',
			callbackKey: 'cb',
			url: 'https://script.google.com/macros/s/AKfycby8snUVINBHncRzPumnjMGgbg0r23gpoDzpzJ4HzK85t_bDFo4/exec?s=0AsVzV2H8zZOedHc4bDUwX0QzYnhXdDFFWE1jcEpXN0E&sn=Sheet1&dr=testRange',
			reader: {
				type: 'json',
				rootProperty: 'data'
			},
		},
	}
});

I hope this is useful to you!

Does enjoying your job make it more expensive?

Or does it being expensive mean you enjoy it more?

[This post was originally published accidentally in draft form; I've written it up now - sorry if it was confusing before!]

The beginning of the train of thought that got me here was an impression that quality work is created by passionate people. I’m currently in Amsterdam at the Craften festival, where the theme really is about putting passion into your work and getting really polished results. By contrast, a lot of digital work is undertaken in sweatshop-style environments, usually (but not entirely) offshore.

People often hire offshore developers to do well-understood, well-defined tasks: creativity not required. This is because the perception is that the quality of creative work there is lower – and of course because anyone can do the boring stuff, so why not send it somewhere cheap, right?

As a market develops there is more call for creativity in all aspects of production – be it woodwork or digital – as people become more sophisticated in their tastes. This gives makers the opportunity to stretch their creative wings, which I personally equate with a more enjoyable job.

I feel sorry for creatively minded coders and designers in young markets, where the briefs they see tend to be very dry and boring. At the same time, the majority of offshore developers don’t seem to merit more creative briefs (with some notable exceptions) – there are simply so many people around the world willing to work for low salaries (as seen from the UK market) that the crowd in general gets a reputation for bad work.

All of which leads me to the question: as you start to enjoy your work, and presumably to do better, more effective work, does that lead to it becoming more expensive? Or does the marketplace developing to support more expensive services create the space for people to work in those roles, and for those people to enjoy their jobs more?

I suspect – as with many things – that it’s a bit of both, and that the two have to develop in an iterative cycle…

Rediscovering Tasker for Android

If you use Android and, like me, you enjoy tinkering with settings and customising your phone, you should really have already heard about Tasker. It just received an update making the UI friendlier, but it’s the same app it’s been for the last few years. At £1.99 it’s a real bargain and you should definitely get it.

I hadn’t used it for a while but after buying my Nexus 4 I decided I wanted a few personalised settings and downloaded it again; it’s been really amazing exploring the interface and rediscovering its power.

Basically, the premise of Tasker is that you can script any action on your phone (from changing settings to launching apps to triggering external events) based on any input your phone can detect (from which cell towers you’re near to what’s in your calendar or what time it is). This means that once you’ve got your head around the (admittedly confusing) interface, you can customise pretty much any aspect of your phone. Once you factor in all the plugins available, you can change all sorts of security settings as well as send messages between devices (or to your browser), make calls or texts, and so on.

Personally, I’ve got tasks set up based broadly on Location, Time and Power. For example, when the battery goes below 25% and isn’t being charged, the phone switches off all data connections and adjusts screen brightness and timeout to conserve power. It also puts a message in the notification bar; tapping this overrides the setting so you can use network connections until you go back to the low power settings.

Another useful script I’ve got mutes the phone and leaves vibrate on if there’s a current event in my Work calendar. At work and at home I switch Wi-Fi on and lower the volume, and at night before a work day the phone goes to silent if I’m at home.

Another nice tweak is that every morning, the phone checks the weather report for the day and reads it out to me just after my alarm goes off.

There are absolutely tons of these scripts online, and making them yourself is pretty easy too, though you’ll want to get your head around variables in order to stop the complexity from causing conflicts.

In case you’re interested, here’s an export of all my tasks – you’ll have to customise settings to make it work for your times and locations but feel free to upload and use!

…and let me know which tasks you find most useful, so I can add some more :)