dotfiles and configuration

I’ve long used config scripts to customise my terminal environment, providing aliases to commands I use often. Over the last couple of years I’ve included git completion, git flow tools and a customised PS1. A while ago I decided to version my setup on GitHub and subsequently found out about GitHub dotfiles: a portal introducing a lot of people and approaches to doing that very thing.

Having cribbed from a lot of them, I formalised my own setup over the Easter weekend. Feel free to fork it and use it yourself if you’re interested: https://github.com/marcelkornblum/dotfiles

It includes:

  • Git completion for bash (from here)
  • Customised command prompt including git status (from here)
  • Git flow toolset (from here)
  • Completion for ssh-config host aliases
  • Many aliases for navigating the CLI and git with git flow
  • Extra utility aliases for controlling the OS, mainly for OS X (from here)
  • All aliases implementing autocompletion of arguments (from here)
  • A script for symlinking the repo files into your home directory (from here)
  • My personal Sublime Text settings and solarized colour theme
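The linking script in the repo is a shell script; as a rough illustration of the idea, here’s the same approach sketched in Python (paths and behaviour are illustrative, not the repo’s exact logic): symlink every dotfile from the repo into your home directory, backing up anything already there.

```python
import os
import shutil

def link_dotfiles(repo_dir, home_dir):
    """Symlink every dotfile in repo_dir into home_dir, backing up existing files."""
    for name in os.listdir(repo_dir):
        if not name.startswith('.') or name == '.git':
            continue  # only link dotfiles, and never the repo's own .git
        src = os.path.join(repo_dir, name)
        dest = os.path.join(home_dir, name)
        if os.path.islink(dest):
            os.remove(dest)  # replace a stale symlink from a previous run
        elif os.path.exists(dest):
            shutil.move(dest, dest + '.bak')  # keep a backup of real files
        os.symlink(src, dest)
```

Run it once after cloning and every dotfile in the repo becomes live in your home directory, with your originals preserved as `.bak` files.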

Enjoy!

Getting the most out of ElasticSearch and Django

This post was written for the YunoJuno tech blog, where I’m currently freelancing, and was first posted there.


ElasticSearch is a powerful, easy-to-install, easy-to-use, low config, clustered search engine based on Lucene, and is a popular choice for all sorts of applications.

In the Django world, Haystack seems to be the de facto way to implement search outside the ORM – and it certainly seems the fastest to implement.

There are a lot of good reasons to use Haystack: it’s very easy to get up and running, it’s super-simple to configure based on your existing models, it comes with a great SearchQuerySet that lets you chain filters and use Django ORM-like syntax to create queries that lazy load, and it’s portable between ElasticSearch, Solr, Xapian and Whoosh.

Although I’d implemented Lucene some years ago and knew a little about search in general, I’d never used either ES or Haystack before I was asked to improve the existing YunoJuno search functionality. One of the ideas I wanted to implement was a tag search, where users can type into a text box and matching tags will appear as search autocomplete, and can then be used to filter the results.

To do this I realised I’d need an autocomplete-type nGram search that’s superfast for the input typeahead, and also some fancy tokenising to handle multiple-word tags and a synonym feature to keep the total number of tags low.
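The nGram idea is simple to illustrate: at index time you store every prefix of each token (an “edge n-gram”), so a partially typed query matches instantly with no wildcard scanning. ES does this server-side, but a few lines of Python show the tokens it would produce (parameter names here are illustrative, not ES config):

```python
def edge_ngrams(text, min_gram=2, max_gram=10):
    """Emit the edge n-grams that would be indexed for each whitespace token."""
    grams = []
    for token in text.lower().split():
        top = min(len(token), max_gram)
        for n in range(min_gram, top + 1):
            grams.append(token[:n])  # every prefix from min_gram chars up
    return grams
```

A typed prefix like “pyt” then matches a tag containing “python” because “pyt” is one of the indexed grams; the query side just needs lowercasing, not n-gramming.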

It turns out that, in exchange for making a powerful search very easy to add to your Django project, Haystack exacts a price by putting a fairly low ceiling on what you can achieve with the search. In order to preserve portability, some advanced ES features such as aliases and warming aren’t supported in Haystack.

We can live with this, but I was a little disappointed by a couple of the other design decisions Haystack makes (presumably for the same reasons). It can only run queries against a single field on the document, which you implement as a kind of munged catch-all, using Django’s template language to combine all the model fields you want indexed. It also puts all your models into a single ‘type’ under the index of your choice, so if you want to split them out you have to split them into new indices. Both of these mean you end up with an artificially large number of artificially large indices.

Worse still, Haystack comes with no out-of-the-box way to change any of the analysers or tokenisers you use, allowing you only to pick between snowball (a great English-focused text analyser) and nGrams, which for me weren’t even working as expected.


I then found this excellent blog post (which resulted in the elasticstack Django app), and followed it to extend Haystack and put my own analyser together for my special tag-search requirements.

That was when the real fun began. Nothing seemed to work. Sure that it was my inexperience, I played around with analysers and tokenisers, validating them all through the ElasticSearch analyze and mapping APIs over the course of a couple of days.

The truth slowly dawned on me: although Haystack was accepting my custom analysers, and ES was being told about them when I indexed my files, the analysers simply weren’t being applied to the fields as I was indexing them; only to the queries I was running. Which meant that, at best, my custom analysis was being ignored and, at worst, that it was being applied to only one side of the equation, giving me bogus – or no – results.

Which isn’t to say that elasticstack doesn’t work – if all you want is synonyms it should do the job really well, but don’t expect anything really complex from Haystack.


So now we’re starting the process of communicating directly with elasticsearch-py, the official ES Python library. We’re writing a lightweight wrapper that emphasises ES configuration and style over Django conventions, trading portability for full access to all ES functionality.

Luckily our codebase only touches ES in a couple of places…
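Talking to ES directly means owning the index settings yourself. As a sketch of the shape this takes (the analyser, filter and field names are illustrative, not our production config), a tag-search index with synonyms and edge n-grams applied at index time – and crucially a plain analyser at query time – might be defined as a dict and passed as the body of an index-creation call:

```python
# Hypothetical settings for a tag-search index. With elasticsearch-py you would
# pass this as the body when creating the index, e.g.
#   es.indices.create(index="tags", body=TAG_INDEX_SETTINGS)
TAG_INDEX_SETTINGS = {
    "settings": {
        "analysis": {
            "filter": {
                "tag_synonyms": {
                    "type": "synonym",
                    "synonyms": ["js, javascript", "py, python"],  # illustrative
                },
                "tag_edge_ngram": {
                    "type": "edge_ngram",
                    "min_gram": 2,
                    "max_gram": 15,
                },
            },
            "analyzer": {
                # index time: lowercase, fold synonyms, then emit prefixes
                "tag_index": {
                    "type": "custom",
                    "tokenizer": "standard",
                    "filter": ["lowercase", "tag_synonyms", "tag_edge_ngram"],
                },
                # query time: no n-grams, so the typed prefix matches the
                # indexed grams directly
                "tag_search": {
                    "type": "custom",
                    "tokenizer": "standard",
                    "filter": ["lowercase"],
                },
            },
        }
    },
    "mappings": {
        "tag": {
            "properties": {
                "name": {
                    "type": "string",
                    "analyzer": "tag_index",
                    "search_analyzer": "tag_search",
                }
            }
        }
    },
}
```

The key point – and exactly what Haystack wasn’t doing for us – is that the index-time and query-time analysers are declared separately on the field, so both sides of the equation are under your control.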


Projects for Ostmodern: a healthcare startup and a huge football video site

I’ve just finished freelancing at Ostmodern, a digital agency specialising in digital products and servicing a range of clients including the Beeb, BFI and News Corp.

I was ostensibly* brought in to work on a prototype healthcare application for use in hospitals, coded in Django and AngularJS — my favourite combination at the moment. I worked on the Django side and created a REST API instead of an HTML templated interface.

Some of the more interesting functionality I implemented was a full audit history (every change tracked and pre-change objects restorable), online/offline login with the same password (via the API, and including JWT access tokens), and a secured socket connection for realtime multi-user updates. For the latter two parts I implemented both the Django back end and the Angular front end.
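I can’t share the project code, but the token side of that login flow is standard JWT, and the mechanics are simple enough to sketch with the stdlib. This is a hypothetical HS256 sign/verify pair for illustration only – a real app would use an established library such as PyJWT, and would also check expiry claims:

```python
import base64
import hashlib
import hmac
import json

def _b64(data):
    """URL-safe base64 without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b'=')

def sign_jwt(payload, secret):
    """Build a signed HS256 JWT (header.payload.signature) from a payload dict."""
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(payload).encode())
    signing_input = header + b'.' + body
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return signing_input + b'.' + _b64(sig)

def verify_jwt(token, secret):
    """Return the payload if the signature checks out, else None."""
    signing_input, _, sig = token.rpartition(b'.')
    expected = _b64(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or signed with a different secret
    body = signing_input.split(b'.')[1]
    body += b'=' * (-len(body) % 4)  # restore the stripped padding
    return json.loads(base64.urlsafe_b64decode(body))
```

Because verification only needs the shared secret, the Angular client could hold a token issued online and keep authenticating API calls while offline, which is what made the online/offline login with the same password workable.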

I hope to write about some of these in more detail if I get permission from Ost to do so.

In addition to that project, I helped work on BallBall, a multilingual, minute-by-minute updated website making football video highlights from the European leagues available to fans, mainly based in Southeast Asia. I implemented a little functionality in Django for the site, but the main focus of my work was sysadmin.

BallBall front page in Vietnamese

The project has 5 online environments, and each environment comprises 6 systems running distinct codebases; not to mention static file hosting, managed database services and caching, using both Memcached and Akamai. All of this was hosted on Amazon, and each functional unit comprised at least two servers, managed by an Auto Scaling Group, inside an individual Security Group, and accessed via its own Elastic Load Balancer. The whole stack was managed via a series of CloudFormation templates.

My sysadmin work on this project was mainly in optimisation (including implementing NewRelic tracking to analyse what could best be optimised) and migrating their environments to new AWS accounts — no trivial task.

Assuming I haven’t burned my bridges with my Christmas party performance (not to be published online!) I hope to work there again some time…

* see what I did there? ;-)

Project: A secret Agile startup project

In late 2013 I helped a well-known investor collaborate with a retail service company in creating a really innovative new digital product. It hasn’t launched, so I can’t mention any names, but it was codenamed ReTech and initially branded as Rapt :)

I assembled a small but brilliant freelance team: myself as part back end developer, part EP and part product owner; an iOS developer (Nick Wood); a brand/interface/UX designer (Paul Macgregor) and a scrum master (Smita Das). I’d never worked on a fully agile project before, so it was a good opportunity to learn, and our scrum master was excellent at steering us into really owning the process.

We spent a total of about 6 weeks working in 2-week sprints, and by the end we had an end-to-end working iOS application that demonstrated the core functionality of the startup, albeit with a limited data set. It looked great and worked pretty solidly — and perhaps most impressively we hadn’t worked past 6pm once…

Unfortunately the investor and the company with the domain-specific knowledge had a falling-out, and so our 10-week project never culminated in the popup retail experience we’d been aiming for, and I’m still not really sure of the status of the project.

It’s a huge shame as the potential was enormous and I think we were all really proud of what we’d achieved and learned; we still send each other sad emails with links to startups who are implementing parts of the functionality we’d brainstormed together (although obviously their implementations aren’t as good as ours would have been!)

If you get the chance to work in a pure agile way, where there can be full transparency between the product team and the people paying, I’d really recommend it — it’s amazingly productive and once you learn what you can and can’t commit to, you find yourself able to make promises you can keep to every time.


Project: ASOS Urban Tour

In August 2011 ASOS launched their new line of menswear with the Urban Tour promotion, concepted and designed by BBH and built by Stinkdigital, with film by Stink and Pulse, post by MPC, sound by Hear no Evil and interactive sound by Plan8.

A screenshot of the main experience interface. This image is taken from the Stinkdigital portfolio page for this project.

This was a hugely ambitious project that I am very proud to have been involved with while at Stinkdigital. I was the technical lead on the digital part of this project, as well as Creative Technologist and an Executive Producer.

I played a small role in shaping the project creatively, and a large one in working out exactly how it would all be achieved, and then making sure everybody understood everything. With a project like this there are so many moving parts that decisions and trade-offs have to be made constantly as it progresses, and any specification goes out of the window in favour of doing something amazing in the time available.

I strongly encourage you to have a play with the project itself; BBH are hosting an archive version here: http://urbantour.bartleboglehegarty.com/urban-tour/. After that, come back to watch BBH’s case study below, and then read a bit more about what was involved.

This huge project has a Flash-based interactive film at its core with baked-in e-commerce support and a fallback HTML5 version that works on iPads. In addition, it brings together CMS-driven blog-style content featuring films shot around the world, with a custom rich transition sequence that puts each page into a global context.

The hero film itself was a huge challenge: a “one-take” handheld piece with 5 dancers loosely choreographed, 15 breakout moments that had to match specific points in the film exactly and involved “frozen time” (basically people being very still, often supported, in challenging positions) — all shot in two days. The intro sequence involved finding and licensing NASA imagery of the Earth and laying it onto an interactive 3D model; shooting stills from a helicopter over London and then compositing all of that and stitching it into a large crane shot that had to match the handheld footage.

Hi-res helicopter stills of London were comped together and zoomed into. This image is taken from the Stinkdigital portfolio page for this project.

Add to the pressures of the shoot and post process that the deadline from signoff was only 7 weeks, and it shouldn’t be a surprise to find out that the Stinkdigital team of around a dozen people worked 200 hours each in the final two weeks before launch. Those were dark days. We sadly had to drop many of the features we had spent a lot of time on, but it turned out pretty well in any case!

One of the behind-the-scenes pieces of work that we had to do was building a custom player for MPC to use in order to review the transitions between the main and the breakout films. Because the breakout films’ first frames had to match frames at specific points in the main film, we would scrub the timeline after a user’s click in order to transition them to their breakout. But the scrubbing would happen at variable speed (depending how far away from the transition point the user was at the time) and could happen forward or in reverse, which made MPC’s job very hard — and meant they needed a custom interactive player to try out their transition effects with.
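The scrubbing behaviour itself is easy to illustrate: each tick the playhead moves a fraction of the remaining distance toward the target frame, so the scrub is fast when the user clicked far from the transition point and eases off as it arrives, in either direction. The original was ActionScript; this is a hypothetical Python version of the per-tick update, with made-up parameter values:

```python
def scrub_step(current, target, ease=0.2, min_speed=1.0):
    """Advance the playhead one tick toward target; works forward or in reverse."""
    delta = target - current
    if abs(delta) <= min_speed:
        return target  # close enough: snap exactly onto the transition frame
    step = delta * ease  # proportional step: far away = fast, close = slow
    if abs(step) < min_speed:
        # never crawl: keep at least min_speed frames per tick, same direction
        step = min_speed if delta > 0 else -min_speed
    return current + step

def scrub(current, target):
    """Run ticks until the playhead lands exactly on the target frame."""
    frames = [current]
    while frames[-1] != target:
        frames.append(scrub_step(frames[-1], target))
    return frames
```

It’s exactly this variable, possibly reversed playback that MPC’s transition effects had to survive, which is why a normal linear review player was no use to them.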

Lil' Buck, Baby Bang, Looney, Zeus and Marcio taking a break from holding very still in awkward poses

Lil’ Buck, Zeus and Marcio taking a break between shots

Creating the e-commerce integration was itself a challenge; ASOS was launching their mobile application just after the campaign launch, and they were in the process of building the e-commerce API to support that. As a result we were often building against API features that weren’t yet in place or had unexpected bugs.

The e-commerce interface. This image is taken from the Stinkdigital portfolio page for this project.

As the hero interactive piece represented London in the site’s global navigation, transitioning to each of the other cities needed to be richer than a simple page load. We built a custom effect on top of a gaming tile engine to give the effect of looking downward at the globe and speeding very fast between cities. The transition between London and Berlin is quick, has a small amount of sea and then a fair bit of greenery along the way, while the transition between London and New York takes longer and is almost entirely over a seascape.

The backgrounds to the cities’ blog-style pages were initially built as custom, live Flash cityscapes including moving traffic and sounds – each specific to its city. Unfortunately we weren’t able to optimise this in the time available and the entire feature was dropped :(

The blog-style interface. This image is taken from the Stinkdigital portfolio page for this project.

Another feature that was dropped was the variable preloader; initially the zoom into the planet from space was going to take as long as the time required to preload the assets needed; a fast connection would see a quick zoom while a slower connection might take some time to drift into the planet.

Luckily, despite having to drop some parts of the project at the last minute, the talents of Lil’ Buck, Baby Bang, Looney, Zeus and Marcio, as well as Sebastian Strasser, Dom Goldman and everyone on the Stinkdigital team meant this still turned out to be a great project that everyone was very proud of.

It ended up winning 3 Cannes Lions (Gold for design, 2x Bronze for Cyber), a Webby and FWA Site of the Day.

Projects: online shooting, truck tracking and responsive experiences

I once built a site that let real people on the internet fire a real machine gun in real time. For real.

It was live for two days and during that time we were running 102 servers, with a total of 6 environments and crazy security between them, as well as on site, of course. We had live video being streamed from 4 satellite dishes in the Nevada desert, we used peer-to-peer video streaming for the users in control (for a lower latency experience), and we had a social- and socket-powered dynamic queuing system that let visitors see stats like how many rounds had been fired, in real time.

A dummy rifle representing the SAW M249 in the hydraulic, software controlled aiming mount we had custom built.

We built this all in 6 weeks, which is why there were a few bugs in the system and why another developer and I spent 48 hours in a caravan in Nevada without any sleep; and why, despite that, quite a few people had a less than optimal experience. And why I’m not going to mention who the job was for or what it was about, in case I’m not supposed to. But it did happen, and if you Google the right words you’ll find references to it.

I was CT and tech lead, I did most of the sysadmin and about half of the back end coding, as well as coordinating everyone on the tech side and running the technical client communications, which on this job were massive. Ask me about it in private and I’ll tell you the story.

I’m proud of what we managed to do :)

The funnest thing about a Skittles job we did was that we had to build a Facebook app with lots of interesting functionality (including live tracking a real truck delivering some sweets to the UK, by giving the driver a mobile phone we’d set up with tracking software on it) — but make the whole thing look like native Facebook functionality.

Lastly for this post, I love what we did with the brief for Dick’s Sporting Goods. They wanted an interactive film with a scavenger hunt feel, but couldn’t afford to shoot film in real snow. And they wanted it to work on everything. So we convinced them to do it as a glorified, stylised slideshow.

I’m really happy with what it turned out to be: a really responsive rich media experience that works well on all modern devices. It’s got 360° pannable images that fly through to tell a story and let you hunt for the clothes to dress the cute teens in so they can hit the slopes in time. FWA gave this both Site of the Day and Mobile Site of the Day.

Take a look: http://winterready.archive.stinkdigital.com/

A product detail view on an iPhone. The whole rich experience worked across devices in a real responsive way. This image is from the Stinkdigital portfolio page for the project.

Projects: Nike, Diesel and Lexus

While at Stinkdigital I worked on a lot of interactive films. Some had great directors, while some were a bit less polished — but the three here are all special for one reason or another.

My Time Is Now was Nike’s enormous ad campaign around the time of UEFA Euro ’12. Concepted by Wieden+Kennedy, it’s by far the biggest job I’ve ever been involved in. An incredible film budget with some serious techniques in play was matched by a hugely ambitious digital build, including a playable Sonic level made in JS, as well as many other interactive pieces and linear films all seamlessly joined to the master timeline.

Nike My Time Is Now. This image is taken from the Stinkdigital website.

I was Creative Technologist for this job and helped figure out how some of it would work, including an editable breakout that would be customisable for each territory – and yet fit seamlessly into the film…

It originally played in YouTube; here’s an archived version to take a look at: http://nikemytimeisnow.archive.stinkdigital.com/


Although I’ve written a post about another Diesel job I worked on with Stinkdigital, the FW11 campaign is really worth a mention. As CT on this job, it was a joy and a challenge to be told that we needed to come up with a concept that would let us make an interactive piece without shooting any extra models or garments — all the approved shots were already in the bag.

We came up with this idea for Santo, and Nieto and his team in Paris really knocked it out of the park. Check it out: http://diesel-island-fw11.archive.stinkdigital.com/

The Diesel FW11 tabletop set


Last (for now) but not least, I want to mention Lexus Dark Ride, that we made for Skinny in NYC. This shoot was over 5 nights in LA and I ended up staying there for a month in order to work with the post company on a technique for treating the film, so that we’d be able to let users pan around without the distortions looking too crazy.

In the end there’s almost a 180° field of view thanks to a 6mm lens (shooting almost entirely without special lights, in the dark). The car wasn’t yet in production and we had to make do with a lighting reference car for the exterior shots, and the interiors were shot in the actual reference model used to tool up the factories — it was super well protected, but that didn’t stop the steering wheel coming off in an action shot!

Most interesting from a digital perspective, though: as well as your Facebook profile appearing in the film, you actually get to say some lines. If you go through the whole introductory experience you are asked some ‘psychometric’ test questions, and we surreptitiously record the answers and use them as lines in the dialogue later. It’s a very weird feeling hearing yourself say a line you weren’t expecting!

You can see for yourself here: http://darkride.archive.stinkdigital.com/

An interactive moment. Image from the Stinkdigital portfolio page.

The same moment, with all the footage users could pan around to see. Image from the Stinkdigital project page.