Speeding up Xcode builds by Paulo Fierro

My main development machine used to be a 2010 MacBook Pro with a 512 GB SSD. Expensive, but way fast. Unfortunately the GPU died about a year ago, so I've been using a 2011 iMac. It's as fast as the MBP, if not faster, for many things, but disk operations are not.

A few weeks ago on Dave Verwer's brilliant iOS Dev Weekly I found a blog post explaining how you could speed up Xcode (and AppCode) build times. This is done by moving Xcode's DerivedData folder as well as iOS Simulator Data to a RAM disk.

A RAM disk takes a chunk of memory and treats it as if it were a drive. Now SSDs are fast, but memory is still much faster. If you have extra memory lying around, I recommend giving this a shot.

I didn't go the Terminal route but instead used iRamDisk from the Mac App Store. I was skeptical, but it definitely works.
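If you do want to go the Terminal route instead, it looks roughly like this. This is only a sketch: the sizes below match what I ended up giving each disk, and symlinking DerivedData is just one way to point Xcode at the new volume (you can also change the location in Xcode's Preferences > Locations).

# Create a 1 GB RAM disk for DerivedData (ram:// counts 512-byte sectors, so 1 GB = 2097152)
diskutil erasevolume HFS+ "DerivedData" $(hdiutil attach -nomount ram://2097152)

# Create a 512 MB RAM disk for the Simulator data
diskutil erasevolume HFS+ "SimulatorData" $(hdiutil attach -nomount ram://1048576)

# Point Xcode's default DerivedData location at the RAM disk
rm -rf ~/Library/Developer/Xcode/DerivedData
ln -s /Volumes/DerivedData ~/Library/Developer/Xcode/DerivedData

Keep in mind the contents disappear on reboot, which is fine for DerivedData since Xcode just rebuilds it.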

Building and running an app I'm currently working on in a clean iOS Simulator used to take 20 seconds. Using iRamDisk it now takes 7. That's huge.

Regular builds are also much faster, and the difference is very noticeable after doing a Product > Clean.

DerivedData gets a gig

The Simulator gets 512MB

The dropdown menu in the menu bar also lets you know how full each disk is and you can easily flush them via a menu option.

Menu options

If you have extra memory lying around I'd give it a shot.

Serving compressed JSON on S3 by Paulo Fierro

Last week we shipped Brainfeed, an iPad app we've been working on for a client that presents educational videos for kids. We also developed a backend system for the curators to use to add and tag videos for the app.

This backend is built in Ruby with Padrino (on top of Sinatra) and runs on Heroku. Due to the nature of the app it only needs a single dyno, and it pushes content changes over to a bucket on Amazon S3, which is consumed by the iPad app. This means the app can continue to work independently of the backend, and any scaling issues can be handled on the S3 side of things, which has the added bonus of keeping the overall cost down.

Previously the app was consuming a JSON file provided by the backend via an API. When you're serving JSON (or any data) you really want to serve it compressed to save on bandwidth and overall load time.

In Apache you can add a line to your .htaccess file that indicates this like so:

AddOutputFilterByType DEFLATE application/json

This gzips JSON content on the fly. In Padrino/Sinatra apps this is normally already handled by Rack::Deflater.
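For completeness, wiring it up yourself is a one-liner in the app. A minimal Sinatra-style sketch (the class and route here are placeholders, not the actual Brainfeed backend):

# Minimal example of compressing responses with Rack::Deflater
require 'sinatra/base'
require 'json'

class Api < Sinatra::Base
  use Rack::Deflater   # gzips responses when the client sends Accept-Encoding: gzip

  get '/feed.json' do
    content_type :json
    { videos: [] }.to_json
  end
end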

However, when I started pushing the JSON over to S3 I needed a way to tell S3 to serve the content in a compressed fashion. Using the AWS SDK for Ruby you can do this like so:
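A sketch of the idea, using the v1 aws-sdk gem (the bucket name, key and payload are placeholders). The two important parts are gzipping the body yourself and setting content_encoding so S3 stores, and later serves, the right header.

require 'aws-sdk'   # v1 of the Ruby SDK
require 'zlib'
require 'stringio'
require 'json'

# Gzip the JSON payload in memory
json = { videos: [] }.to_json   # placeholder payload
io = StringIO.new
gz = Zlib::GzipWriter.new(io)
gz.write(json)
gz.close

# Upload with Content-Encoding: gzip so clients decompress it transparently
s3 = AWS::S3.new(access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
                 secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'])
s3.buckets['my-bucket'].objects['feed.json'].write(
  io.string,
  content_type:     'application/json',
  content_encoding: 'gzip'
)

S3 then serves the object with a Content-Encoding: gzip header, so the iPad app and any browser decompress it transparently.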

Before gzip compression we were serving about 960 KB. Now that's down to 240 KB, which is much more manageable. Due to the nature of the app (watching online videos) we know the users are going to be on wi-fi, so it's not that big a deal, but every little bit helps.

For some reason this took way too long to figure out, so here it is for next time.

Brainfeed — Educational Videos for Kids by Paulo Fierro

This week we had the pleasure of shipping Brainfeed, an iPad app I've been working on for the last few months, aimed at providing educational videos for kids aged 7 and older.

Brainfeed

Each video is handpicked by a team of enthusiastic educators from around the globe who are tasked with finding short (under 10 minutes) and documentary style videos that are curriculum based, entertaining & engaging, visually stimulating, age-appropriate and child-friendly.

The app itself is free, but to unlock all of the content you have to sign up for a subscription. The videos are great, so it's well worth it.

We were responsible for building the iPad app as well as the Sinatra based backend system for adding and tagging videos.

It's something I'm really proud of, and it's now available on the App Store.

What lurks on ports 5050-5500 by Paulo Fierro

This morning I was using Charles to monitor some HTTP requests like I do. I noticed some odd requests were showing up every time I changed tabs in Safari, but also when swapping to and from Safari.

The requests were going to something on localhost, running on a range of ports, that accepted requests at /snap/new. So something was trying to take snapshots of every page I visited in Safari.

This can't be good.

Suspect requests
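For what it's worth, a quick way to see which process owns ports like these is lsof (a sketch; the range simply mirrors what Charles was showing):

# Show processes listening on TCP ports 5050-5500
lsof -nP -iTCP:5050-5500 -sTCP:LISTEN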

After looking around I checked my Safari Extensions to see if I had installed a rogue piece of software, and there it was. Nothing fishy at all, just Ember, there to do its job, which is to take screenshots.

Disabling the extension got rid of the requests and I could get back to what I was doing without poor Charles getting flooded.

Console flooding, USB timeouts & Garmin ANT+ by Paulo Fierro

While I was working this morning I ran into some issues with an app and went to check if the Console (under /Applications/Utilities) was reporting anything.

Unfortunately the log was filled with some USB device timing out with the following error:

AppleUSBEHCI::Found a transaction past the completion deadline on bus 0xfa, timing out! (Addr: 5, EP: 1)

A very noisy log

This timeout was annoyingly happening every few seconds. The error did give me a few clues: it was happening on USB bus 0xfa, on a device attached at address 5.

To find the device in question I generated a System Report (Apple menu > About This Mac > More Info > System Report) and looked for the USB bus with the matching number.

Finding the USB bus number
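If you'd rather stay in the Terminal, the same tree can be dumped there too. A sketch, since the exact output layout varies by OS X version:

# Command-line equivalent of the USB section in System Report
system_profiler SPUSBDataType

# Or go digging in the I/O registry for more detail
ioreg -p IOUSB -l -w 0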

Then it was just a question of going through the devices connected to this bus that had the address in question.

The address of the USB device

Found you! The device in question was the Garmin ANT+ stick that transmits data wirelessly from my Garmin Forerunner 610 watch.

Unplugging it fixed the problem and the Console stopped being flooded.
