Xcode 8 and XcodeColors by Paulo Fierro

I ❤️ XcodeColors.

Being able to use different text color for log levels in Xcode is invaluable to me, especially on larger projects. It is one of several Xcode plugins I've been using via Alcatraz — the unofficial package manager for Xcode — a great tool and an easy way to browse through a collection of very useful plugins.

Unfortunately Xcode 8 crashes this party.

Xcode 8 now uses library validation which is good because it should prevent future occurrences of XcodeGhost style malware. Simultaneously it sucks because it means Xcode 8 no longer supports plugins. Instead, Apple introduced Source Editor Extensions at WWDC 2016 but unfortunately they are currently quite limited and only support text manipulation.

Fortunately there's a workaround, but it's dodgy, dirty and potentially risky.

You probably shouldn't do this.

Seriously.

MakeXcodeGr8Again is a Mac app that duplicates Xcode.app and after a few minutes creates an unsigned copy of Xcode 8 in your Applications folder called XcodeGr8. This copy will happily load plugin bundles, but as it's unsigned it's also open to vulnerabilities. You probably shouldn't do this, and you definitely shouldn't submit any apps with it.

If you don't have ~12GB to spare, you can use this workaround, which makes a copy of the app binary (12KB) and unsigns it. You then toggle which one you want to use by linking to it in the Terminal.

Personally I prefer running a completely separate app so I can easily tell which mode I'm in.

I also made a quick icon so I can tell them apart in the Dock.

Spot the app

I feel very dirty, but I do have my colors back.

Please file or duplicate a radar and hopefully Source Editor Extensions will give us back the flexibility we want without requiring us to jump through these ridiculous hoops and make ourselves vulnerable.

Making Get to the CHOPPER by Paulo Fierro

Yesterday I uploaded a video from back in October 2015, where Niqui and I surprised our guests that were visiting with a helicopter tour around our lovely island, Grand Cayman.

The video itself was shot on three cameras – a GoPro HERO3 Black, my trusty iPhone 6s Plus with an Olloclip and Niqui's iPhone 6s. The footage looked good, but unless you know the island well it's not that easy to figure out where we are at any given point in time.

Luckily Chris was wearing his new Ambit and tracked our flight path. This is exactly what I needed.

The flight path

The idea was to show the map of the flight path with the image of a chopper showing our current position in the lower-right hand corner as the video played.

Enter Motion.

I'm very much a Motion newbie — I've used it a couple of times with results I'm mostly happy with, but that still look quite amateurish. Maybe that's part of the charm.

Initially I was thinking maybe I could get the flight-path file, parse the GPS coordinates and build something automatically. Turns out it's a lot easier to trace the path instead.

I used the flight-path image from Chris as a base and used the bezier tool to trace the path. Once I was happy with that I launched Google Earth, zoomed in to the island to find the right location and took a screenshot. This was the background.

The background

Next I needed a helicopter. I found this lovely 1987 model from a TurboGrafx-16 game called J.J. & Jeff.

Chopper


Perfect.

Motion has a built-in behavior called “Motion Path”, which I applied to the image. Motion then asked me to draw the path. I'd already done this once, so I didn't really feel like doing it again. Much searching later I found that the motion path's “Path Shape” property was set to the default, “Open Spline”.

"Open Spline", you're not the one that I want


Changing this to “Geometry” allowed me to provide my existing, traced path as the “Shape Source”. Perfect! I now had my helicopter image moving along the path which was exactly what I was looking for.

There were a few times where the helicopter changed direction, and it looked a bit weird for the chopper to fly backwards. OK, we need a horizontal flip. But there is no way to flip horizontally.

I thought maybe I could set up a keyframe and swap the image with one that was pre-flipped, but that seemed wrong.

Much DuckDuckGo’ing later I found that setting the “Scale X” property to -100% means the same thing as “flip horizontally”. Cool.

Export the movie and we’re done.

Nearly. That looks good, but it's a bit boring. It would be much cooler if the helicopter were animated.

I opened up the original image in Pixelmator and copied pixels until I had three states: left blade, both blades with rotated rotor, right blade.

Layers of pixels

Chopped chopper. By no means perfect, but good enough


I couldn’t figure out how to animate all of these together so I created another Motion project which lasted 3 seconds, with each of the states above lasting 1 second. Looking back now I should have made it 3 frames instead.

I exported this as a QuickTime movie ensuring that I set “Color+Alpha” in the Render options and set the project background to be transparent.

Export settings


No matter what I tried, when I imported this into the main Motion project it wasn’t transparent. I searched and searched but found no solution so in the end I made the background a bright green and used the Keyer filter to get rid of it — that worked like a charm out of the box.

Well mostly. On a certain part of the video the animated chopper had a visible left border.

An annoying visible border


To get rid of that I set the Matte Tools’ “Shrink/Expand” to -4.

Matte Tools > Shrink/Expand

This got rid of the border but now the chopper looked a bit thin. Nothing a good dose of drop shadow can’t fix.

Border be gone!


Looking good. To speed up the slow, 3-second animation I set the timing to 800%. And we’re done!

I exported the movie and brought it into Final Cut Pro X where I scaled it down to 25% and put it in the lower-right corner. At that size it was pretty hard to see the chopper, so back in Motion I scaled the animated chopper up to 200% and also added a Sharpen filter.

Finally, I made the clip as long as the edited shots to make syncing up a little easier and to keep the blades animating at full speed. Otherwise it would have looked odd because I'd have had to stretch the 10 seconds out to 3 minutes. Now it's actually done.

Back in FCPX land the rest of the video was put together. It was basically two multi-cam clips but of course I hadn’t considered how we were going to sync up the video shot on different cameras. Luckily FCPX can sync up video from multiple sources based on their audio signature. That’s some serious magic.

Color grading was done in ColorFinale which is my new favorite tool for this.

And that's it. It was a hell of a lot of fun making it — I certainly learned a lot, though the most fun was definitely being on that chopper.

Listening for video playback within a WKWebView by Paulo Fierro

I’m currently working on an app for a client where certain features are disabled when a video starts to play. Specifically, a video that’s embedded in a WKWebView.

One way of doing this would be to inject JavaScript into the page like I've written about before and let that attach listeners to every <video> tag on the page. When a video changes its play state, our listener could post a message to the app via webkit.messageHandlers[_HANDLER_].postMessage(_OBJECT_); and we would need a handler in the app to do whatever it is we want it to do.
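As a rough sketch, the script to inject might look something like this. The "videoState" handler name and the message payload are placeholders of my own, not anything from a real API:

```swift
import Foundation

// The JavaScript we'd inject into the page. It attaches play/pause/ended
// listeners to every <video> tag and forwards each event to the app via the
// (illustrative) "videoState" message handler.
let videoListenerJS = """
document.querySelectorAll('video').forEach(function (video) {
    ['play', 'pause', 'ended'].forEach(function (event) {
        video.addEventListener(event, function () {
            window.webkit.messageHandlers.videoState.postMessage({ event: event });
        });
    });
});
"""

// In the app this string would be wrapped in a WKUserScript and added to the
// web view's configuration, with a WKScriptMessageHandler registered under
// the same "videoState" name to receive the posted messages:
//
// let script = WKUserScript(source: videoListenerJS,
//                           injectionTime: .atDocumentEnd,
//                           forMainFrameOnly: false)
// webView.configuration.userContentController.addUserScript(script)
```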

That's a little messy.

When a video starts to play in a WKWebView the <video> is replaced by the native video player. However, within the app we can’t get a reference to this video component in order to query its playing state, so I wondered if maybe this might just fire an NSNotification we could listen for.

I couldn’t find any documentation for this so I decided to listen for all notifications and see if I could find one that would do the trick.

To do that I added a notification center observer with no specific name, which translates to give me ALL of the notifications:
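In Swift that boils down to passing nil for the notification name (a minimal sketch; the variable names are mine):

```swift
import Foundation

// Passing nil for both name and object registers this observer for *every*
// notification, so each one can be logged as it arrives.
var observedNames: [String] = []
let spyToken = NotificationCenter.default.addObserver(
    forName: nil, object: nil, queue: nil) { note in
    observedNames.append(note.name.rawValue)
    print("Notification: \(note.name.rawValue) userInfo: \(String(describing: note.userInfo))")
}
```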

Clearing the console before tapping play on the video led me to find that there actually is a notification that gets fired from the web view — it's called SomeClientPlayingDidChange.

Inside the notification’s userInfo dictionary lies an IsPlaying key, with exactly what we want.

This meant that now we could add an observer for this notification and a handler to do whatever it is we need to do:
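A sketch of what that observer might look like — the notification name and the IsPlaying key are from the experiment above, everything else is illustrative:

```swift
import Foundation

// Listen only for the undocumented notification and pull the IsPlaying flag
// out of its userInfo. Since the name isn't documented it could change in
// any OS release.
var videoIsPlaying = false
let playbackToken = NotificationCenter.default.addObserver(
    forName: Notification.Name("SomeClientPlayingDidChange"),
    object: nil, queue: nil) { note in
    if let isPlaying = note.userInfo?["IsPlaying"] as? Bool {
        videoIsPlaying = isPlaying
        // Disable or re-enable the app's features here.
        print(isPlaying ? "Video started playing" : "Video stopped")
    }
}
```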

Caveats: this notification doesn't appear to be documented and could change at any time. It probably will. At the time of writing this works on iOS 8.x/9.x; however, since we're simply listening for the notification, the app might lose this functionality, but it shouldn't lead to any crashes.

Two Things I Learned About My Phone by Paulo Fierro

Today I learned two things about my iPhone I didn’t know you could do. I’m not sure how long they’ve been iOS features, but I wasn’t aware of them — for all I know they’ve been there since the beginning.

The first involves receiving a call. Any time someone calls me, whether it's a regular phone call or FaceTime, my desk explodes into a cacophony of ringtones, all slightly out of sync with one another, and sometimes just about loud enough to knock me out of my chair. It's close.

Today I found this screen in iOS 9, Settings > Phone > Calls on Other Devices.

I unselected the devices that I'm signed into but don't want to ring. So simple, so useful.

The second involves receiving a call while using Bluetooth headphones. I’ve been bitten by the wireless headphone craze and have a pair of Beats Studio Wireless headphones that I use at my desk (they’re great) and Powerbeats 2 while on the go.

However, every time my phone rings and my wireless headphones are connected, I have to tap the Audio button and select the headphones as the audio source. This means that whenever I get a call I have to say “hold on, one sec” while I route the audio to the headphones — it's a little annoying.

Choose your own audio route

Today I found a gem hidden in Settings > General > Accessibility > Call Audio Routing.

If you set this to Bluetooth Headset the audio routes automatically and the problem goes away.

Very cool.

How not to lose Apple Watch Activity + achievements when migrating to a new iPhone by Paulo Fierro

As an app developer I have lots (hundreds?) of apps installed; some I use, and others are there so I can see how certain unique features work or to compare implementations of certain ideas.

As an app developer I also upgrade my iPhone every year in order to test and build against the latest hardware. I always set my phone up from scratch, as a new device, and take this time as a spring cleaning of sorts. When I picked up my 6s Plus last Friday I realised that this would not be an option this time around.

This was because of two reasons:

  1. for privacy reasons health data is not backed up to iCloud
  2. an Apple Watch can only be paired with one phone at a time.

This means that I would lose all of my Activity data from my watch, but most importantly all of my Achievements too. I guess those Achievements are doing their job if I care this much about them — some are damn hard to get.

Achievements

So in order to not lose that data we have to do a little dance:

  1. Unpair your watch from the old phone. This automatically takes a backup of the watch data onto the phone — there is no way to do this manually that I’m aware of.
  2. Take a backup of the old phone. The fastest way to do this is via iTunes, and remember to set a password. Encrypted backups contain certain information other backups don’t, including your health data.
  3. Connect your new phone to iTunes and restore the backup you just made.
  4. Pair your watch to the new phone and restore the Watch backup when prompted.

Once this is complete you should be exactly where you left off with all your apps and settings including activity history and achievements.

In retrospect this was so much easier than setting it up from scratch that I think I'll be doing this in the future regardless.