Moving to SF Symbols by Paulo Fierro

I'm slowly replacing existing assets with SF Symbols wherever possible in an app I'm working on. To maintain a consistent look on iOS 12 and below, I'm replacing the existing 1x/2x/3x PNGs with a single PDF, by exporting the SVG from the SF Symbols app and bringing it into Sketch.

By naming the bundled assets the same as the system names, I can use this handy extension to avoid the "if available" dance every time:
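Roughly something like this (a sketch of the idea; the helper name here is mine):

```swift
import UIKit

extension UIImage {
    /// Loads an SF Symbol on iOS 13 and up, and falls back to a bundled
    /// asset with the same name on earlier versions.
    static func symbol(named name: String) -> UIImage? {
        if #available(iOS 13.0, *) {
            return UIImage(systemName: name)
        }
        return UIImage(named: name)
    }
}

// Usage: as long as the bundled PDF asset shares the system symbol name,
// this resolves the same way on iOS 12 and iOS 13.
let shareIcon = UIImage.symbol(named: "square.and.arrow.up")
```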

At first I wasn't sure if I wanted to update the older assets, but I think I prefer a consistent look regardless of iOS version, even if it does mean a bit of extra work. Moving to PDF assets also means fewer files to update and hopefully a smaller app size.

I ❤️ cocoapods-binary by Paulo Fierro

I've been using CocoaPods for dependencies for as long as I can remember. Personally I never really got along with Carthage, but some people prefer it.

Over the weekend I came across a post titled "Using CocoaPods with pre-built frameworks instead of source files"; the approach it describes has radically decreased the build time on some of my projects. It shows how you can set up the cocoapods-binary plugin to precompile your dependencies, which dramatically reduces the amount of time Xcode needs to build your app.
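For reference, the setup boils down to a few lines in the Podfile (a sketch based on the plugin's documented usage; the target and pod names are just examples):

```ruby
# Enable the plugin so pods are prebuilt once as binary frameworks.
plugin 'cocoapods-binary'

use_frameworks!
all_binary!   # prebuild every pod; you can also opt pods in individually

target 'MyApp' do
  pod 'Alamofire'
end
```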

In two small-to-medium-sized apps with 5-10 dependencies I saw up to 10x improvements: a clean build (with nuked derived data) went from about 90 seconds down to 9. 🤯🤯🤯

If you're using CocoaPods you really should try it out.

Making Get to the CHOPPER by Paulo Fierro

Yesterday I uploaded a video from back in October 2015, where Niqui and I surprised our guests that were visiting with a helicopter tour around our lovely island, Grand Cayman.

The video itself was shot on three cameras – a GoPro HERO3 Black, my trusty iPhone 6s Plus with an Olloclip and Niqui’s iPhone 6s. The footage looked good, but unless you’re familiar with the island it’s not that easy to figure out where we are at any given point in time.

Luckily Chris was wearing his new Ambit and tracked our flight path. This is exactly what I needed.

The flight path

The idea was to show the map of the flight path with the image of a chopper showing our current position in the lower-right hand corner as the video played.

Enter Motion.

I’m very much a Motion newbie — I’ve used it a couple of times with results I’m mostly happy with, though they still look quite amateurish. Maybe that’s part of the charm.

Initially I was thinking maybe I could get the flight-path file, parse the GPS coordinates and build something automatically. Turns out it’s a lot easier to trace the path instead.

I used the flight-path image from Chris as a base and traced the path with the bezier tool. Once I was happy with that I launched Google Earth, zoomed in to the island to find the right location and took a screenshot. This was the background.

The background

Next I needed a helicopter. I found this lovely 1987 model from a TurboGrafx-16 game called J.J. & Jeff.

Chopper

Perfect.

Motion has a built-in behavior called “Motion Path” which I applied to the image. Motion then asked me to draw the path. I’d already done this once, so I didn’t really feel like doing it again. Much searching later I found that the motion path’s “Path Shape” property was set to the default, “Open Spline”.

"Open Spline", you're not the one that I want

"Open Spline", you're not the one that I want

Changing this to “Geometry” allowed me to provide my existing traced path as the “Shape Source”. Perfect! I now had my helicopter image moving along the path, which was exactly what I was looking for.

There are a few points where the helicopter changes direction, and it looked a bit weird for the chopper to fly backwards. OK, we need a horizontal flip. But there is no way to flip horizontally.

I thought maybe I could set up a keyframe and swap the image with one that was pre-flipped, but that seemed wrong.

Much DuckDuckGo’ing later I found that setting the “Scale X” property to -100% means the same thing as “flip horizontally”. Cool.

Export the movie and we’re done.

Nearly. That looks good, but it’s a bit boring. It would be much cooler if the helicopter was animated.

I opened up the original image in Pixelmator and copied pixels until I had three states: left blade, both blades with rotated rotor, right blade.

Layers of pixels

Chopped chopper. By no means perfect, but good enough

I couldn’t figure out how to animate all of these together so I created another Motion project which lasted 3 seconds, with each of the states above lasting 1 second. Looking back now I should have made it 3 frames instead.

I exported this as a QuickTime movie ensuring that I set “Color+Alpha” in the Render options and set the project background to be transparent.

Export settings

No matter what I tried, when I imported this into the main Motion project it wasn’t transparent. I searched and searched but found no solution so in the end I made the background a bright green and used the Keyer filter to get rid of it — that worked like a charm out of the box.

Well, mostly. In one part of the video the animated chopper had a visible left border.

An annoying visible border

To get rid of that I set the Matte Tools’ “Shrink/Expand” to -4.

Matte Tools > Shrink/Expand

This got rid of the border but now the chopper looked a bit thin. Nothing a good dose of drop shadow can’t fix.

Border be gone!

Looking good. To speed up the slow, 3-second animation I set the timing to 800%. And we’re done!

I exported the movie and brought it into Final Cut Pro X where I scaled it down to 25% and put it in the lower-right corner. At that size it was pretty hard to see the chopper, so back in Motion I scaled the animated chopper up to 200% and also added a Sharpen filter.

Finally, I made the clip as long as the edited shots to make syncing up a little easier and to keep the blades animating at full speed. Otherwise it would have looked odd because I’d have had to stretch the 10 seconds out to 3 minutes. Now it’s actually done.

Back in FCPX land the rest of the video was put together. It was basically two multi-cam clips but of course I hadn’t considered how we were going to sync up the video shot on different cameras. Luckily FCPX can sync up video from multiple sources based on their audio signature. That’s some serious magic.

Color grading was done in ColorFinale, which is my new favorite tool for this.

And that's it. It was a hell of a lot of fun to make — I certainly learned a lot, though the most fun was definitely being on that chopper.

Listening for video playback within a WKWebView by Paulo Fierro

I’m currently working on an app for a client where certain features are disabled when a video starts to play. Specifically, a video that’s embedded in a WKWebView.

One way of doing this would be to inject JavaScript into the page, like I’ve written about before, and have it attach listeners to every <video> tag on the page. When a video changes its play state our listener could post a message to the app via webkit.messageHandlers[_HANDLER_].postMessage(_OBJECT_); and we would need a handler in the app to do whatever it is we want it to do.
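On the app side that approach would look roughly like this (a sketch; the "videoState" handler name and the injected script are illustrative, not what the app actually uses):

```swift
import WebKit

// Receives the messages posted by the injected JavaScript.
final class VideoStateHandler: NSObject, WKScriptMessageHandler {
    func userContentController(_ userContentController: WKUserContentController,
                               didReceive message: WKScriptMessage) {
        guard let isPlaying = message.body as? Bool else { return }
        print("Video playing:", isPlaying)
    }
}

func makeWebView() -> WKWebView {
    // Attach play/pause listeners to every <video> tag and report back to the app.
    let js = """
    document.querySelectorAll('video').forEach(function (video) {
        video.addEventListener('play',  function () { window.webkit.messageHandlers.videoState.postMessage(true); });
        video.addEventListener('pause', function () { window.webkit.messageHandlers.videoState.postMessage(false); });
    });
    """
    let controller = WKUserContentController()
    controller.addUserScript(WKUserScript(source: js, injectionTime: .atDocumentEnd, forMainFrameOnly: false))
    controller.add(VideoStateHandler(), name: "videoState")

    let config = WKWebViewConfiguration()
    config.userContentController = controller
    return WKWebView(frame: .zero, configuration: config)
}
```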

That's a little messy.

When a video starts to play in a WKWebView the <video> is replaced by the native video player. However, within the app we can’t get a reference to this video component in order to query its playing state, so I wondered if maybe this might just fire an NSNotification we could listen for.

I couldn’t find any documentation for this so I decided to listen for all notifications and see if I could find one that would do the trick.

To do that I added a notification center observer with no specific name, which translates to give me ALL of the notifications:
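In today’s Swift that looks something like this:

```swift
import Foundation

// Passing nil for the name means the block fires for every notification
// posted to the default center, so we can just log them all.
let token = NotificationCenter.default.addObserver(forName: nil,
                                                   object: nil,
                                                   queue: .main) { note in
    print(note.name.rawValue, note.userInfo ?? [:])
}
```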

Clearing the console before tapping play on the video led me to find that there actually is a notification that gets fired from the web view — it’s called SomeClientPlayingDidChange.

Inside the notification’s userInfo dictionary lies an IsPlaying key, with exactly what we want.

This meant that now we could add an observer for this notification and a handler to do whatever it is we need to do:
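A sketch of what that could look like, assuming the IsPlaying value bridges to a Bool:

```swift
import Foundation

let name = Notification.Name("SomeClientPlayingDidChange")

let observer = NotificationCenter.default.addObserver(forName: name,
                                                       object: nil,
                                                       queue: .main) { note in
    guard let isPlaying = note.userInfo?["IsPlaying"] as? Bool else { return }
    // Enable or disable the relevant app features based on playback state.
    print("Video is playing:", isPlaying)
}
```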

Caveats: this notification doesn’t appear to be documented and could change at any time. It probably will. At the time of writing this works on iOS 8.x/9.x. However, since we’re simply listening for the notification, the app might lose this functionality in the future, but it shouldn’t lead to any crashes.

Two Things I Learned About My Phone by Paulo Fierro

Today I learned two things about my iPhone I didn’t know you could do. I’m not sure how long they’ve been iOS features, but I wasn’t aware of them — for all I know they’ve been there since the beginning.

The first involves receiving a call. Any time someone calls me, whether it’s a regular phone call or FaceTime, my desk explodes into a cacophony of ringtones, all slightly out of sync with one another, and sometimes just about loud enough to knock me out of my chair. It’s close.

Today I found this screen in iOS 9 under Settings > Phone > Calls on Other Devices.

I unselect the devices that I’m signed into but don’t want to ring. So simple, so useful.

The second involves receiving a call while using Bluetooth headphones. I’ve been bitten by the wireless headphone craze and have a pair of Beats Studio Wireless headphones that I use at my desk (they’re great) and Powerbeats 2 while on the go.

However, every time my phone rings and my wireless headphones are connected I have to tap the Audio button and select the headphones as the audio source. This means that whenever I get a call I have to say “hold on, one sec” while I route the audio to the headphones — it’s a little annoying.

Choose your own audio route

Today I found a gem hidden in Settings > General > Accessibility > Call Audio Routing.

If you set this to Bluetooth Headset the audio routes automatically and the problem goes away.

Very cool.