Intro to UIFontPickerViewController

A number of new font-management APIs were announced at WWDC 2019, including a standard system font picker for iOS and Catalyst apps (finally!). The session Font Management and Text Scaling covers the new APIs in detail; however, the sample code only exists on the slides and the documentation is currently very sparse. I thought it might be helpful to write out the sample code in a blog post, along with a little explanation.

UIFontPickerViewController is a customizable system font picker that, by default, has the following configuration:

  • displays all system fonts (excluding any fonts you include in your app bundle that are specific to your app only)
  • shows only font families and not individual faces
  • uses a WYSIWYG presentation

To change these defaults, you can create a configuration object, initialize the font picker, and present it like so:

let config = UIFontPickerViewController.Configuration()
config.includeFaces = true
let fontPickerViewController = UIFontPickerViewController(configuration: config)
fontPickerViewController.delegate = self
self.present(fontPickerViewController, animated: true)

To display all font names using the system font, set displayUsingSystemFont to true.
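In code, that’s one more line on the configuration object:

config.displayUsingSystemFont = true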

You can also filter the font list by supplying an array of symbolic traits. For example, to display only monospaced fonts, you would do something like this:

let traits: UIFontDescriptor.SymbolicTraits = [.traitMonoSpace]
config.filteredTraits = traits

There are two delegate methods that allow you to control what happens after a user chooses a font or cancels font selection.

extension MyCustomViewController: UIFontPickerViewControllerDelegate {
    func fontPickerViewControllerDidPickFont(_ viewController: UIFontPickerViewController) {
        guard let fontDescriptor = viewController.selectedFontDescriptor else { return }
        let font = UIFont(descriptor: fontDescriptor, size: 48.0)
        textView.font = font
    }

    func fontPickerViewControllerDidCancel(_ viewController: UIFontPickerViewController) {
        print("User selected cancel.")
    }
}

Finally, if you want to display user-installed fonts, you need to add an entitlement to your app. In Xcode 11, go to your project settings, click “Signing & Capabilities,” and add a new capability called “Fonts.” Under the Fonts entitlement are two options next to “Privileges.” Check the box by “Use Installed Fonts” to indicate that you want your app to read fonts that were installed by other apps.

App Launch Map

This originally started out as a brief link-style post to Aleen Simms’s recently released guide for preparing to launch an app, App Launch Map. However, I feel I have enough to say about it to warrant something a little more substantial.

I volunteered to be a beta reader for App Launch Map. To be honest, I wasn’t expecting to glean a lot from it, as I had already launched an app and had read just about everything I could possibly find about indie app promotion, from endless lists of marketing strategies to wild success stories and painful postmortems. What more was there to say?

I mean, there’s no formula for success in the App Store. This is often distressing to people like me, who literally spend most of their time thinking about and crafting formulas in code. There’s hope, though, in that marketing is largely about storytelling. Storytelling, like programming, is a craft that one can improve upon with practice and guidance. That’s where App Launch Map comes in.

Along with a bunch of really practical advice about things like building your product page, creating a website and press kit, and contacting journalists, App Launch Map begins with an incredibly useful section on crafting your app’s story. If you don’t have a strong, cohesive story surrounding your app—what it does, why it matters, who it’s for—it’s darn near impossible to talk about it to others in a clear, confident, and convincing way.

As I read through the guide on my iPad Pro, I opened up a blank note next to it and began to apply the step-by-step writing prompts to Snapthread. I quickly realized that Snapthread’s focus had changed so much from version 1.0 that I had sort of lost track of its narrative. As a result, all of my marketing efforts have suffered, from my screenshots to my emails to the press.

Aleen’s guide has brought me clarity. I just spent 3-4 hours creating new screenshots for Snapthread, and for the first time, I’m proud of them. Next on my list is a new press kit.

If, like me, you struggle with marketing, go check out App Launch Map. It’s $40 and includes all future updates for free. If you’re an indie dev with slow sales and are thinking about throwing some money into advertising, consider buying this instead. It’s very likely that there are things you can do to improve upon your existing product page/website that will help you more than buying ads.

How to Flip an App for Profit

Late last year, David Barnard wrote a great post entitled “How to Game the App Store,” outlining some of the numerous hacks that developers can use to send their apps soaring up the App Store charts. This post is written in that same spirit, which is to say: I don’t recommend actually doing any of these things if you have any aspirations of being a decent human being.

Yesterday, after browsing the Top Free Photo/Video apps chart, I posted a tweet in frustration.

It sort of blew up (is there a word for getting fireballed, but on Twitter?), and some of the replies indicated a bit of ignorance or misunderstanding of what exactly the problem is with apps like Background (which, at least as of 3pm CST today, has been removed from the App Store).

Background used to be a good app. You can tell from its early reviews that its users genuinely enjoyed browsing and making use of its hand-curated selection of iPhone wallpapers. In fact, its reviews are generally positive up until late June, when an update began causing some issues. From that point on it becomes clear that Background is no longer owned or updated by its original developer. It’s been flipped.

So how does an app get flipped? Read on to discover the ultimate secret to making millions on the iOS App Store.

Step 1: Find a good app with a long history of positive reviews and purchase it from the developer

This isn’t too hard if you’ve got the cash. Background is just one example of many apps that, while beloved by a small community of users, struggled to attain a wider reach. It was acquired by Tron Apps, probably sometime this summer. From what I can tell, the worst app-flipping offender in the Photo/Video space is Bending Spoons, LLC. A few of the apps they’ve purchased include Splice, Photo Editor (purchased under the name Easy Tiger Apps) and Video Maker with Music Editor. In the reviews, you can find the exact moment when everything went to crap.

Step 2: Riddle the app with ridiculously expensive subscription options

The gold standard seems to be a 3-day trial that moves into a $9.99/week subscription, but there’s flexibility here, depending on precisely how evil you want to be. Make sure to hide these new payment options from your pre-acquisition users. After all, you don’t want them updating their glowing past reviews. Oh, and for those new users you’re about to acquire? Make sure it’s darn near impossible for them to find the “x” to close your subscription view (or, for fun, make it completely nonfunctional!).

Step 3: Design the perfect false advertisement

You’re going to need a super polished video that absolutely does not show any actual footage from your app. In fact, feel free to steal images from other apps. For a good example, view Background’s Facebook ad, helpfully listed on SensorTower.

Step 4: Spend a stupid amount of money on user acquisition

That beautiful bundle of lies you just made? Blast it on Facebook, Snapchat, Instagram, in-app ad networks, whatever you can find. Buy search ads. If you’re not spending between $10k and $500,000k (oh, who are we kidding? There’s no upper limit), you’re not doing it right. This should help you get on the Top 200 Free chart, and hopefully into the top 50 or so, which tends to create its own buzz.

Step 5: Buy fake reviews

Once the number of one-star reviews written by real humans whose time and money you’ve wasted starts to actually affect the app’s overall star rating, it’s time to buy some fake ones. For some examples of how they’ll look, check out “PicsHub -Art Effects & Get old.” I don’t actually know how you go about buying fake reviews but it’s probably not hard. Just google “how to be the biggest wanker” and that should get you close.

l e g i t i m a t e

Step 6: Profit!!$$!

Congratulations! You’re now rich, and I despise you.

Memory Mischief

A few days ago, as I was attempting to locate and fix Snapthread’s worst shipping bug ever, I encountered a side quest. I haven’t spent much time using Instruments for debugging, mostly because it rarely works for me: either the run terminates, or Instruments itself hangs or crashes. However, I had managed to start an Allocations/Leaks run, and was watching everything hum along nicely; that is, until I decided to tap on a project that Snapthread automatically generated for me. It contained a bunch of large still photos taken with my DSLR, and each time one of them was composited for insertion into the final video, memory usage climbed until the app crashed.

I checked the allocations. Hundreds of persistent megabytes were in the VM:IOSurface category. Exploring further, the responsible callers were either “FigPhotoCreateImageSurface” or “CreateCachedSurface” and the stack traces were filled with methods prefaced by “CA,” “CI,” or “CG.” After perusing Stack Overflow, I wrongly assumed that the objects were being retained due to some internal caching that CIContext was doing. After all, I had wrapped my photo/video iteration in an autoreleasepool, and I was only using a single CIContext. The memory should have been dropping back down each time, right?

Here’s what happens when Snapthread creates a video out of a still photo:

  1. If the photo needs to be letterboxed, it is first run through a CIFilter that blurs it, converts it from a CIImage to a CGImage, and sets it as the content of a CALayer. The original photo is then added to its own layer and positioned over the top of the blurred layer.
  2. If the photo doesn’t need to be letterboxed, it’s simply set as the content of a CALayer.
  3. The layer from step 1 or 2 is then composited over a short blank video using AVVideoCompositionCoreAnimationTool.
  4. The AVVideoCompositionCoreAnimationTool is assigned to the AVMutableVideoComposition, which is then assigned to an AVAssetExportSession.
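Here’s a rough sketch of how steps 3 and 4 fit together (the function and its setup are illustrative, not Snapthread’s actual code):

import AVFoundation
import UIKit

// Composite a CALayer over a blank video track (illustrative sketch).
func makeVideoComposition(photoLayer: CALayer, blankVideo: AVAsset) -> AVMutableVideoComposition? {
    guard let track = blankVideo.tracks(withMediaType: .video).first else { return nil }

    // The animation tool requires a video layer nested inside a parent layer.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: track.naturalSize)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(photoLayer) // the layer from step 1 or 2

    let composition = AVMutableVideoComposition(propertiesOf: blankVideo)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    return composition
}

The returned composition is what gets assigned to the export session’s videoComposition property.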

Now allow me to let you in on a little secret: my code is terrible. Can you guess which object calls the code that I just described above? It should be some sort of model controller object, but nope. It’s the “StillPhoto” object itself. Currently, each video and still photo is responsible for providing its own final formatted version to the video compositor. Yeah, I know. I’ll fix it eventually. Continuing on…

After a few hours of trying to figure out how to purge that internal cache, I started to entertain the idea that maybe it was me retaining the objects somehow. I ran the allocations instrument a few more times and discovered a curious stack trace. It highlighted AVAssetExportSession’s “exportAsynchronously(completionHandler handler: @escaping () -> Void)” as retaining over 500MBs. Huh?

Then it hit me. I was storing a reference to the export session, in case it needed to be cancelled. Y’all. Let me say it again. I WAS STORING A REFERENCE TO THE GOSH DARN EXPORT SESSION WHICH STORED A REFERENCE TO THE FLIBBERTY FLANGIN’ ANIMATION TOOL WHICH STORED A REFERENCE TO A ZILLION MEGABYTES WORTH OF IMAGE DATA. And I never set it to nil after the session ended.

fjdisoajefiajfdsjafdslafdadshfuewahfsoik

Can I fire myself?
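For the record, the fix is a one-liner. A sketch, assuming a stored exportSession property like mine:

// Illustrative fix: break the retain chain once the export finishes.
exportSession?.exportAsynchronously { [weak self] in
    // ...handle the session’s status and any errors...
    DispatchQueue.main.async {
        self?.exportSession = nil // releases the animation tool and its image-laden layers
    }
}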

Wrapping Algorithms in Empathy

One feature I’d like to add to Snapthread is something akin to the “For You” section of Photos: a small number of auto-generated movies that the user might find useful or meaningful. In Photos, these movies are based on common image properties such as date, location, relationships gleaned from facial recognition and contacts data, and image content classified via machine learning such as dogs, cats, and bodies of water.

I don’t have access to the facial recognition data that Photos collects, and as anyone who’s had the pleasure of syncing their iCloud Photo Library to a new device knows, feeding tens of thousands of photos through an image classification algorithm takes a long time and can burn processing and battery power like nobody’s business. That leaves me with two methods for grouping photos and videos: date and location.

Attempting to smartly group a user’s photos by location without knowing where they consider “home” is pretty much impossible. Are those vacation photos, or just bathroom selfies? I don’t know. I don’t want to know. That leaves me with the safest, least creepy option: capture date.

At the surface level, organizing photos for a user based on their date seems super innocuous—that is, until we stop to recall what a disaster it was for Facebook when they first presented auto-generated “Year in Review” videos to every single user. While some people smiled at memories of a happy year, others were intensely and abruptly reminded of lost loved ones, breakups, illnesses, and other emotional events. In fact, it was re-reading Eric Meyer’s heartbreaking blog post about it that made me pause and think twice about adding this feature.

Some years just aren’t good years. There’s no way for an algorithm to know that. There are, however, steps I can take as a developer to design my “For You” feature with the worst-case scenarios in mind:

  1. Make the feature opt-in. This would involve unobtrusively asking the user if they’d like to see some suggested movie projects. The preference would also be available as a toggle in the app’s settings.
  2. Don’t auto-play anything. Even if a user has opted in, they may not want to view a particular suggested movie for whatever reason. I don’t want to force it on them.
  3. Make the whole “For You” section collapsible. Maybe a user just doesn’t like that particular day’s suggestions. Let them hide the thumbnails so they don’t have to look at them.
  4. Make the movies editable. Maybe there’s just one or two videos that ruin an otherwise great movie. Let users delete/replace them.
  5. Don’t add any titles or captions that suggest a particular mood, like “Summer Fun” or “My Great Year” etc. Just stick to dates.

There are two types of auto-generated movies I’d like to offer: ones based on recent photos/videos (such as “last month” or “today”) that are designed to get users up and running quickly, and memories from a while ago, such as “on this day.” I don’t think the recent ones need safeguards: after all, those are photos you’d see if you opened up your library anyway. It’s the ones from years ago that I need to be sensitive about.

Curating someone’s personal memories is challenging. At best, it can surprise and delight; at worst, it can be painful, invasive, and just downright creepy. We app devs may not have to take any sort of Hippocratic oath, but we probably should. If, like me, you’re considering adding some kind of personalized content to your app, tread carefully, and design with worst-case scenarios in mind.

iOS 13 Summer Plans

Yesterday I finally had some time to sit and think about what improvements I want to make to Snapthread this summer. I still want to rewrite the app using SwiftUI; however, after a bit of exploration, I think I may need to wait until it’s a little more capable. Here’s what I’m planning to do instead.

Phase 1

I want to leave the app in a good place for my iOS 11 and 12 users. To do that, I want to add a few more soundtracks to choose from and a tool for adjusting a video clip’s speed.

Phase 2

Based on everything that was revealed at WWDC, here’s what I want to do after I set the minimum OS target to iOS 13:

  • Rewrite my UICollectionView code to use the new compositional layout class and diffable data source
  • Redesign my photo library picker. Apple has deprecated the methods I was using to fetch “Moments,” so I will need to do something else to help users find the photos and videos they’re looking for.
  • Explore some of the new property wrappers, like @UserDefault (see the sketch after this list)
  • Replace my icons with SF Symbols and design a few custom ones
  • Replace my colors and font sizes with semantic ones and set the whole app to use dark mode
  • Use the new system-provided font picker
  • Possibly rewrite two view controllers in SwiftUI: Settings and Soundtracks
  • If I have time, create some more custom Core Image filters

Doing everything on that list should help rid my code of most of its remaining bugs and set the app up well for the future. I can’t wait to get started!

Personal Takeaways from WWDC 2019

Wow, what a conference, eh? Like most, I’m still processing the many new frameworks and APIs that Apple presented to us last week. So far I’ve watched 12 session videos, taken copious amounts of notes, and spent lots of time thinking about what all of this could mean for my app. As such, this post will be an attempt to organize those thoughts.

SF Symbols

When I wished for more standard system icons that could be used anywhere, I definitely did not expect Apple to deliver over 1500 of them. I feel particularly validated by Apple’s instructions for creating custom icons: find a symbol in SF Symbols that resembles what you’re looking for and edit the SVG. I feel validated because that’s exactly what I’ve been doing to create all my icons in Snapthread, except that my custom icons are based on a $25 set of 200 icons from Glyphish. Browsing through SF Symbols, I think I can replace nearly all of my icons with them, with maybe two exceptions.

The Big Functionality Giveaway

One huge point that nearly every presenter hammered on was that if you follow the Human Interface Guidelines and use Apple’s frameworks as-is, you get a TON of functionality for free. In fact, one major goal of SwiftUI is to handle all of the basic features of your app for you, so you can focus on perfecting your app’s cool, custom features. For example, if you use SwiftUI correctly, the system will automatically handle animating view changes beautifully. If you use semantic colors, Dark Mode just works. Localization behaviors for right-to-left languages, Dynamic Type—these are all things you get for free if you use Apple’s semantic font sizes and SF Symbols.

I think it was Mike Stern who said something like, “if you spent time recreating what UIKit gives you for free, with custom controls, you may want to…I don’t know how else to say this…stop doing that.” Launch Storyboards, resizable interfaces, and support for split view multitasking will all be requirements starting in April of 2020. I don’t think the message has ever been clearer: follow the HIG, use the tools we’ve given you, be a good platform citizen. Just do it.

The New Peek & Pop

If you haven’t watched “What’s New in iOS Design,” you should. Peek and pop have become “contextual menus” that are now available and accessible on all devices. “Use them everywhere!” Mike says in the session. Apple wants these contextual menus to be so pervasive that their users expect to find them all over the place. An important thing to note is that any functionality placed into a contextual menu should also be accessible from elsewhere in the app. There are convenience methods for adding contextual menus to table view and collection view items, which I plan to use so that users can perform common actions on video clips in their timeline. Overall, I think this is a great change.
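As a sketch, the collection view convenience looks like this (the “remove clip” action is a made-up example):

// UICollectionViewDelegate method for contextual menus (iOS 13+).
func collectionView(_ collectionView: UICollectionView,
                    contextMenuConfigurationForItemAt indexPath: IndexPath,
                    point: CGPoint) -> UIContextMenuConfiguration? {
    UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
        let delete = UIAction(title: "Remove Clip",
                              image: UIImage(systemName: "trash"),
                              attributes: .destructive) { _ in
            // Remove the clip at indexPath; the same action is reachable elsewhere in the app.
        }
        return UIMenu(title: "", children: [delete])
    }
}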

Dark Mode

I wasn’t particularly excited about dark mode prior to the conference because my app, like most other video and photo editors, already has a dark appearance. However, now that I’ve learned more about it, I really like the way Apple’s colors, fonts, and new “materials” adjust to trait changes. For instance, if you use semantic background colors, there are slight variations for “base” and “elevated” states. Apps are elevated when in split view multitasking so that the black separator between apps can be seen more clearly, and controllers and views are considered elevated when they are presented modally. The whole system seems well thought-out, and I plan to adjust my code to use semantic background and font colors, as well as the new “materials” options, and then simply force the whole app to use dark mode (which, incidentally, is as easy as setting the UIUserInterfaceStyle key in your Info.plist to Dark).

Combine

I…don’t understand Combine yet. I mean, I sort of do. I don’t feel like I need to understand it yet, though, because there are only a few places in Snapthread where I could make use of it. I observe values on my AVPlayerItems, keep track of a few UserDefaults, and listen for maybe a handful of Notifications. Anyway, I’m sure it’s really awesome; I just need to re-watch the videos and read a few more articles before I can grok it.

Collection View Improvements

Collection views got a major API upgrade this year with completely new ways to lay them out and configure their data sources. Like SwiftUI, the new layout API is both compositional and declarative. The most common crash in Snapthread has to do with the collection view inside my custom photo/video picker, and I still haven’t managed to figure out what’s causing it. This probably sounds terrible, but I’m hoping that by using these new APIs, the problem might just go away!
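To give a flavor of the new API, here’s a minimal compositional layout: a single section of four-across square items (the sizes are arbitrary):

func makeLayout() -> UICollectionViewLayout {
    // Each item takes a quarter of the group's width and the group's full height.
    let item = NSCollectionLayoutItem(layoutSize: NSCollectionLayoutSize(
        widthDimension: .fractionalWidth(0.25), heightDimension: .fractionalHeight(1.0)))
    // Each horizontal group spans the section width; its height matches one item, so items are square.
    let group = NSCollectionLayoutGroup.horizontal(layoutSize: NSCollectionLayoutSize(
        widthDimension: .fractionalWidth(1.0), heightDimension: .fractionalWidth(0.25)),
        subitems: [item])
    return UICollectionViewCompositionalLayout(section: NSCollectionLayoutSection(group: group))
}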

In fact, I’m hoping a whole pile of layout-related bugs will be eliminated, which brings me to…

SwiftUI

My code sucks. It just does. I’m inexperienced, I’ve had no mentors or code reviews (by choice—I’ve had offers from many great people!), and there are fundamental concepts of programming that I only have a tenuous grasp of, at best. Despite my best efforts, I’ve utterly failed at using the MVC model. My views are all up in my model’s business, I probably have delegates where I don’t need them, or, on the flip side, other weird hacky ways of communicating between view controllers (like via viewWillDisappear and unwind segues, and all sorts of odd places) where I should have just used a delegate.

With SwiftUI, I feel like I can finally just burn it all to the ground. SwiftUI makes sense to me because it is declarative, and I love it because it forces its views to rely on a single source of truth. One of the items on my wishlist was for “every visual customization that is possible for a UI component [to] be editable in Interface Builder.” As a modern replacement for Interface Builder, SwiftUI delivers on this request with gusto. There’s a TON of advanced drawing stuff you can do with SwiftUI and all of it is immediately preview-able without building and running the app. That blows my mind!

SwiftUI has some missing pieces. There’s no control that provides the functionality of a collection view. You could probably hack together some HStacks and VStacks, but you wouldn’t get caching or cell reuse. For now, UICollectionViews can be wrapped in a UIViewRepresentable-conforming object to be integrated into SwiftUI. If you’re working with videos, you still have to work with AVPlayerLayers. Live Photos are still previewed in a PHLivePhotoView. I’m sure there are many other frameworks that make use of UIKit classes as well.
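For example, a minimal UIViewRepresentable wrapper around an AVPlayerLayer might look like this (a sketch, not production code):

import SwiftUI
import AVFoundation

struct PlayerLayerView: UIViewRepresentable {
    let player: AVPlayer

    // A UIView whose backing layer is an AVPlayerLayer, so the layer resizes with the view.
    final class PlayerUIView: UIView {
        override static var layerClass: AnyClass { AVPlayerLayer.self }
        var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
    }

    func makeUIView(context: Context) -> PlayerUIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        return view
    }

    func updateUIView(_ uiView: PlayerUIView, context: Context) {
        uiView.playerLayer.player = player
    }
}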

Still, my urge to re-write Snapthread is strong. By re-writing most of the app to use SwiftUI, I’m confident that I’ll be able to edit it on the go next year, when a first-party, Xcode-like code editor will likely arrive on iPad. I’m also confident that it’ll be way less buggy, and way easier for future me to understand, since all dependencies will be so clearly defined. I’ll try to share some of my new SwiftUI knowledge as I go!

I’ll have to drop support for iOS 11 and 12. Before I do that, I want to add one more feature and maybe some more music to the soundtracks list. It’s going to be a busy summer!

Preparing for the macOS 10.15 Beta

A few years ago around WWDC time I made the mistake of installing the fresh new beta of OS X on my only Mac. Shortly after that, I needed to submit an update to one of my apps…only to find out that you can’t submit release builds to App Store Connect (then iTunes Connect) from a beta version of macOS. After some furious googling, disabling system integrity protection, and editing some plist that I was undoubtedly not supposed to touch, I tricked Xcode into thinking I was using the previous version of the OS. Lesson learned.

Since then, I’ve waited until September to update my Mac. This year, however, is different. This year is Marzipan.

So, I took to Twitter and asked for recommendations for external SSDs. Several people recommended the Samsung T5 Portable SSD, so that’s what I got. Fortunately it arrived today, just in time to install macOS 10.15 on Monday!

If, like me, you’ve never run macOS from an external drive, I found some very good instructions over at Macworld. I’m looking forward to exploring all the new features (and Marzipan apps!) that macOS 10.15 will bring without worrying about messing up my main development environment. How about you? Will you be installing the new macOS beta next week?

WWDC 2019 Developer Wishlist and Keynote Predictions

WWDC is now just two weeks away, so I thought I’d share what I’m hoping for in the way of developer tools/APIs.

UIKit Updates

  • A standard system font picker
  • A standard system color picker
  • An easier way to implement multi-column layouts that doesn’t involve nesting UISplitViewControllers
  • A keyboard navigation system for iOS that would allow users to navigate table views and collection views using the arrow keys
  • Greater variety of stock UI components and customization options for components on iOS (i.e. stop making us reimplement an expanding table view cell with a rotating arrow, or a circular progress indicator, or write dumb hacks like adding a blank UIImage as a navigation bar’s “shadow” just to get rid of that line beneath it)
  • More standard system icons that can be used both inside and outside of tool bars and navigation bars (like, on regular UIButtons)
  • A visual redesign of all current stock components on iOS, which, in general, are boring, ugly, and inaccessible
  • More Apple framework classes should be made Codable-compliant (e.g. UIColor).
  • I want to be able to force a dark appearance for all elements in my app. Snapthread’s never going to have a “light mode,” but I want to make use of iOS 13’s dark mode appearance for things like action sheets, pop ups, picker views, switches, etc., regardless of the user’s display setting.
  • A cleaner way to check if a device is in landscape orientation than checking the orientation of the status bar
  • Support for rearrangeable buttons in a toolbar/nav bar on iOS

Xcode Improvements

  • Interface Builder improvements: basically, every visual customization that is possible for a UI component should be editable in Interface Builder. This includes layer customizations like corner radius, border, and drop shadow.
  • There should be more information than just a progress bar when uploading to App Store Connect from Xcode. At least Application Loader tells you the upload speed and how many MBs are left.
  • Closing the new Music app should not be a requirement for installing Xcode.
  • A way to hide/dismiss warnings in Xcode

Photo and Video

  • Developer access to the more recent Apple-made Core Image filters, such as Silvertone, Dramatic, Vivid, etc.
  • A “What’s New in AVFoundation” session…there hasn’t been one since 2016
  • There are rumors that Apple will enable developers to train machine learning models on device. I’d like to see examples of that applied in photo/video contexts.

These are just my wishes… now for a few predictions.

WWDC Keynote Predictions

There are rumors that Apple is planning to grant developers access to its real-time document collaboration API. I think this is going to be a big talking point, along with the ability for apps to have multiple “instances” open at the same time (i.e. being able to compare two Microsoft Word documents in split view on iPad). I’m guessing there’s going to be more than one demo showing off these features, and so my prediction is that Microsoft is going to demo a new Office update, and Adobe will be there to show off real-time collaboration in XD or something like that (somebody’s gotta compete with Figma, right?). Or Affinity maybe? I imagine some sort of photo editor or design/drawing app will at least be shown to demo the new floating panels on iPad.

I have no doubt that Tim Cook, or whoever is doing the watchOS presentation, will mention Apple’s ECG app—how it’s already saved lives and will continue to roll out to more countries. None of the new rumored health apps seem demo-worthy, but there are rumors of new Siri intents, and I’m betting at least some of those are going to get demoed on the watch—event ticketing, flight/gate information, etc.

I have no clue what third party Marzipan apps we’ll see onstage. Darkroom seems like a good candidate for Mac, as well as Ferrite and LumaFusion. Some weather apps might be interesting, but not that interesting. I’m stumped. Instagram? Instagram for iPad and Mac? (lololol) Games don’t make much sense because engines like Unity already enable devs to make cross-platform games.

Perhaps the biggest mystery of all is how Apple is going to fit all of this into a ~2 hour keynote. Close your eyes for a moment and think these words in Tim Cook’s voice: “iOS 13, with a gorgeous new dark mode…” I mean, you know they’re going to want to spend 15 minutes just talking about dark mode. Judging by Bloomberg and 9to5Mac’s rumor dumps, they’re only going to have about two. There’s just too much to cover. Apple has shown that they can put together a really tight keynote though, so my prediction is that we’ll see another fast-paced, throw-a-bunch-of-features-on-a-slide-and-move-on presentation.

What about you? What are your predictions? Feel free to share your own thoughts/blog posts with me on Twitter @bhansmeyer, or Micro.blog @becky.

Snapthread 2.0 is Now Available!

Snapthread 2.0 is live on the App Store as of yesterday, and so far I’m very happy with how it’s been received. I thought I’d write a bit about some of the new features, and my ideas for the app going forward.

I already wrote about Snapthread’s new data recovery feature, which, to be honest, has lifted an emotional burden that’s been weighing on me ever since I got my first negative review. I feel like I can finally relax and just focus on the cool features I want to add instead of obsessing over every little crash, a few of which have been out of my hands.

Another one of my design goals for 2.0 was to make the app even more user-friendly. It’s possible that Apple will make iOS 13’s default stock buttons look more like…well, buttons… but I didn’t want to wait until June to find out. So, I brushed up on my very basic graphic design skills and got to work making some buttons in Affinity Designer. They have gradients, and drop shadows, and noise, and I think they look nice. The dark gray buttons you see in the app have an image background that has been sliced so that it can easily expand to fit its contents. On iPad, most buttons have both an icon and a text label.

I also moved several buttons to new locations. It bothered me that the top navigation bar had an unbalanced number of icons on each side, so I decided to take two common actions, share and reset, and move them closer to the bottom of the screen. I also heard from some users who wanted a separate “save” button apart from the one in the share sheet, so I added that as well. To regain some space for the video preview, I moved the aspect ratio button to the navigation bar.

Earlier I wrote about how I wanted to refactor the entire app to use container view controllers. Instead of popping up a new modal view controller every time the user selected an editing tool, I wanted to gracefully transition between view controllers by fading them in and out. Now, the app’s main view controller has three containers: a small one at the top for displaying banners and progress bars, a middle one for displaying video content, and a bottom one for displaying various controls. For the iPad version, a fourth container runs vertically along the right side. I’m still working on making the code cleaner (it’s kind of a tangled mess of delegates right now), but it works, and it feels much snappier to me.

Prior to 2.0, there was no way to tell which clip in your timeline was currently playing. Now when you tap the play button, every thumbnail in the timeline darkens slightly except for the currently playing clip. There was also no way to know what a clip’s duration was while you were trimming it…now, there’s a handy label with that information (there’s still more I need to do to make trimming a better experience, but this should help at least a little!). You can now adjust the crop rectangle of landscape and portrait videos when you’re not using letterboxing; previously, that feature was only available if you selected a square aspect ratio. And speaking of square videos: they can now be letterboxed as well. Before, if you added a square video or photo to your project it would force the aspect ratio to be square.

I use a ridiculous hack to get a dark appearance for this action sheet. I’m hoping iOS 13 will eliminate the need for that!

The iPad version now includes a bunch of keyboard shortcuts. A full list of them can be found over at Snapthread’s spiffy new website. One of my wishes for WWDC is for a way to use the keyboard to navigate a collection view. In other words, you could use the arrow keys to say, zip through your photo library and select/preview images and videos. There’s currently no way to implement that (at least that I can figure out), so you still have to reach up and poke the screen to select your clips.
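For anyone curious, shortcuts like these come from overriding keyCommands on a view controller (the play/pause binding below is illustrative, not Snapthread’s exact list):

override var keyCommands: [UIKeyCommand]? {
    [UIKeyCommand(input: " ", modifierFlags: [],
                  action: #selector(togglePlayback),
                  discoverabilityTitle: "Play/Pause")]
}

@objc func togglePlayback() {
    // Play or pause the preview player.
}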

Last but not least, you can now rotate photos and videos, add text overlays, add a bounce effect (from within the Loop tool), and add filters. There’s a mix of Apple’s built-in filters and a few I created myself, named after towns in Nebraska. I also did my best to recreate Apple’s Silvertone filter, as it’s not available to developers (at least not that I could find!). Creating Core Image filters by chaining a bunch of adjustments together is kind of fun, and I definitely plan to add more filters to the list.

I have a long list of improvements to make and features I’d like to add to Snapthread in the future. Some of them I’d like to keep under wraps (just in case they don’t work out, ha), but others are just more basic things you would expect from a video editor: stickers, speed adjustments, an optional Ken Burns effect, etc. I’d also like to make improvements to some of the existing tools before going wild and adding more. For instance, adding text overlays can be a little janky at times. iCloud downloads can be canceled but not restarted, which is frustrating. Trimming could be more precise. The crop tool could allow zooming.

Now, it might be that none of that seems particularly remarkable, and you might wonder why Snapthread 2.0 is a big deal at all, as most video editing apps already have the same basic tools (and more!). It’s a big deal to me because I’m only one person, because I’m still mostly a beginning programmer, and because I really care about it. It might be a big deal to you because it’s one of only two or three apps on the store that merge Live Photos, and now, it’s better than ever.

5 Localization Resources for Indie Devs

Let me first say two things. First, these tools aren’t very helpful if you have a lot of text in your app. By “a lot,” I mean long sentences, tutorials, lengthy error messages, etc. Second, my app has enough text that I had to ask for help. I can’t afford translation services, but I have three wonderful volunteers who have helped me make Snapthread available in Spanish, Italian, and French.

That said, I usually find the following resources helpful whenever I attempt to translate single words and short phrases on my own.

1. Babble-on’s “Missing Apple iOS Localization Term Glossary”

A long name for a great resource containing nearly 300 common words and phrases found in iOS apps with translations in the following languages: English, French, Spanish, German, Italian, Portuguese, Japanese, Korean, Dutch, Russian, and Chinese (Simplified & Traditional).

2. Linguee

Linguee is a language dictionary available in quite a few languages. One of the neatest features of Linguee is that if you search for a term, it lists other websites where that term was translated under the heading “External sources.” There it shows a side-by-side of the website’s text in both languages, with your term or phrase highlighted. Linguee warns that these external sources are not reviewed, but you can look at the URLs and judge for yourself. For instance, if I search for “photo library” in the English-German dictionary, I can find instances of its translation on websites from Corel, Sony, Snapfish, and more.

3. Boomerang by Illuminated Bits

Boomerang is an iOS app by Ish ShaBazz and Heidi Helen Pilypas. It helps automate a common task: using Google Translate to translate something to another language, and then translating the result back into the original language, just to double check. I like to use Boomerang as my final check after translating a word or phrase.

4. Apple’s Support Documents

If you’re wondering what Apple calls some of its own features, apps, and technologies in other languages, you might try checking out the help pages at support.apple.com. At the bottom of every page is a country name with a flag icon next to it (mine says “United States”). Selecting it allows you to choose another language to display the page in. Often, the names of features, apps, menus, and buttons will be capitalized, or in a bulleted or numbered list so they’re easy to find.

5. Other Apps: Beg, Borrow, and Steal

Look, being an indie with a shoestring budget is hard. If you’re looking for really common words or phrases like “Exporting,” “Frequently Asked Questions,” or “Restore Purchases,” consider finding a big name app that already did the work for you. This requires a good memory (or a lot of screenshots), as you’ll need to change your device language, navigate through your chosen app, write down (or screenshot) the translation of any strings you need, and then make your way through Settings to switch back to your native language. It’s not for the faint of heart!

Conclusion

If you want to do a really thorough job localizing your app, you’ll probably need to enlist the help of a professional translation service (you might also consider bribing a bilingual friend). A really thorough job would involve translating your app subtitle, description, keywords, screenshots, and every string in your app, including VoiceOver strings. However, if your app is mainly driven by icons and gestures with very little text, the resources and ideas above may be helpful to you!

Data Recovery in Snapthread 2.0

tl;dr: The Codable protocol truly is great for lightweight data storage.

For me, the hardest part about being an app developer has been hearing from users who have lost work (and precious time) due to bugs in my app. Video editing apps in general are notorious for crashing, and I believe it’s partially due to Apple’s poor documentation of AVFoundation and its myriad error codes. There’s also a foggy concept of “media pipelines” that are managed by iOS itself, are used for playing and exporting media, are completely undocumented, and are limited in number based on, from what I can tell, device chip. If you exhaust those pipelines, stuff just doesn’t work. It’s frustrating.

I’ve been largely resistant to the idea of adding project management to Snapthread. It’s an added layer of complexity that feels beyond the scope of Snapthread’s mission to be a light, casual video editor. Perhaps worst of all, it invites a developer’s most dreaded user expectation: syncing. I knew I had to find a way to save users’ work without building a big, heavy data layer.

I decided to explore the Codable protocol, and slowly began conforming all of my model data to it. I quickly ran into some roadblocks, but nothing too serious. First, UIColor and CMTime do not conform to Codable. However, CMTime can be made to comply with a little tweaking, and UIColor just needed a wrapper struct. For title cards, I needed to store a thumbnail image and possibly a custom background image. I tried converting them to a Base64 string and saving them that way, and it worked great!
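Here’s the shape of that UIColor workaround, as a sketch (the wrapper’s name and fields are illustrative):

import UIKit

struct CodableColor: Codable {
    let red: CGFloat, green: CGFloat, blue: CGFloat, alpha: CGFloat

    init(_ color: UIColor) {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        color.getRed(&r, green: &g, blue: &b, alpha: &a)
        (red, green, blue, alpha) = (r, g, b, a)
    }

    var uiColor: UIColor { UIColor(red: red, green: green, blue: blue, alpha: alpha) }
}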

I decided not to cache any other media files like photos, videos, or music. Instead, I simply save identifiers that will allow Snapthread to refetch them. Since they’ve already been recently downloaded from iCloud (or were already local on device), refetching takes a trivial amount of time. Additionally, nearly all edits a user makes to a video clip are stored as parameters and applied at the time the final video is composited, so no actual permanent changes are made to the original files.

Another problem I ran into was protocols not playing nicely with Codable (because they don’t conform to themselves…whatever that means). I ended up creating some base structs for my protocols as a workaround. I hate adding extra layers of abstraction like that, but it worked, and now all of my model classes like Clip, StillPhoto, and Title, all of which conform to a protocol called Threadable, can easily be encoded from an array of Threadable.

I used a helper class called Storage by Saoud M. Rizwan to easily cache the user’s current project by saving and retrieving a single .json file to/from the Documents folder. Snapthread caches the project every time a change is made and only deletes the cached data when the user saves or shares the video. Therefore, if the app crashes (or is force quit) before the video is exported, it will offer to recover the project when the app is opened again.
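At its core, the caching step is just Codable plus a file write. A minimal sketch using Foundation directly (the file name is illustrative):

import Foundation

func cacheCurrentProject<T: Encodable>(_ project: T) throws {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("currentProject.json")
    try JSONEncoder().encode(project).write(to: url, options: .atomic)
}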

I’m really hoping this cuts down on the number of frustrated users. Personally, I don’t mind if an app crashes as long as I can resume what I was doing with little to no effort. This should allow Snapthread users to do that.

Journey to 2.0: Market Appeal

Snapthread’s journey to version 2.0 has been one of gradually broadening market appeal. If you recall, version 1.0 was an extremely niche product, aimed at people who took lots of short, vertical videos and wanted to merge them together. Over time I added the ability to merge Live Photos, and then added support for landscape videos. I thought I’d market Snapthread as a cool app for stitching Live Photos together, but quickly learned that even Live Photos are pretty niche.

Recently, after spending a couple hours studying the top charts in the Photo/Video category, I realized that Snapthread could be a great, useful app for just about everybody. In the reviews of some top video editing apps, users bemoaned shady subscription practices, being constantly nagged to buy in-app purchases, and being asked to give 5-star ratings in order to use features (for the record, it’s impossible for devs to unlock features based on star rating). I don’t do any of those things with Snapthread. In fact, Snapthread doesn’t collect any analytics, store any user data, or contain any subscription options. Just a plain old “premium unlock” IAP and, in 2.0, a tip jar.

Based on my research, I know there are some things I need to do to help Snapthread gain exposure. I need to add “Video Editor” to Snapthread’s name to help it show up in search results. I need to add “slideshow maker” to its keywords, because apparently people are hungry for simple slideshow creation apps. Despite Apple’s guidelines, I need to add device frames and descriptions to my App Store screenshots, because that’s what all my competitors do (including Clips by Apple).

I also need a new elevator pitch. The app isn’t “just” for Live Photos anymore. So what is Snapthread? Snapthread is a simple, casual video editor and slideshow maker that works with videos, still images, and Live Photos. It’s fun, fast, and user-friendly.

So how will Snapthread 2.0 compare to Clips, which is free?

  • Snapthread supports multiple aspect ratios (portrait, landscape, or square); Clips only supports square.
  • Snapthread imports Live Photos as videos; Clips treats Live Photos as still images.
  • Snapthread allows you to mute, loop, bounce, rotate, trim, and crop clips; Clips supports muting and trimming.
  • Snapthread doesn’t have project management or an in-app camera.
  • Clips has Live Titles, selfie scenes, animated title posters and stickers; Snapthread has non-animated title posters.
  • Both apps only support one music track per video.
  • Neither app has options for transitions between videos (e.g. wipe, cross-dissolve, etc.)

I’m going to do my best to continue to improve Snapthread with things like stickers and transitions, and the ability to select a specific portion of a music track.

This shift in thinking about Snapthread has been exciting for me, and I’m really looking forward to marketing it as a mass market product.

Journey to 2.0: Container View Controllers

One of the major structural changes I need to accomplish for Snapthread 2.0 is switching to the use of container view controllers. For those who don’t know, container view controllers allow you to embed one or more view controllers inside a parent view controller, which can then manage transitions between its children. UINavigationController and UITabBarController are examples of container view controllers in UIKit, but you can also create your own.
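The UIKit boilerplate for embedding a child in code is mercifully short; a minimal sketch (a method on the parent view controller):

// Standard containment dance: add the child, install its view, notify it.
func embed(_ child: UIViewController, in container: UIView) {
    addChild(child)
    child.view.frame = container.bounds
    container.addSubview(child.view)
    child.didMove(toParent: self)
}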

I’ve never used custom container view controllers before, so of course I hit Google to see what I could find. John Sundell has a great introduction to the topic and I really liked the series by Mike Woelmer as well. (Edit: I also meant to include this fantastic article by Ben Sandofsky) The first thing I learned was that if you want to be able to switch between child view controllers, you should probably set them up in code instead of Interface Builder, which only allows a single embed segue between parent and child. I wasn’t ready for that though, so I decided to take a baby step and find a situation where a single parent-child relationship made sense.

Snapthread’s main view controller is…well, massive. I’m too embarrassed to tell you how many lines of code it is, but after combing through it, I realized at least 400 lines were devoted to setting up the AVPlayer, handling play/pause events, observing the status of the current AVPlayerItem, scrubbing, adding the watermark, etc. Clearly, the video player was a good candidate for having its own view controller.

So, I created a VideoPlayerViewController class and began copying and pasting everything relevant to setting up the AVPlayerLayer, displaying time elapsed/remaining, scrubbing, etc. In Interface Builder, I added a new view controller and copied over the video preview view, which is a custom UIView class that resizes its own AVPlayerLayer, and the player controls.

I deleted the video player and controls from my main view controller and replaced it with a container view from the Object Library. I hooked it up to my new view controller using an embed segue.

Next, I had to figure out how to communicate between my main view controller and the video player. Communicating between the player and its parent was easy; I just set up a delegate with methods for responding to changes in AVPlayerItem status and duration (if it exceeds a certain duration, the UI displays a warning that the user needs to purchase Premium in order to export). I set the delegate using prepare(for segue:), which is called when the child VC is embedded.
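Something like this, with an illustrative segue identifier:

override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    // The embed segue fires automatically when the child is embedded.
    if segue.identifier == "EmbedVideoPlayer",
       let playerVC = segue.destination as? VideoPlayerViewController {
        playerVC.delegate = self
    }
}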

There were times when I needed to tell the player to do something from the main view controller, however, such as hide its controls or clear its watermark. I wasn’t quite sure how to handle that. Using notifications was one option, but it just didn’t feel right for some reason. I ended up storing a reference to my VideoPlayerViewController and referencing it directly. That’s probably bad practice, but I’m pretty sure it’ll be okay, as I don’t plan on using that particular video player with any other view controller.

Overall, I feel slightly more comfortable using container views now, and I think I’m ready to tackle the next step: transitioning between child view controllers. I plan to post more about new things I’m learning; each of those posts will be prefaced by “Journey to 2.0” if you’re interested (or want to ignore them, lol).

Organizing 2019 with Capsicum

I’ve never been a heavy user of task management apps. I use Reminders for things like our shared shopping list and occasional to-dos and make checklists in Notes for features I want to add to Snapthread. I don’t use a bug tracker, I don’t track any of my habits, and I don’t do any daily journaling. There are many beautifully-crafted apps that are well-suited to accomplish each of those tasks, but the cognitive load involved with getting started is just too much for me right now. That’s why I was excited when developer Ish ShaBazz and designer Heidi Helen Pilypas announced that they were creating a new kind of digital planner called Capsicum.

Screenshot of Capsicum's daily view

Capsicum is like a daily planner, to-do list, habit tracker, and notebook all rolled into one. It takes me back to the days when I would linger in that corner of Barnes & Noble near the registers, the one with all the notebooks and calendars and Moleskines, and drool over all the beautiful planners I couldn’t afford. Capsicum is all about customization and aesthetics: you can choose a cover for your planner, select a primary color and style for the decorative “tape” that divides each section, choose from a fun variety of fonts for headings, and more.

Maybe that’s what I love most about it—the skeuomorphic design. When you open your planner, it literally looks like a paper planner (well, if a paper planner could magically show you the local weather forecast at the top of the page!). When you flip between days, it uses that nice page curl animation. The tabs that allow you to switch between daily, weekly, and monthly views look like real notebook tabs. It’s delightful!

I joined the Capsicum beta early on in its development, but couldn’t quite figure out how it could fit into my daily life. Then, a few weeks ago, something finally clicked.

Habit Tracking

I had been feeling a bit discouraged about how little time I had to blog, work on Snapthread, and generally do things that I enjoy. However, I was simultaneously encouraged that I was finally able to complete a year-long Bible reading plan in 2018. It then dawned on me that if I was able to inch my way through the Bible a few chapters at a time for 365 days, I could similarly work toward other goals in my life by completing a bunch of really small tasks.

I set up a list of things I wanted to track in the Habit Tracking section of Capsicum—things that would improve my health, like eating fish (apologies to my vegetarian readers) and flossing my teeth, and things that would make me feel happy and accomplished, like writing code and blog posts. I don’t have any particular goals in mind, I just want to track how often I do all of those things and see if the numbers themselves might either motivate or encourage me. So far it’s motivated me to open Xcode and make little tweaks to Snapthread, even if just for five or ten minutes, nearly every day this month. It may not be much, but it’s something I can feel good about.

Screenshot of habit tracking in Capsicum
Yes, I changed my notebook color just for this screenshot

Anecdotes

I don’t think Ish and Heidi meant for Capsicum to be used as a daily journal, but I’ve found myself jotting little anecdotes in the Notes section below my daily to-do list. They’re mostly funny things Charlie said or did, and honestly I’m not really sure why I’m doing it because there’s no easy way to go back and view them. However, I’ve failed at every other attempt to use an app like Day One or even a paper journal, so, I guess it’s better than nothing!

App Development

It’s not that I don’t try to keep track of bugs and feature requests for Snapthread. It’s that they’re stupidly spread out in a bunch of different places. Some of them are e-mails flagged as important, others are scribbled in a paper notebook next to my recliner, and still others live in a checklist in Notes. Yeah. It’s ugly, folks. I decided to consolidate that mess into two checklists in the “Loose Leaf” section of Capsicum. That way, they live extremely close to my daily to-do list, which is just a couple tabs away. I can simply look at my feature request list, break a feature down into bite-sized tasks, and add those tasks to my to-do list.

To-Dos

I really love Capsicum’s daily, weekly, and monthly views. The daily to-do list is nice enough, but when I switch to the weekly tab I can see all the things I accomplished (or didn’t) for the entire week. The monthly tab allows me to set goals for the month and see a list of birthdays. Capsicum will optionally migrate unfinished tasks to the next day, which is handy for filling me with shame (How many more days will I push off writing that thank-you card to my aunt?).

What I’ve learned from using Capsicum throughout the past few weeks is that I value the simplicity and convenience of having all these functions in one place over the robust feature sets of separate apps. It makes me happy to hear that Ish and Heidi are planning to add even more “modules” to Capsicum in the future.

Capsicum is available as a $1.99/month or $19.99/year subscription, with a 14-day free trial. From what I understand, you are not automatically charged at the end of the trial, so you can check it out without worrying about canceling if it’s not for you. If it is for you, make sure you give it some App Store rating/review love! It helps us indies out a lot.