Every day I drive down a small stretch of our town’s Main Street in order to drop my kids off at the elementary school, which is nestled a block over in the midst of a tightly packed residential area, full of quirky old houses and frustratingly narrow streets. The town has around 7,000 residents and, contrary to popular belief about small-town Nebraska, is wonderfully diverse. A sizable chunk of the population is Hispanic (which is the primary reason I’m trying to learn Spanish)—a relatively recent shift in demographics from the primarily German/Czech heritage in this area. When I first moved here years ago I could still feel the tension in the air from that shift, but it seems to be steadily dissipating; the community has shown willingness to come together and forge an entirely new identity, which I love.

The city has been working on revitalizing its aging downtown area. Many businesses now have new awnings, building facades have undergone historical renovations, and new trees have been planted along the sidewalks. Still, many spots are sitting empty and looking generally trashy, which is something I’d love to change.

I’ve been dreaming of having a little shop in this town for a long time, but I was never sure what exactly I would want it to be. A bookstore? A café? A few weeks ago an idea cropped up in my mind and has been nagging at me ever since: what about a yarn shop? I could offer a place to buy supplies, gather, take classes: a lovely third space for people who like to knit, crochet, sew, quilt, weave, whatever. There could be coffee, lots of plants, comfy couches.

I don’t know if it will ever happen, but if it does, it will be largely because of YarnBuddy. Working on this app has given me confidence that I can offer something helpful to the world. My husband thinks it’s a cool idea. We’ve even been mulling over the idea of buying some alpacas and making our own yarn. We have the land, the time, and the means…why not? Anyway, I wanted to put this dream down in words because if it does come true…well, this is where it started.

This blog will always be primarily about Apple, technology, and app development, but I don’t suppose you’d mind a little occasional farm-related content? After all, we all need to get out and touch grass now and then. :)


We’re already eleven days into the new year, and here I am writing a New Year Reflection Post of sorts. One thing I’ve noticed this year (and in past years, but this year especially) is that a lot of you have exchanged a concrete list of resolutions for an overall theme word. I think that’s a fine idea, so I’ve decided that my word for this year is rediscovery.

There is a kind of brain fog that feels unique to early parenthood, and yet I know it isn’t. It’s the same sort of disorienting haze that envelops anyone who finds their time is not really their own, but rather has been sacrificed to another purpose, voluntarily or otherwise. You lose pieces of yourself, little by little, often unnoticed, until one day you begin to emerge from your experience without the faintest idea of who you are, or even who you used to be.

My son is now in school full-time, and my daughter for half a day. I have stretches of time when my harried mind can complete full thoughts, my long-tensed muscles begin to relax, and memories start to return. What did I do before I had kids? Oh, yeah, that’s right. I had hobbies! I took photographs and read books, listened to music and played video games. I noodled around on the piano and guitar, drew pictures, treated myself to good food and other little niceties.

It’s my fault that I let all of those things get buried these last few years, as I’m sure I could have made time for at least a few of them (and in a way, I did: the December Photo Project has been the only hobby-project I’ve stubbornly held on to all this time). The trouble is, I have a tendency to hyper-focus on things, and so my family and my apps received 100% of my energy.

This year is about rediscovering the person I once was, in The Before Times (before kids, before Covid, before we all became so jaded by…*gestures wildly*), and deciding what bits of that person I want to incorporate into who I am now.

More concretely, I’d like to read at least seven books—three of which will be the Lord of the Rings trilogy, which I read in high school but would like to reread. That’s up from zero books in the last six years(!). I want to continue learning Spanish with Duolingo (my username is bhansmeyer if you’d like to be accountability pals! I’ll give you all the high fives). I want to clear out all of the sample data in YarnBuddy and start filling it up with my own crochet projects. I want to dust off my Nintendo Switch and play video games again.

I realize all of this has Big Bilbo Energy (“I want to see mountains again, Gandalf! Mountains!”), but that’s exactly how I feel: like I want to shed the cares of the last few years and go on an adventure. I know many of you are hoping to do the same in one way or another, and I wish you all the best!

2022 In Review

App Releases

Hyper-focusing does have its perks: the past year was an especially good one for my app business.

In addition to bringing YarnBuddy to the Mac, I began to hand-craft a database of yarns which was launched in YarnBuddy 2.0. In order to allow users to submit new yarns to the public iCloud database, I created a “PendingYarn” entity. My original intention was to copy over the data from these user-created entities one by one into a corresponding public entity that only I can write to, making sure to research and fill in any empty fields for the sake of data completeness. However, I’ve instead found it far more useful to use the yarn database submissions as a way of determining which brand/type of yarn I should add to the main database next. If I see 100 submissions of a particular line of Lion Brand yarn, I know I should go ahead and add that yarn to the database in every color. I start with a Numbers spreadsheet, which gets converted to a TSV file and then to JSON before getting exported en masse to iCloud via a helper class.
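The TSV-to-JSON step of that pipeline could be sketched roughly like this. To be clear, `YarnRecord` and its fields are placeholders I’ve invented for illustration, not YarnBuddy’s actual schema, and the real helper class presumably goes on to wrap each record in a `CKRecord` for the public iCloud database:

```swift
import Foundation

// Hypothetical record type — the field names are illustrative only.
struct YarnRecord: Codable {
    let brand: String
    let name: String
    let colorway: String
    let weight: String
}

// Parse a TSV export (first row = headers) into records.
func yarnRecords(fromTSV tsv: String) -> [YarnRecord] {
    let rows = tsv.split(separator: "\n").map {
        $0.split(separator: "\t", omittingEmptySubsequences: false).map(String.init)
    }
    guard rows.count > 1 else { return [] }
    return rows.dropFirst().compactMap { fields in
        guard fields.count >= 4 else { return nil }
        return YarnRecord(brand: fields[0], name: fields[1],
                          colorway: fields[2], weight: fields[3])
    }
}

let tsv = """
Brand\tName\tColorway\tWeight
Lion Brand\tWool-Ease\tFisherman\tMedium
"""
let records = yarnRecords(fromTSV: tsv)
// Encode as JSON, ready to be mapped onto CKRecords and saved in batches.
let json = try! JSONEncoder().encode(records)
```

From there, a helper class can walk the decoded records and save them to the public CloudKit database in batches, which keeps the Numbers spreadsheet as the single source of truth.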

Creating such a database from scratch has been time-consuming, but so far, worth it. Since there is no comprehensive barcode database for yarns, I’ve had to get a bit creative. Sometimes I can find barcodes embedded in the HTML of retailer websites. Other times, I’ve resorted to scouring Etsy and Ravelry for photos of the skeins of yarn, hoping to catch an angle where I can clearly read the label and barcode number. I’ve even scanned a few codes at my local Walmart, which has a rather abysmal yarn selection.

Since most of my users are from the United States, I’ve focused mainly on popular brands here. However, my goal is to get many of the most popular European and Australian brands into the database by the end of this year.


On the financial side, things are looking pretty good for YarnBuddy. Since its launch in July of 2020 through the end of 2022, it has generated about $20k in proceeds. That amount is almost perfectly split between the yearly subscription and the one-time “lifetime” purchase. On Christmas morning, the app hit 500 subscribers. I’m hesitant to set this year’s goal at 1000 because it sounds impossible to me, but maybe goals are meant to seem that way?

Bar chart of YarnBuddy’s lifetime proceeds
Screenshot from RCKit, showing YarnBuddy’s climb to 500 subscribers on Christmas morning


I experimented with Search Ads quite a bit this year. What I ultimately learned is that while they may cause a small uptick in downloads, they aren’t the right fit for YarnBuddy.

On a whim one day I made an Instagram account for YarnBuddy. I followed the knitting and crocheting hashtags and started liking photos of projects that I thought were pretty or neat (or pretty neat). One day I accidentally liked and followed so many people that I got briefly locked out (I was just excited!). I used my first name in the bio for the account and began using it as if it were my own personal crochet-focused Instagram.

I didn’t really expect anything to come of it; I was just enjoying scrolling through the account’s timeline and appreciating everyone’s skill. Instead, I started to see an increase in new trials being started. Downloads began to slowly increase. I think I may have finally found my people in a place where I actually feel comfortable interacting (unlike Reddit 😬)! My plan is to continue working on my own crochet projects and posting about them, as well as showing photos and videos of how I’m using the app to help me. This is honestly my favorite kind of marketing: genuine, low-key, and nearly impossible to measure in any kind of useful way 😂. Hopefully I’ll have more good things to say about how it’s worked out by the end of the year.

Artificial Intelligence

A few months ago, when everybody and their brother was messing with DALL-E, I decided to see what ideas it could come up with for an icon for YarnBuddy. In particular, I wanted to create a holiday-themed icon, but was having trouble imagining what that might look like. DALL-E spit out a few cute ideas; one had a red yarn ball with a Santa hat, another had snowflakes in the background. I ended up redrawing the Santa one from scratch in Affinity Designer, using the same colors but fixing all of the AI glitches. I’m happy with the end product, but feel a bit leery about the process. It feels like…stealing? I don’t know. Anyway, here’s a side-by-side of the original and my recreation.

Looking Ahead

Ever since I launched YarnBuddy, one customer service issue has continued to plague me: data loss. I’m not sure how many people have contacted me to say that they’ve lost all their data, but I’d guess it to be fewer than 20. Still, every single one of them is gut-wrenching. These people paid for my product, and I let them down.

It seems to happen most often after a lightweight migration in Core Data (YarnBuddy uses NSPersistentCloudKitContainer). During the migration, the local database is wiped for some reason, and is supposed to redownload data from iCloud, but it just…doesn’t. It’s so frustrating, and it’s the single biggest cause of my impostor syndrome.

If I accomplish anything this year, I want it to be this: a robust automatic backup system. All of my Core Data model entities have a corresponding “draft” struct for non-destructive editing purposes. Most of those draft structs conform to Codable, so it shouldn’t be much work to back everything up in a JSON file. The trouble lies in the bigger data blobs: photos and PDFs. How should I back them up? Won’t they take up a ton of space? Should I zip everything together?

Also, when/how often should I backup user data? Is this something I should use Background Tasks for? I’m more than a little confused about the whole execution part of it. I’m sure I’ll figure it out eventually.
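For the JSON side of it, a minimal sketch might look like the following. `ProjectDraft` and its fields are invented for illustration (my real draft structs already conform to Codable, so the shape would be similar), and the big-blob question is dodged by storing filenames rather than the photo/PDF data itself:

```swift
import Foundation

// Hypothetical draft type — a stand-in for the app's real Codable draft structs.
struct ProjectDraft: Codable {
    var title: String
    var notes: String
    // Photos and PDFs are referenced by filename rather than embedded,
    // which keeps the JSON small; the files themselves could be zipped
    // up alongside it in the same backup folder.
    var photoFilenames: [String]
}

struct Backup: Codable {
    var createdAt: Date
    var projects: [ProjectDraft]
}

// Write a timestamped backup file into the given directory.
func writeBackup(_ projects: [ProjectDraft], to directory: URL) throws -> URL {
    let backup = Backup(createdAt: Date(), projects: projects)
    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
    let url = directory.appendingPathComponent("yarnbuddy-backup.json")
    try encoder.encode(backup).write(to: url, options: .atomic)
    return url
}
```

As for scheduling, one option is registering a refresh task with BGTaskScheduler, but simply writing a backup when the app moves to the background (or after any batch of edits) might be simpler and more predictable than wrestling with background execution.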

A couple other things I’d like to add to YarnBuddy this year are a dedicated area for managing pattern PDFs, and a Spotify Wrapped-style summary of knitting/crochet accomplishments.

I also really want to ship the SpriteKit gardening game I started making for my son (I originally had the word “finish” in place of “ship,” but I have so many ideas for it that honestly I’d better just ship it and add updates later, lol). It’s like 90% done in terms of being in shippable condition, so I’m just on that last 10% that takes foreeeeever. But I think I can do it! I’m definitely not expecting it to be a money-maker; it’s just for fun.

One final goal for this year is to blog more. I’ve been hanging out on Mastodon a lot, and have it set up so that this blog automatically posts to Micro.blog and then cross-posts to Mastodon. It definitely feels good to have a home for my writing that I control.

What are your goals for this year? What accomplishments from 2022 are you proud of? You can always reach me on Twitter, Micro.blog, or Mastodon (preferably Mastodon).

The Friends We Made Along the Way

I was in middle school when I really fell in love with the Internet. I’d flirted with it a bit prior, creating my own Geocities site in homage to the popular virtual pet software “Petz” by P.F. Magic and browsing artwork created with Bryce 3D and early versions of Maya. But it was in a Friends (yes, the TV show) message board of all places that the web became my home. At 13, I was one of the youngest members of the forum, but no one seemed to mind. We discussed the show, accumulated a multitude of silly in-jokes, and eventually began to share bits and pieces of our lives. It was there that I—a little girl from middle-of-nowhere Nebraska—met people from Germany, Turkey, Brazil, Czechia, the Netherlands.

Our conversations drifted off the forum and into AIM and ICQ. We began to send each other snail mail: postcards, candy unique to our countries, even mixtapes. I have them all tucked in a box somewhere—a precious memory of those formative years. Eventually we took on various roles within our community, and developed traditions. For instance, when it was someone’s birthday, someone else would be assigned to write a birthday “script” for them: an original Friends-style sitcom script, usually with a hilariously ridiculous plot, starring all of us. Two people from the forum (from different countries!) even fell in love and got married.

Then, Friends was over. We grew up, scattered. Years later we found one another on Facebook and Instagram, but it was never the same.

When I tweeted my way into the iOS community so many years ago, I felt the same energy and excitement, if not necessarily the same level of closeness. You all gave me the confidence I needed to keep going with programming when I felt like giving up. We’ve shared so many laughs, so many frustrations.

Many have been eulogizing Twitter these past couple of weeks, and I don’t have too much to add to what they’ve said. I know Twitter wasn’t great for a lot of people, but it was great for me. Even if we all manage to stay in touch through a hodgepodge of RSS feeds, Mastodon accounts, and Discords, things will never be the same.

And let’s be honest: Twitter was the best way to get Apple-related bugs fixed. Feedback/radar never was, and never will be, as effective as a few frustrated tweets, retweeted into oblivion. How will App Store Review injustices be rectified without us all rallying behind the little dev? In losing Twitter, we lose our ability to publicly shame corporations into doing the right thing…which, obviously, we never should have had to do in the first place.

And don’t get me started on the wealth of bite-sized information we’ll lose—with the departure of folks like Steve Troughton-Smith, much is already gone. Workarounds for SwiftUI bugs. Tips and tricks for Mac Catalyst. The feeling of solidarity in knowing that you’re not the only one getting dozens of CloudKit error codes.

I know we’ll figure all of this out in time. I simultaneously feel a sort of nervous optimism combined with a deep sense of loss. I’m going to enjoy blogging again. I’m going to miss liking all of your tweets. I hope we’ll stay Internet friends, in one way or another. As always, you can find me on Twitter until the lights go out, on Mastodon with the same handle (I’m on mastodon.social), on Micro.blog, lurking in the RelayFM Discord, and blogging right here on this site. My next post will be all about YarnBuddy (what I’ve been working on, how it’s doing, etc.), so stay tuned. <3

Oh, and a free app idea: yo, but for Bay Area earthquakes. You’re welcome.

Edit 11-14-22 12:42 p.m. CST: Comments are now open. I’ll probably open comments more often from now on.

Reflections on WWDC 2022

General Thoughts

Let me first say: I think it’s hilarious Apple didn’t even mention augmented reality this year. On one hand, it felt like very purposeful trolling; on the other hand, much of Apple’s unusually straightforward messaging this year was clearly designed to prepare developers for new form factors (larger iPads? AR/VR headset?).

During the Platforms State of the Union, Josh Shaffer (Senior Director, Swift Frameworks) gave an overview of Apple’s overall platform vision. He concluded with this statement, directed toward anyone creating a new app in 2022: “The best way to build an app is with Swift and SwiftUI.” Throughout the week, in both sessions and digital lounges, engineers from various teams continued to expand on that statement, reminding developers that AppKit and UIKit weren’t going anywhere.

Josh Shaffer standing in front of a large slide that says "The best way to build an app is with Swift and Swift UI"

For example, in the SwiftUI Digital Lounge, SwiftUI engineer Taylor Kelly wrote the following:

“Across all platforms, we’d recommend comparing your needs to what SwiftUI provides (so no hard rules/recommendations) — and keeping in mind that you can adopt SwiftUI incrementally.

Within Apple’s own apps on macOS, we’re ourselves using the full spectrum of approaches. From just a specific view/views in an app, e.g. in Mail, iWork, Keychain Access; to an entire portion of the UI or a new feature, e.g. in Notes, Photos, Xcode; and all the way to the majority of an application, e.g. Control Center, Font Book, System Settings.

But in the end, I’d recommend starting with a part you’re comfortable with and building up from there! You should look at SwiftUI as another tool in your toolset in enabling you to build the best apps you can. [emphasis added]”

One of the most important things Apple did during this year’s WWDC was untangle the mess of code paths for making a Mac app. Instead of native SwiftUI, pure AppKit, SwiftUI + Catalyst, and UIKit + Catalyst, we have the following:

  1. SwiftUI only, if you’re starting with an iOS app written primarily in SwiftUI. UIKit and/or AppKit can be sprinkled in as needed.
  2. UIKit + Catalyst, if you’re starting with an iOS app built primarily using UIKit. SwiftUI can be sprinkled in as desired.
  3. Pure AppKit. SwiftUI can be sprinkled in as desired.

SwiftUI gained some significant improvements this year in the areas of navigation, UI layout, and system integration (e.g., photo picker, share sheet). With these additions, my iOS 16 update for YarnBuddy should put it on par with what it would have been if I’d built it using UIKit. I know that sounds nuts, but that’s early adopter life! If you don’t need to support older OSes, this is the right time to jump into SwiftUI.

I was probably one of only a few people who let out an audible squeak of excitement when “What’s New in PDFKit” appeared in the session schedule. With a new API for adding a PencilKit overlay to PDF pages, I can replace my incredibly hacky PDFKit/PencilKit implementation (involving syncing up zooming and scrolling between a PencilKit canvas and a PDFDocument…ugh) with something that should work much more smoothly.

Useful Tidbits

While the sessions are always really helpful, I came across a few things while browsing sample code and the SwiftUI Digital Lounge that I wanted to share here, in case they help you too.

Flow Layout
I’m not sure why this wasn’t added as a built-in option, but if you’re planning to use SwiftUI’s new Layout protocol to create your own horizontal flow layout (e.g. a cluster of tags), you don’t need to reinvent the wheel. A FlowLayout is included in the new Food Truck sample app, and you can find it on GitHub.

Core Data + CloudKit
Apple has just updated Synchronizing a local store to the cloud to include some code for testing and debugging NSPersistentCloudKitContainer implementations. I’m also planning to copy Apple’s clever implementation of thumbnail caching for images stored in Core Data. I’m assuming this sample project is considered best practice, so it’s probably a good idea to take as much from it as possible.

Changing view presentation style based on size class in SwiftUI
This was possible pre-iOS 16, but I only just realized it after someone asked in the SwiftUI Digital Lounge. If you’d like to present a view as a sheet in one size class and a popover in another, you can do that by creating a view modifier that looks at the UserInterfaceSizeClass. I haven’t checked this out myself, but apparently you can find an example of this use case in the Bringing multiple windows to your SwiftUI app sample code, in ProgressEditor.swift.
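A modifier along those lines might look like this. This is my own sketch of the idea, not the code from Apple’s sample, and the `AdaptivePresentation` name is made up:

```swift
import SwiftUI

// Present the same content as a sheet in compact width and a popover
// in regular width, by reading the environment's size class.
struct AdaptivePresentation<PresentedContent: View>: ViewModifier {
    @Environment(\.horizontalSizeClass) private var sizeClass
    @Binding var isPresented: Bool
    let presentedContent: () -> PresentedContent

    func body(content: Content) -> some View {
        if sizeClass == .compact {
            // Compact width (e.g. iPhone, narrow Split View): sheet.
            content.sheet(isPresented: $isPresented, content: presentedContent)
        } else {
            // Regular width (e.g. full-screen iPad): popover.
            content.popover(isPresented: $isPresented, content: presentedContent)
        }
    }
}
```

You’d attach it with a small `View` extension wrapping `modifier(_:)`; the nice part is that the call site no longer cares which presentation style it gets.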

User Perspective

As an Apple product owner, there were several things that really caught my attention. The first was Shared iCloud Photo Libraries. My husband and I often AirDrop each other photos so that we can each have them in our libraries. We also have Shared Albums set up with each of our kids so they can see some of the photos we take on their own devices. Shared iCloud Photo Libraries will finally eliminate all of that needless sharing and shuffling.

The new Shortcuts APIs also look really promising to me. If apps can pre-bundle shortcuts that will automatically appear, that will drastically increase my use of them. Currently, I don’t have the time or patience to piece together my own shortcuts, even though I know they could end up saving me time.

Other things I’m looking forward to include improved sleep metrics on Apple Watch, new Lock Screen customizations, Messages editing and mark-as-unread, improved Mail search, and Stage Manager for iPad—although I won’t get to experience that until I upgrade to a future M2 iPad Pro.

Finally: Weather for iPad. Having the widget without the app was ridiculous, and I’m glad we can put that embarrassing situation behind us.

Summer Plans

My plan this summer is to finish up a quick pre-iOS 16 update for YarnBuddy that allows crochet and knitting projects to be exported as PDFs. Then, I’m going to more or less rewrite the app for iOS 16, adding in as many of the new goodies as I can. I plan to keep my old views around in a different code path so I can continue supporting iOS 14 & 15 though. My question for all of you is: what percentage of active devices (or sessions) serves as your threshold when deciding whether or not to drop support for an older OS version?

Anyway, YarnBuddy 2.0 will be available this fall, for iOS, watchOS, iPadOS and, for the first time, macOS. I’m hoping by the end of this year I’ll hit 500 subscribers, which is a pretty cool milestone to look forward to. Best of luck to all of you working on app updates (or brand new apps!) over the next few months!

What I’m Excited About/Hoping for at WWDC 2022

It’s kitten season here at the farm. As I type this, I’m sitting at the bottom of the stairs near the dining room, keeping an eye on a shoebox full of black and white furballs. We had to pry them from their safe spot under our deck so the exterminator could treat around our house for termites (ugh). In a couple hours, we’ll return them to their mom, who will likely find a different spot for them. (Note: I wrote this earlier today.)

The past few months have really flown by. In March, we decided to build a greenhouse against one side of our garage. My husband has a degree in building construction; he poured the concrete for the foundation on March 4 and we had plants in the greenhouse on March 29, which still amazes me tbh. It still needs a few finishing touches, but in the meantime we’re having fun learning how to grow a variety of veggies, fruits, and flowers. We’re hoping we’ll be able to keep it warm enough in the winter to continue our gardening hobby year-round.

Last month, my son finished kindergarten. I still can’t quite process how fast his first year of school went. Thankfully he loved it and can’t wait to go back!

All that’s to say: WWDC kinda crept up on me this year. It had always been my plan to shift gears at the beginning of June, pausing work on my SpriteKit game to prepare an update for YarnBuddy and take in all of the new goodies Apple will announce in just two days’ time. Some weird animation bugs have cropped up in YarnBuddy over the past few iOS point releases (thanks SwiftUI!), so I need to deal with those, as well as add the ability to export projects as nicely-formatted PDF files. Eventually, I need to start working on the foundation of a crowd-sourced yarn database.

There have been so many cool rumors swirling around the conference this year—those, combined with the in-person element and lack of developer-relations kerfuffles have contributed to what seems like an unprecedented amount of hype. Here are a few things I’m hoping to see, in no particular order:

  • A redesigned MacBook Air with an M2 chip. I’m hoping for a green or orange one, but Mark Gurman thinks the only new color might be blue, which would also be rad. I’m still hoping they’ll go with white bezels just for that pure iBook nostalgia.
  • More ways to customize the Home Screen or Lock Screen on iOS and iPadOS. I love the idea of an old school Mac-like Dashboard with live, interactive widgets. It’s also way past time to be able to insert blank spaces into our app icon layouts.
  • A first-party professional iPad app. There have been interesting rumors about yet another revamp of iPad multitasking, possibly involving (gasp) floating windows. What better way to demo this new “pro” mode than with a new pro app? Something that involves multiple windows and/or moveable floating tool panels, etc.
  • The announcement of Apple’s new standalone classical music app. I enjoy listening to classical (especially choral music) and I’m looking forward to having an app specifically optimized for that purpose.
  • SwiftUI improvements. I still don’t regret the decision to write YarnBuddy entirely in SwiftUI; however, it would be nice to be able to start phasing out some of the weird hacks and workarounds I’ve had to come up with to make the app look and work the way it does. For example, there’s still no true collection view equivalent. Navigation could use a re-think, or at the very least, some official guidance. Core Data integration is okay-ish, but it’s needlessly difficult to make it possible for users to sort and filter a fetch request. Any and all improvements are welcome, so I’m excited to see what the team has been working on this year (though I, like others, think SwiftUI needs to be decoupled from the annual OS upgrade cycle).
  • New and/or third-party watch faces. When I think of all the amazing designers I follow on Twitter, it makes me sad to imagine the gorgeous, fun watch faces they could come up with that will probably never see the light of day. I’m hoping this is the year Apple gives up its tight control over watch face design and gives designers a simple API for building and sharing (selling?) watch faces. Even if there are a limited number of third parties (Nintendo??), I’d call it a win.
  • Augmented reality stuff! As someone who gets really bad motion sickness, I’ve never been interested in virtual reality. Augmented reality, however, interests me. I can see its potential for good in areas like navigation, accessibility, and collecting Pokémon. I get a kind of uneasy feeling, though, when I consider the possible negative social consequences of wearing a computer on your face. Maybe I’ll have to write more about that someday.
  • Flat design is just…well, over. It’s been on the way out for a while, but it’s time for us to save the good bits and jettison the rest of it straight into the sun. In other words: here’s hoping Apple puts the final nail in the coffin on that weird chapter of mobile app design. I just want my apps to have personality again, you know? Not in a garish way, but a beautiful, fun way.

What are you hoping for this year? Although I won’t be there in person, I’m looking forward to watching the keynote at home and watching sessions whenever I can find a free moment. I really enjoyed the Digital Lounges last year, so I signed up for more this time around. I’m also hoping I can try at least one of the challenges.

I hope you all have an awesome week, whatever you’re doing! And don’t get stressed about all the new stuff…remember: the summer is long, the videos can be rewatched, you probably have to support older OSes anyway, and Apple won’t have half of their own bugs worked out until maybe next spring. So relax! Happy WWDC!

I’m Making a Game

I think this is the longest I’ve gone without a new blog post in several years now. Over the summer, I got YarnBuddy to a good, stable place—meaning people are now emailing me feature requests instead of bug reports. I still have a long list of features to add to YarnBuddy, so it is in no sense abandoned and I plan to begin working on it again soon, but in the meantime… I’m making a game.

People who have followed me on Twitter for a long time know that at one point, I was working on a game called Corgi Corral. It was an arcade-style game where you played as a corgi trying to round up as many wayward sheep as possible before a timer reached zero. It looked cute, but I had no knowledge of game design or how to tune the mechanics to make it truly fun, so I stopped working on it.

My son has been really into gardening/farming games recently. There’s one called Montessori Nature that he’s been playing on his iPad for years, but unfortunately it’s buggy and hasn’t been updated in ages. It also has several other points of frustration: namely, there’s not enough room to make a super gigantic garden, and the garden requires too much up-keep (plants wilt quickly, get eaten by pests, etc.). Charlie’s more interested in the design aspects of gardening—making something that looks cool, spelling his name in flowers, etc.

He has since downloaded and tried every single farming game from the App Store (I am not exaggerating). They are all either A) riddled with ads, B) super buggy, C) infested with IAPs or D) all of the above. Games like Stardew Valley, which are actually good, are too advanced for a 5-6 year old and include gameplay elements that he’s not interested in.

So, here I am, making a gardening game for kids. It will be called Charlie’s Garden, and it’s going to be the most chill, relaxing, do-whatever-you-want gardening sim. You’ll see different animals at different times of day and in different seasons, but the day/night cycle is quick enough to catch all of it in 30 minutes. There are no experience levels, just in-game money (which will be easy to come by), so everything is unlocked from the get-go. There will be an element of exploration, maybe even some light crafting (what happens if I combine these items? Let’s find out!). I’m tossing around the idea of including a few mini-games. Scope creep is real though, so I have a clearly-defined MVP that I will release first, before adding other stuff.

Here’s an early look at what I have done so far!

The game will be paid-upfront, with no ads or IAPs, and will work on super old iPads. I’m making it using SpriteKit, and drawing every single sprite myself in Affinity Designer. I’m hoping to have a build that can be tested by his birthday in mid-March (in other words…this blog post ends here, because I need to get back to work!).

A Few Thoughts on the Eve of WWDC

Here we are again, on the eve of another WWDC, feeling…weird. Excited. Ambivalent? Curious. Did I mention excited? But also kinda annoyed. And some of you? Some of you are downright mad.

There’s a cloud hovering over Apple Park again, and it’s not just the pandemic. It’s bruised developer relations. It’s alleged anti-trust violations. It’s App Store scammy-ness. It’s the weight of a million different expectations and quibbles, from “make the iPad more like the Mac” to “let the iPad be an iPad,” from pro hardware announcements to satisfy developers, to hints of an augmented reality revolution to satisfy those hungry and excited for the post-staring-at-screens era.

And then there’s the shareholders. Can’t forget the shareholders.

Caught in the middle of it all, then, are the lovely Apple employees we know (or are lightly acquainted with) and love. They show us their work with such deliberation and care, such passion and delight. They made our iMacs colorful again. They made it possible to control an Apple Watch with one hand. They work on Notes, on tvOS, on Safari, on SwiftUI, on hundreds of teams that make things millions of people rely on. And if any of them are reading this: I appreciate you, and I hope you have an awesome WWDC week. I can’t wait to see what you’ve been working on. There’s always something announced at WWDC that just blows me away, and I know this year will be no different.

Speaking of different, Apple’s slogan used to be “Think Different.” Apple does many things differently, such as its environmental initiatives, focus on health and accessibility, and emphasis on privacy. But Apple is a big company and big companies naturally become stubborn, entrenched in tradition, and difficult to steer in different directions.

Unfortunately for Apple, the winds of change are blowing, have been for a long time, and are reaching gale force. In various regions of the United States, the coronavirus’s progress has been stymied and the phrase “back to normal” is bandied about as if it’s a sure thing, as if “normal” is something we have managed to recover, rather than something new being slowly born from the ashes of a horrible year.

While some have learned absolutely nothing from this experience, others are finding a renewed understanding of what’s most important to them. A country obsessed with work is toying with the idea that the way we do and view work might not always be the best way. And amidst all of this, an absolute reckoning involving the way we treat one another, and the way our entire society is structured to, consciously or unconsciously, treat some worse than others.

There is palpable anger toward so many in authority, whether in government, or at companies like Apple, for a failure to listen and a refusal to even consider change.

I’ve said this before, but I believe one of the single most important leadership qualities is humility, which by definition requires listening. If Apple executives listen to their employees and developers, decide their requests are not in line with the company’s core values, and say as much, that is one thing, because at least it’s honest. If, however, their requests or ideas align with the company’s values, but clash with its traditions or shareholder expectations (or simply aggravate the executives’ hubris) and they dig in their heels and tighten their grips, they are rightly deserving of criticism and, dare I say, scorn. And I think they’ll find, as the winds of change continue to blow, that they’ll eventually be caught in a storm they can’t escape, driven along on a course they did not chart for themselves.

It’s not about giving in to every little demand being lobbed at them. It’s about collecting information, determining what the right thing to do is, and doing it the Apple Way. When Apple does that and does it right, the results are fantastic.

Let’s hope we see some of that Apple shine through this week.


Adding a Gradient to Large Title Text in SwiftUI or UIKit

When I first released YarnBuddy, it had an orange-to-pink gradient as the navigation bar background color. The colors reminded me of a sunset, which I liked, but the overall effect was a little too heavy and could easily clash with the user’s own project photos. I wondered if I could do something a little more subtle and put the gradient inside the navigation bar title text itself.

The good news is that SwiftUI makes it trivially easy to create gradients and mask them in a variety of ways. The bad news is that SwiftUI can’t do much of anything when it comes to customizing the navigation bar. Maybe that will change in the next version of SwiftUI, to be announced at WWDC in a little over two weeks…maybe it won’t. For now, we can use the good ol’ UIKit Appearance APIs to accomplish our goal.

What we’re going to do is create a UIColor from a pattern image. The image will be generated using our gradient colors, and sized based on the height of the navigation bar and the width of the longest title we expect to display. All of this will happen in an extension to UINavigationController, in which we’ll override viewDidLoad().

The first thing we’ll need is a function to create an image from our gradient.

func getImageFrom(gradientLayer: CAGradientLayer) -> UIImage? {
    var gradientImage: UIImage?
    // An image context must be active, or UIGraphicsGetCurrentContext() returns nil
    UIGraphicsBeginImageContext(gradientLayer.frame.size)
    if let context = UIGraphicsGetCurrentContext() {
        gradientLayer.render(in: context)
        gradientImage = UIGraphicsGetImageFromCurrentImageContext()?.resizableImage(withCapInsets: UIEdgeInsets.zero, resizingMode: .stretch)
    }
    UIGraphicsEndImageContext()
    return gradientImage
}

Since the function isn’t guaranteed to return an image, we’ll assign a default color for large title text. I chose UIColor.label, since I knew it would automatically adjust between dark mode and light mode. I also really like the “rounded” font design so I added it to my large title font descriptor; you can go with the default or serif options or a completely different font if it suits your app. Here is an example of what you can do in your UINavigationController extension:

extension UINavigationController {
    override open func viewDidLoad() {
        super.viewDidLoad()
        var gradientColor = UIColor.label
        let blue = UIColor.systemBlue
        let purple = UIColor.systemPurple
        let largeTitleFont = UIFont.systemFont(ofSize: 40.0, weight: .bold)
        let longestTitle = "My Awesome App"
        let size = longestTitle.size(withAttributes: [.font : largeTitleFont])
        let gradient = CAGradientLayer()
        let bounds = CGRect(origin: navigationBar.bounds.origin, size: CGSize(width: size.width, height: navigationBar.bounds.height))
        gradient.frame = bounds
        gradient.colors = [blue.cgColor, purple.cgColor]
        gradient.startPoint = CGPoint(x: 0, y: 0)
        gradient.endPoint = CGPoint(x: 1, y: 0)
        if let image = getImageFrom(gradientLayer: gradient) {
            gradientColor = UIColor(patternImage: image)
        }
        let scrollEdgeAppearance = UINavigationBarAppearance()
        if let largeTitleDescriptor = largeTitleFont.fontDescriptor.withDesign(.rounded) {
            scrollEdgeAppearance.largeTitleTextAttributes = [.font : UIFont(descriptor: largeTitleDescriptor, size: 0), .foregroundColor : gradientColor]
        }
        navigationBar.scrollEdgeAppearance = scrollEdgeAppearance
    }
}

Setting the x-value of the gradient’s start point to 0 and the end point to 1 creates a horizontal gradient. You can create a vertical gradient by changing the y-value instead. You’ll see that if our getImageFrom(gradientLayer:) function returns an image, we’ll use that to create a UIColor that we can use when assigning text attributes to our instance of UINavigationBarAppearance.
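For example, a vertical (top-to-bottom) variant of the same gradient would look like this (a small sketch, not part of the original code):

```swift
// Vertical gradient: keep x fixed, vary y from top (0) to bottom (1)
gradient.startPoint = CGPoint(x: 0.5, y: 0.0)
gradient.endPoint = CGPoint(x: 0.5, y: 1.0)
```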

You’ll see I’m only setting the navigation bar’s scroll edge appearance—that’s because it covers the only navigation bar state where large title text appears. However, in YarnBuddy, I also set the “standard appearance” and “compact appearance” to use colors that match the user’s selected theme. If you’re wondering why I’m not using a gradient in all cases, it’s because it doesn’t look very good with small font sizes and makes the text way less readable.

I made a playground using the above code and SwiftUI so that you can fiddle around with colors and font:

What’s New in YarnBuddy

My backlog of blog posts I want to write is getting a bit ridiculous at this point. One of them is a follow-up to YarnBuddy’s “App of the Day” feature back in March, complete with screenshots and stats and the whole story of how that feature came to be. I’m not quite ready to put all of that together yet, so today, I just want to note some of the things I’ve been working on lately in YarnBuddy (from a more technical standpoint).


I’ve released 6 updates for YarnBuddy so far this year (with another one waiting for review), kicking the year off with a major design refresh that introduced the ability to change the app’s theme. I did this by creating a Theme struct that has a number of semantic colors such as “primaryAccent,” “secondaryAccent,” “headerText,” “rowBackground” etc. Next, I set up an AppTheme enum, with each case being the name of a theme. The enum has a variable called “colors” that returns a Theme struct for each case. The result is that I can do things like .background(settings.theme.colors.primaryBackground) and it just works!
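Here’s a minimal sketch of what that setup might look like. The semantic color names come from the post itself; the theme cases and concrete colors are my own placeholder assumptions:

```swift
import SwiftUI

// Sketch of the theme system described above; colors are placeholders
struct Theme {
    let primaryAccent: Color
    let secondaryAccent: Color
    let headerText: Color
    let rowBackground: Color
    let primaryBackground: Color
}

enum AppTheme: String, CaseIterable {
    case system, light, dark, midnight, supernova

    // Each case returns its own Theme struct
    var colors: Theme {
        switch self {
        case .midnight, .supernova:
            return Theme(primaryAccent: .orange, secondaryAccent: .pink,
                         headerText: .white,
                         rowBackground: Color(.systemGray6),
                         primaryBackground: .black)
        default:
            return Theme(primaryAccent: .blue, secondaryAccent: .purple,
                         headerText: .primary,
                         rowBackground: Color(.secondarySystemGroupedBackground),
                         primaryBackground: Color(.systemGroupedBackground))
        }
    }
}
```

With a settings object exposing a theme property, views can then call .background(settings.theme.colors.primaryBackground) exactly as described above.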

However, there are some UI elements in SwiftUI that are notoriously difficult to customize, such as the background of the list view that a Picker pushes onto the stack. I realized that some of my themes could be considered “light,” while others would be more at home with dark mode defaults. In other words, a bright white background would be super jarring in my Supernova theme. So, I decided to override the user’s preferred mode based on the selected theme.

In my SettingsStore class, which manages a number of UserDefaults keys, I added the following:

var theme: AppTheme {
    set {
        defaults.set(newValue.rawValue, forKey: DefaultKeys.appTheme)
        switch newValue {
        case AppTheme.system:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .unspecified
        case AppTheme.dark, .midnight, .supernova:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .dark
            UIApplication.shared.statusBarStyle = .lightContent
        case AppTheme.light, .grapefruit, .creamsicle, .seaside:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .light
            UIApplication.shared.statusBarStyle = .darkContent
        }
    }
    get { AppTheme(rawValue: defaults.string(forKey: DefaultKeys.appTheme) ?? AppTheme.system.rawValue) ?? AppTheme.system }
}

Those status bar text color overrides are deprecated but I can’t figure out how I would call their replacement in SwiftUI. For now, it works!

Data Export

The next major version, 1.5, gave users the ability to export a zip archive containing all of their photos, pattern PDFs with annotations, and metadata (in plain text files). I wanted to give users some method for getting their data out of the app as soon as possible, even if it wouldn’t be importable. Now that I’ve shipped it, I’ve begun slowly working on a true backup solution using Codable.

To export data using SwiftUI, I created a struct conforming to FileDocument and did all of the data gathering in the fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper method. I used SwiftUI’s .fileExporter modifier to trigger the export. YarnBuddy uses CoreData, so to make things a little easier I created an “exportString” variable in each NSManagedObject subclass that prints a nicely-formatted string of all metadata associated with the class. That way, I just needed to loop through all of the user’s projects and yarn, map the exportStrings to their own array and append them to the appropriate plain text files.
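As a rough sketch of that exportString idea (the Project entity and its property names here are hypothetical stand-ins for YarnBuddy’s actual model):

```swift
import CoreData

extension Project {
    // A human-readable dump of the object's metadata for the plain text export
    var exportString: String {
        """
        Project: \(name ?? "Untitled")
        Started: \(startDate?.description ?? "—")
        Notes: \(notes ?? "")

        """
    }
}

// Inside fileWrapper(configuration:), the strings can then be gathered like:
// let projectsText = projects.map { $0.exportString }.joined()
// and written into the zip archive as a plain text file.
```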

Time Tracking

Version 1.6 introduced time tracking for Pro users. Time tracking can be fun and useful on its own, but what I’m hoping to do is lay the groundwork for some fun “end of the year summary”-type features. For avid knitters and crocheters, it would be neat to know how many projects you completed that year, the average and total time spent on them, etc. Someday I’d also like to have a little “share card” creation studio where users could choose stats and photos to share on social media.

Time tracking is also a great feature to integrate with widgets, Shortcuts, and the watch app, so there will be plenty of low-hanging fruit for me to pick throughout the summer!

Yarn Remaining, User Guide, and Stash Export

Version 1.6.1, when approved, will be a big point update: it includes a new user guide, the ability to export your yarn stash as a CSV file, and new estimates of yarn remaining.

Previously, when a user linked a project to a yarn in their yarn stash, the amount of yarn would not automatically be subtracted from the total stashed amount. The reason for that was logistics: “In progress” projects weren’t guaranteed to be finished, projects could list yarn quantities in terms of the number of skeins (yarn balls), grams, or ounces, and the total amount of yarn in the user’s stash could be shown in terms of skeins, grams, ounces, yards, or meters. In other words, there were potentially a lot of unit conversions required, and in some cases those conversions would only work if the user supplied the length and net weight per yarn ball (which aren’t required fields).

Now, YarnBuddy will attempt to calculate an estimated amount of yarn remaining based on whatever pieces of information it has, using projects marked as finished. If the estimate is clearly wrong, users can override it with a custom value. Hopefully it’s a good compromise, and users will be happy with it.
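A hypothetical sketch of that kind of estimate, with a custom override taking precedence and the conversion failing gracefully when the optional per-skein field is missing:

```swift
// Hypothetical: estimate yards remaining from finished projects' usage.
// yardsPerSkein is optional because length per ball isn't a required field.
func estimatedYardsRemaining(totalSkeins: Double,
                             skeinsUsedInFinishedProjects: Double,
                             yardsPerSkein: Double?,
                             customOverride: Double?) -> Double? {
    if let customOverride = customOverride { return customOverride }
    guard let yardsPerSkein = yardsPerSkein else { return nil } // can't convert
    let skeinsLeft = max(0, totalSkeins - skeinsUsedInFinishedProjects)
    return skeinsLeft * yardsPerSkein
}
```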

I don’t get a ton of support email for YarnBuddy, but I wanted to have a place where users could go to get answers to basic questions and even just explore what the app has to offer. So, I wrote a little user guide in plain ol’ HTML using Textastic on my iPad and included it in the app.

YarnBuddy User Guide screenshot

Finally, users can now export their yarn stash as a CSV file. This turned out to be trivially easy, and you can see my entire implementation of it below. If you’re horrified by my lack of error handling, please know that the rest of the app is even worse; it will absolutely give you the heebie-jeebies…just a total wasteland filled with roving packs of feral errors running amok and destroying anything in their path.

struct YBStashCSV: FileDocument {

    let managedObjectContext = CoreDataStack.shared.context

    static var readableContentTypes: [UTType] { [UTType.commaSeparatedText] }
    static var writableContentTypes: [UTType] { [UTType.commaSeparatedText] }

    static let yarnWeights = ["Lace (0), Light Fingering, 1-3 ply",
                              "Super Fine (1), Fingering, 4 ply",
                              "Fine (2), Sport, 5 ply",
                              "Light (3), DK, 8 ply",
                              "Medium (4), Worsted/Aran, 10 ply",
                              "Bulky (5), Chunky, 12 ply",
                              "Super Bulky (6), Roving",
                              "Jumbo (7)"]

    init() { }

    init(configuration: ReadConfiguration) throws { }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        let csvText = createCSV()
        return FileWrapper(regularFileWithContents: csvText.data(using: .utf8) ?? Data())
    }

    func createCSV() -> String {
        let fetchRequest = NSFetchRequest<Yarn>(entityName: "Yarn")
        fetchRequest.resultType = .managedObjectResultType

        var stashString = ""
        stashString.append(contentsOf: "Name,Colorway,Color Family,Dye Lot,Quantity,Remaining,Weight,Length,Net Weight,Fiber Type,Purchase Location,Date Added,Notes\n")

        do {
            let allYarn = try managedObjectContext.fetch(fetchRequest)
            let stash = allYarn.filter({ $0.isStashed })

            for yarn in stash {
                var notesString = ""
                for note in yarn.notesArray {
                    notesString.append("\(note) ") // the note's text accessor was lost in this excerpt; substitute the real property
                }
                stashString.append("\"\(yarn.wrappedName)\",\"\(yarn.wrappedColorName)\",\"\(yarn.wrappedColorFamily)\",\"\(yarn.dyeLot)\",\"\(yarn.quantityValue) \(yarn.wrappedQuantityUnit)\",\"\(yarn.customRemainingString.isEmpty ? yarn.estimatedQuantityRemainingString : yarn.customRemainingString)\",\"\(Self.yarnWeights[Int(yarn.weight)])\",\"\(yarn.lengthString)\",\"\(yarn.weightString)\",\"\(yarn.wrappedFiberType)\",\"\(yarn.wrappedPurchaseLocation)\",\"\(yarn.dateAdded ?? Date())\",\"\(notesString)\"\n")
            }
        } catch let error {
            print("Failed to fetch yarn stash: \(error)")
        }
        return stashString
    }
}

Wishes for WWDC 2021

‘Tis the season for WWDC rumors, predictions, and wishlists! Spring is off to a shaky start here in Nebraska; we’ve had a series of colder, windy days that seem more at home in March than May, and the forecast for the coming weeks is much more “April showers” than “May flowers.” However, the foliage is definitely greening, the birds are still singing, and, best of all, we have kittens on our farm again (somewhere between 3 and 8…one of the two litters was moved to an undisclosed location by the momma cat).

My husband and I are fully vaccinated, nearby parks, zoos, and orchards are starting to plan fun outdoor activities, and things are generally looking up around here. Of course, I know that’s not true for everyone; I just want to express my gratitude for the circumstances I’m in, and I hope things are looking up for you as well.

There are things to celebrate even in hard times, and WWDC is when we all get to celebrate Apple engineers’ hard work over the past year. I really enjoy reading wishlists and predictions, so this year I’ve compiled a WWDC 2021 Community Wishlist. You’re welcome to contribute, just submit a pull request (or send me a note on Twitter and I’ll add it for you).

Here are a few things I’m hoping to see next month:


  • Native search bar & pull-to-refresh
  • Ability for views to become/resign first responder, and to identify the current first responder
  • For List and Form to either drop their UIKit backing or be much more customizable (cell selection style and behavior, list background colors, etc. without using appearance APIs)
  • Iron out all of the NavigationView/NavigationLink bugs. It seems like there’s some regression in this area with every single update. Also, native navigation bar customizations that don’t require the appearance API (background color, text color, etc.)
  • A way to change the status bar style (light/dark) at runtime using the SwiftUI app life cycle
  • Some sort of “bottom sheet” view that can be pulled up and expanded
  • Accessory views for TextFields/TextViews
  • Inactive/destructive states in Context Menus
  • Context menu preview-providers (for showing a custom preview on long-press/right-click)
  • SwiftUI version of UIVisualEffectView
  • Native support for the share sheet


  • A first-party code editor for iPadOS that supports SwiftUI, UIKit, live previews, Swift Package Manager, light debugging tools, and the ability to archive and submit builds to App Store Connect
  • Another Home Screen overhaul for iPad, allowing widgets to be moved anywhere
  • An iPad feature similar to App Library but that is actually more like LaunchPad
  • The ability to back up to an external drive (i.e. Time Machine)

Everything Else

  • The ability to see and configure Smart Mailboxes in Mail for iOS
  • A method for adding third-party wallpapers that won’t clutter up your Photo Library and also supports light/dark mode
  • TestFlight for Mac
  • Some degree of widget interactivity. I would love to be able to easily start/stop timers, check to-do items, increment a counter, etc.
  • Third-party Apple Watch faces (?)
  • For Apple to chill out and allow apps like Riley Testut’s Delta emulator to be installed on iOS devices in some sanctioned way (remember, emulators are not illegal)
  • For Apple to chill out and let developers accept payments via some approved processors (e.g. Stripe)
  • Improved TestFlight beta review times (or, just ditch the whole review process in its current form)
  • Subscription cancellation API for developers.

I’m sure I’ll think of more things in the coming weeks. Don’t forget to check out the WWDC 2021 Community Wishlist!

A Story Half-Told

Back in November I wrote about how the introduction of the first M1 Macs put extra pressure on Apple to differentiate the iPad Pro from the recently upgraded iPad Air. The Airs were colorful, performant, and supported all the new accessories. Meanwhile, the new M1 MacBook Air was light, blazing fast, and could run iOS apps. What would compel people to buy an iPad Pro?

In that post, I listed 10 things I thought Apple could do to make its pro tablet stand out in the line-up. This week, Apple addressed one and a half of them, and set the stage for a few more. It upgraded the port to Thunderbolt/USB 4 (but didn’t add an additional port like I hoped), added 5G, and gave the iPad Pro the very same M1 chip that powers its new Macs, making it more than capable of running things like Xcode, Final Cut, Logic, etc. The port could potentially point toward things like better external display support and fast Time Machine backups.

Disappointingly, the iPad Pro presentation lacked the colorful, whimsical joy of the new iMac introduction (though I was definitely impressed by the production quality of the M1 chip heist). Apple has doubled down on iPad Pros being Serious Business, which is just too bad, because literally everyone I know would love an iPad Pro in some other color than gray. In fact, I find myself in a strange position—the new iPad Air made me excited for the iPad Pro, which in turn disappointed me enough to make me hopeful that the next iPad Air will be released in some even more vivid colors. Apple has become a company of a thousand SKUs…you can’t tell me they can’t give us some more gosh darn hues. But, I digress.

Once upon a time, Apple made an outrageously powerful, desktop-class tablet, with artificially limited software and I/O.

…and then what?

Well, we have to wait until June 7 to find out. Or do we? The iPad’s future is just as wrapped up in the current anti-trust hullabaloo as it is in iPadOS 15. Will developers be allowed greater freedom to innovate without being fearful of App Review? Will Apple finally shift its focus to eliminating actual multi-million dollar scams and fraud instead of nitpicking honest developers who desire to follow the spirit of the law, if not the letter (which is usually pretty vague to begin with)?

If Apple is willing to give App Review a complete overhaul and also manages to release at least one first party “pro” app for iPadOS this June, I think the iPad Pro’s story will take a happy turn indeed. For now, however, it remains a half-told tale of wasted potential—a sleek, expensive “what if?”

How to Set Up Core Data and CloudKit When You Haven’t the Faintest Clue What You’re Doing

Note: This was posted before WWDC 2021, so if major changes were made to Core Data + CloudKit, they aren’t reflected here. This right here is just pure dumpster fire all the way down. Also if you’re an Apple engineer…I’m sorry.

When Apple introduced changes to Core Data + CloudKit integration in 2019, they sold developers on a dead-simple API: add iCloud sync to your Core Data app with “as little as one line of code.” That one line, of course, is simply changing NSPersistentContainer to NSPersistentCloudKitContainer and enabling a few capabilities in the project settings. Boom, done! And in fact, Apple’s “Core Data –> Host in CloudKit” SwiftUI project template does those things for you, so you’re good to go, right?

Turns out, if you want to sync Core Data-backed data between devices and have those changes reflected in your UI in a timely manner, you have some more work to do. To figure out what that work is, you can’t look at Apple’s Core Data templates. You have to look at their sample code.

My SwiftUI app was created before Apple even added a SwiftUI + Core Data project template, so I created a class called “CoreDataStack” that has a shared instance. If you use the template, that becomes a struct called “PersistenceController.” I’m sure the struct is SwiftUI-ier, but the class from Apple’s sample code (which does not use SwiftUI) makes more sense to my brain, so I went with that.
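For context, the overall shape of such a class looks roughly like this (a sketch patterned after Apple’s sample code, not YarnBuddy’s actual source):

```swift
import CoreData

final class CoreDataStack {
    static let shared = CoreDataStack()

    // The remaining steps all happen inside this lazy initializer
    lazy var persistentContainer: NSPersistentCloudKitContainer = {
        let container = NSPersistentCloudKitContainer(name: "YarnBuddy")
        // ...set description options, load stores, configure viewContext...
        return container
    }()

    var context: NSManagedObjectContext { persistentContainer.viewContext }
}
```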

Step 1: Make your container lazy and set some important options

In Apple’s sample code, you’ll notice that within the persistent container’s lazy initialization, two options are set on the container’s description. Include these.

guard let description = container.persistentStoreDescriptions.first else {
    fatalError("###\(#function): Failed to retrieve a persistent store description.")
}
description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
description.setOption(true as NSNumber, forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)

If you don’t set the first option, you’ll regret it. Look, I don’t even really understand what it does, I just know that somewhere down the line, you’ll find some dumb way to break sync and then when you finally get it working again, you’ll find that only managed objects created after this key was set will sync properly. Every object created before the NSPersistentHistoryTrackingKey was set will stubbornly refuse to sync unless you modify it and re-save it, which is a giant pain in the derrière. I mean, at least that’s what my…uh…friend told me.

The second option is the first step toward receiving notifications when magic cloud stuff happens. You’ll subscribe to that NSPersistentStoreRemoteChangeNotification later, but for now, just make sure that option is set.

Step 2: Stir in some of this stuff that I have a super weak grasp of

After your container loads its persistent stores, but before you return the container itself, these lines are also important:

container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.transactionAuthor = appTransactionAuthorName
container.viewContext.automaticallyMergesChangesFromParent = true
do {
    try container.viewContext.setQueryGenerationFrom(.current)
} catch {
    assertionFailure("###\(#function): Failed to pin viewContext to the current generation: \(error)")
}

There are several merge policies, and you can read about them in the docs.

Again, I barely understand this stuff. For my purposes, I set “appTransactionAuthorName” to the name of my app’s container, which was simply “YarnBuddy.” From what I kinda understand, setting the transaction author here allows me to later filter for changes that weren’t created by my app on this particular device and act on them.

Now, I’ve always had “automaticallyMergesChangesFromParent” set to true, but what I didn’t realize is that it doesn’t just refresh your view hierarchy immediately when a change occurs. Maybe it should, but for me, it doesn’t. That’s where the remote change notification comes in.

Step 3: Dip your toes into Combine for a hot second and subscribe to notifications

I put this code right before “return container.”

NotificationCenter.default
    .publisher(for: .NSPersistentStoreRemoteChange)
    .sink { self.processRemoteStoreChange($0) }
    .store(in: &subscriptions)

And somewhere within the class I have declared this variable:

private var subscriptions: Set<AnyCancellable> = []

Make sure you import Combine at the top. I know extremely little about Combine at this point. It’s number one on my list of things to learn, and I plan to start with John Sundell’s “Discover Combine” materials.

We’ll get into what my “processRemoteStoreChange” function does in a minute.

Step 4: Just copy over these blessed code snippets from the sample code

Copy the following from CoreDataStack.swift in Apple’s sample code:

  • the initializer
  • lastHistoryToken variable
  • tokenFile variable
  • historyQueue variable

Also copy over the NSPersistentContainer extension in “CoreData+Convenience.swift.”

Also, my “processRemoteStoreChange” function is identical to the sample code’s “storeRemoteChange” function.

Step 5: Merge new changes into the context

I modified Apple’s “processPersistentHistory” function to look like this:

func processPersistentHistory() {
    let backgroundContext = persistentContainer.newBackgroundContext()
    backgroundContext.performAndWait {

        // Fetch history received from outside the app since the last token
        let historyFetchRequest = NSPersistentHistoryTransaction.fetchRequest!
        historyFetchRequest.predicate = NSPredicate(format: "author != %@", appTransactionAuthorName)
        let request = NSPersistentHistoryChangeRequest.fetchHistory(after: lastHistoryToken)
        request.fetchRequest = historyFetchRequest

        let result = (try? backgroundContext.execute(request)) as? NSPersistentHistoryResult
        guard let transactions = result?.result as? [NSPersistentHistoryTransaction],
              !transactions.isEmpty
        else { return }

        print("transactions = \(transactions)")
        self.mergeChanges(from: transactions)

        // Update the history token using the last transaction.
        lastHistoryToken = transactions.last!.token
    }
}

The “mergeChanges” function looks like this:

private func mergeChanges(from transactions: [NSPersistentHistoryTransaction]) {
    context.perform {
        transactions.forEach { [weak self] transaction in
            guard let self = self, let userInfo = transaction.objectIDNotification().userInfo else { return }
            NSManagedObjectContext.mergeChanges(fromRemoteContextSave: userInfo, into: [self.context])
        }
    }
}

Most of that code was pulled from Stack Overflow. Apple’s code has a bunch of deduplication logic in it that frankly, I’m not emotionally ready to process, so I skipped it.

I’ve seen a few folks say that merging changes like this shouldn’t be necessary. However—and maybe it’s some sort of weird placebo effect—it seemed like changes from my watch synced much more quickly to my phone. Not instantaneous, but a handful of seconds instead of requiring me to sometimes force quit and restart the app (or switch tabs or something) to see changes.

Step 6: Never forget to deploy your updated schema from the CloudKit Dashboard

Honestly, it’s not that I forgot to do this, it’s that I failed to make sure it actually happened. CloudKit threw some weird error at me and told me my development and production schemas were the same, when they were in fact extremely different. I never double-checked, and chaos ensued! Don’t be like me: make sure your schema is deployed.

After launch, remember that you still have to do this every time you change your Core Data model, before you release your update to testers or the App Store. If your production CloudKit schema doesn’t properly correspond to your production Core Data model, syncing is going to break in all kinds of terrifying ways.


I know I probably could have saved myself a lot of frustration if I’d forked over some money for a Ray Wenderlich membership or some other paid tutorials/books related to Core Data. I’m also guessing there’s a much easier way to set everything up so that cloud changes are reflected near-instantaneously. But y’all, I’ve combed the free internet for weeks and this is the best I could come up with. Maybe it’ll help you too.

Thoughts on Apple Glasses

Rumors about Apple glasses have been swirling around for years now, and they have never once interested me. Why would I want to wear a computer on my face? Didn’t I get LASIK eye surgery to avoid ever again having to clumsily clean a pair of lenses with the corner of my t-shirt and my hot breath? And what about folks that stare at a computer screen all day for their job anyway…are they going to stare at that screen through a pair of Apple glasses? Or do you only wear them in certain situations…in which case, what’s the point?

Whenever I read an article about Apple’s foray into AR headsets, something about it just doesn’t feel right. No one ever makes it sound like Apple is trying to make a product for the mass market, when I strongly believe that they will only release this product if it has mass market (or potentially future mass market) appeal. That means it has to follow the same trajectory as the Apple Watch: comes in many different styles, and is meant to be something you wear all day long and charge at night. Perhaps even reliant on a companion iPhone app until the components get small enough, then made independent.

It’s one thing to have something a bit odd-looking strapped around your wrist, or sticking out of your ears. It’s an entirely different thing to have something goofy smack dab in the middle of your face. These glasses are going to have to be sleek af. They’re going to have to look good on a wide range of face shapes, sizes, and skin tones, and appeal to a wide range of personalities. They’re going to have to make people who don’t currently wear glasses want to wear glasses.

But Becky, why try to appeal to everyone when they could just create some sweet futuristic sci-fi specs for influencers and nerds? Because in my heart of hearts, I believe the narrative behind Apple’s AR glasses is going to be the same as the later iterations of the watch: Health. Wellness. Accessibility. The Human Experience.

Honestly, I think these devices will be revolutionary for people who are blind, colorblind, or have low vision. Not necessarily because assistive devices for these groups don’t already exist, but because Apple will do it better, and sell it cheaper. Imagine having Apple Glasses paired with some AirPods, discreetly giving you an audio description of whatever you’re looking at. Personally, I’d love to have a pair of glasses that could help me see better when driving at night (although, who knows if we’ll be allowed to drive while wearing these things. Maybe they’ll have a do-not-disturb-while-driving mode?).

Maybe users with hearing loss could enable live captions, à la Clips. Or on the flip side, hearing folks could use the glasses to recognize sign language. Suddenly, all text responds to dynamic type. Sounds like sirens give visual cues, while sights like signage give audio cues.

Tech reporters love to go on about all the “hidden” accessibility features in iOS that are actually great for the masses, and I think Apple Glasses are going to be a whole lot of that.

People walking or running outdoors could see mile/km markers in the environment around them, and maybe even fireworks in the sky when they reach their goal. Maybe when you’re hunting around for that lost AirTag, a big giant 3D arrow appears over it when you’re close. That might sound silly, but it also kind of sounds like Apple, doesn’t it?

All of this is to say: I think the reason I haven’t been interested in all of the Apple Glasses talk is because the focus seems to be on games, notifications, and maps, which to me are the least interesting and least imaginative features of this supposedly futuristic device. There is no “killer app” because the entire purpose of the device is not just to replicate iPhone functionality, but rather to fundamentally improve the human experience in a deep and meaningful way. It won’t get there in version 1.0, but I now find myself excited about the possibilities. This isn’t about giving us a new screen. It’s about freeing us from screens, from distractions, and bringing us together again.

Let’s be honest: Apple isn’t going to make a “Dear Apple”-style commercial about how Apple Glasses impacted people’s lives by allowing them to play Minecraft on their bathroom floor or get notifications about the newest Apple TV+ shows directly in front of their pupils. It’s going to be about how two neighbors who speak different languages are able to communicate face-to-face with real-time translations hovering nearby. It’s going to be about people with disabilities having an improved experience in the world and greater overall quality of life. It’s going to be what it’s always been about (besides profit, of course): people. All of the fun-albeit-gimmicky 3D AR stuff is just a cherry on top.

5 Years of App-Making

I realized recently that it’s been nearly five and a half years since I released my first app onto the App Store. Like, holy smokes, where did that time go?

I’ve been feeling somewhat reflective these last few weeks (and I haven’t even watched Soul yet!) as I’ve taken time off from programming to focus on getting ready for Christmas, which is something I genuinely love doing. I ran across this tweet early in my preparations:

…and honestly, that’s exactly what I’m about at Christmastime: being that mom that makes things magical. The day after Christmas, my kids asked how many days were left until next Christmas, so I think I succeeded?

Anyway, back to five and a half years. Although I released my first app on June 10, 2015, I didn’t actually start making any money until the following June when I released LiveRotate, my app for rotating Live Photos (remember when editing Live Photos wasn’t possible without turning them into stills? lol). From June 2016 to today, I’ve made approximately $12,600 in profit from the App Store.

It’s simultaneously a lot and a little. It’s a lot for most developers. It’s a little for the developers I follow on Twitter. Early on, someone asked me how I would define success for myself as an indie developer. I remember stressing that my apps were just side projects (they are) and that I’d be happy if my revenue could cover the cost of my personal device upgrades (it has). At the time, I think I forgot to say something about how I wanted to make things that improved people’s lives, or just made them smile. In that way, I’ve also succeeded, and hearing from happy customers has been incredibly rewarding.

Most of the proceeds from the past four years have come from Snapthread. I owe its success to the very nice media coverage it got, and I owe that media coverage to my following on Twitter, and I owe that following to Brent Simmons, who cared enough to compile a list of women in the iOS community (thanks again, Brent!). Revenue from Snapthread has diminished considerably in the past year or so, as has my enthusiasm for struggling with AVFoundation. Scribblet and YarnBuddy were just what I needed this year, both in terms of challenge and inspiration.

I set a goal a few months ago of reaching 20 annual subscribers to YarnBuddy Pro by the end of the year. I’m happy to report that as of today, my subscriber count stands at 52 (plus an additional 16 who redeemed a promo code for the first year).

And here we are, at the doorstep of 2021. Convention says that it’s time to set new goals, but I’m just not feeling it. Time stood still in March, and yet somehow things began to happen at an increasingly frenzied pace. There’s a really excellent episode of Mr. Robot in the final season that basically happens in real time, with no spoken dialogue save for one line at the beginning and one at the end. It is ridiculously intense; I felt like I was holding my breath the entire time. This year kinda felt like that too. I hope 2021 feels like a nice, long exhale.

Of course, learning goals are another story. I’d really like to deepen my design skills this coming year as well as my understanding of SwiftUI and Combine. It’s amazing to realize what I’ve learned since my development journey began with Objective-C in early 2014, then quickly pivoted to Swift 1.0, wound through SpriteKit, AVFoundation, PhotoKit, and PencilKit, and now has me writing apps using SwiftUI. It’s been a wild ride, and I’m so thankful for all of you who have helped me along the way (including many who I’ve never interacted with, but whose blog posts and Stack Overflow answers have literally kept me going).

Do you have indie business goals for 2021? How about learning goals? I’d love to hear them! I wish you all happiness and good health this coming year.