Gemini 2.0 and LLMs Integrated with Apps
https://www.macstories.net/stories/gemini-2-0-and-llms-integrated-with-apps/ – February 6, 2025

Busy day at Google today: the company rolled out version 2.0 of its Gemini AI assistant (previously announced in December) to more users, with a variety of new and updated models. From the Google blog:

Today, we’re making the updated Gemini 2.0 Flash generally available via the Gemini API in Google AI Studio and Vertex AI. Developers can now build production applications with 2.0 Flash.

We’re also releasing an experimental version of Gemini 2.0 Pro, our best model yet for coding performance and complex prompts. It is available in Google AI Studio and Vertex AI, and in the Gemini app for Gemini Advanced users.

We’re releasing a new model, Gemini 2.0 Flash-Lite, our most cost-efficient model yet, in public preview in Google AI Studio and Vertex AI.

Finally, 2.0 Flash Thinking Experimental will be available to Gemini app users in the model dropdown on desktop and mobile.

Google’s reasoning model (which, similarly to DeepSeek-R1 or OpenAI’s o1/o3 family, can display its “chain of thought” and perform multi-step thinking about a user query) is currently ranked #1 in the popular Chatbot Arena LLM leaderboard. A separate blog post from Google also details the new pricing structure for third-party developers that want to integrate with the Gemini 2.0 API and confirms some of the features coming soon to both Gemini 2.0 Flash and 2.0 Pro, such as image and audio output. Notably, there is also a 2.0 Flash-Lite model that is even cheaper for developers, which I bet we’re going to see soon in utilities like Obsidian Web Clipper, composer fields of social media clients, and more.
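
For developers, the integration story is straightforward: the Gemini API is a plain HTTPS endpoint. Here’s a minimal sketch of a text-generation call against the 2.0 Flash model, following Google’s public REST documentation; the GEMINI_API_KEY environment variable is a placeholder for a key you’d generate in AI Studio.

```typescript
// Minimal sketch: one-off text generation with the Gemini API over REST.
// GEMINI_API_KEY is a placeholder; create a real key in Google AI Studio.
const MODEL = "gemini-2.0-flash";
const ENDPOINT =
  `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}` +
  `:generateContent?key=${process.env.GEMINI_API_KEY}`;

async function generate(prompt: string): Promise<string> {
  const response = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
  });
  if (!response.ok) throw new Error(`Gemini API error: ${response.status}`);
  const data = await response.json();
  // The reply text lives in the first candidate's first content part.
  return data.candidates[0].content.parts[0].text;
}

generate("Summarize the Gemini 2.0 lineup in two sentences.").then(console.log);
```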

Since Gemini’s initial rollout in December, as part of my ongoing evaluation of assistive AI tools, I’ve been progressively replacing ChatGPT with it. Today, after the general release of 2.0 Flash, I went ahead and finally swapped ChatGPT for Gemini in my iPhone’s dock.

This will probably need to be an in-depth article at some point, but my take so far is that although ChatGPT gets more media buzz and is the more mainstream product,1 I think Google is doing more fascinating work with a) their proprietary AI silicon and b) turning LLMs into actual products for personal and professional use that are integrated with their ecosystem. Gemini (rightfully) got a bad rap with its initial release last year, and while it still hallucinates responses (as all LLMs still do), its 2.0 models are more than good enough for the sort of search queries I was asking ChatGPT before. Plus, we pay for Google Workspace at MacStories, and I like that Gemini is directly integrated with the services we use on a daily basis, such as Drive and Gmail.

Most of all, I’m very intrigued by Gemini’s support for extensions, which turn conversations with a chatbot into actions that can be performed with other Google apps. For instance, I’ve been enjoying the ability to save research sessions to Google Keep by simply invoking the app and asking Gemini what I wanted to save. I’ve searched YouTube videos with it, looked up places in Google Maps, and – since I’ve been running a platform-agnostic home automation setup in my apartment that natively supports HomeKit, Alexa, and Google Home all at once – even controlled my lights with it. While custom GPTs in ChatGPT seem like abandonware at this point, Gemini’s app integrations are fully functional, integrated across the Google ecosystem, and expanding to third-party services as well.2

Even more impressively, today Google rolled out a preview of a reasoning version of Gemini 2.0 that can integrate with YouTube, Maps, and Search. The idea here is that Gemini can think longer about your request, display its thought process, then do something with apps. So I asked:

I want you to find the best YouTube videos with Oasis acoustic performances where Liam is the singer. Only consider performances dated 1994-1996 that took place in Europe. I am not interested in demos, lyrics videos, or other non-live performances. They have to be acoustic sets with Noel playing the guitar and Liam singing.

Sure enough, I was presented with some solid results. If Google can figure out how to integrate reasoning capabilities with advanced Gmail searches, that’s going to give services like Shortwave and Superhuman a run for their money. And that’s not to mention all the other apps in Google’s suite that could theoretically receive a similar treatment.

Bonehead playing the piano? Yes please.

However, the Gemini app falls short of ChatGPT and Claude in terms of iOS/iPadOS user experience in several key areas.

The app doesn’t support widgets (which Claude has), doesn’t offer any Shortcuts actions (both Claude and ChatGPT have them), doesn’t have a native iPad app (sigh), and I can’t figure out if there’s a deep link to quickly start a new chat on iOS. The photo picker is also bad in that it only lets you attach one image at a time, and the web app doesn’t support native PWA installation on iPhone and iPad.

Clearly, there’s a long road ahead for Google to make Gemini a great experience on Apple platforms. And yet, none of these missing features have been dealbreakers for me when Gemini is so fast and I can connect my conversations to the other Google services I already use. This is precisely why I remain convinced that a “Siri LLM” (“Siri Chat” as a product name, perhaps?) with support for conversations integrated and/or deep-linked to native iOS apps may be Apple’s greatest asset…in 2026.

Ultimately, I believe that, even though ChatGPT has captured the world’s attention, it is Gemini that will be the ecosystem to beat for Apple. It always comes down to iPhone versus Android after all. Only this time, Apple is the one playing catch-up.


  1. Plus, o1-pro’s coding performance for large codebases is unrivaled. But it also costs $200/month – way more than any regular user interested in assistive AI tools for their personal workflow should pay. ↩︎
  2. I’d love to see a Todoist extension for Gemini at some point. ↩︎

The Many Purposes of Timeline Apps for the Open Web
https://www.macstories.net/stories/the-many-purposes-of-timeline-apps-for-the-open-web/ – February 5, 2025

Tapestry (left) and Reeder.

Writing at The Verge following the release of The Iconfactory’s new app Tapestry, David Pierce perfectly encapsulates how I feel about the idea of “timeline apps” (a name that I’m totally going to steal, thanks David):

What I like even more, though, is the idea behind Tapestry. There’s actually a whole genre of apps like this one, which I’ve taken to calling “timeline apps.” So far, in addition to Tapestry, there’s Reeder, Unread, Feeeed, Surf, and a few others. They all have slightly different interface and feature ideas, but they all have the same basic premise: that pretty much everything on the internet is just feeds. And that you might want a better place to read them.
[…]
These apps can also take some getting used to. If you’re coming from an RSS reader, where everything has the same format — headline, image, intro, link — a timeline app will look hopelessly chaotic. If you’re coming from social, where everything moves impossibly fast and there’s more to see every time you pull to refresh, the timeline you curate is guaranteed to feel boring by comparison.

I have a somewhat peculiar stance on this new breed of timeline apps, and since I’ve never written about them on MacStories before, allow me to clarify and share some recent developments in my workflow while I’m at it.

I think both Tapestry and the new Reeder are exquisitely designed apps, for different reasons. I know that Tapestry’s colorful and opinionated design doesn’t work for everyone; personally, I dig the different colors for each connected service, am a big fan of the ‘Mini’ layout, and appreciate the multiple font options available. Most of all, however, I love that Tapestry can be extended with custom connectors built with standard web technologies – JavaScript and JSON – so that anyone who produces anything on the web can be connected to Tapestry. (The fact that MacStories’ own JSON feed is a default recommended source in Tapestry is just icing on the cake.) And did you know that The Iconfactory also created a developer tool to make your own Tapestry connectors?
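
To give a sense of what that extensibility looks like, here’s a rough sketch of a connector in the spirit of Tapestry’s JavaScript-and-JSON model: a load() hook fetches a JSON feed and maps its entries to timeline items. To be clear, the names here (load, TimelineItem) are illustrative assumptions rather than the actual Tapestry API; The Iconfactory’s developer documentation describes the real interface.

```typescript
// Generic sketch of a "timeline connector": fetch a JSON Feed and map its
// entries to timeline items. Names are illustrative, not the Tapestry API.
interface TimelineItem {
  uri: string;
  date: Date;
  body: string;
}

// Hypothetical hook the host app would call to refresh this source.
async function load(): Promise<TimelineItem[]> {
  // MacStories publishes a JSON Feed alongside its RSS feed.
  const response = await fetch("https://www.macstories.net/feed/json");
  const feed = await response.json();
  // Standard JSON Feed fields: url, date_published, content_html/content_text.
  return feed.items.map((item: any) => ({
    uri: item.url,
    date: new Date(item.date_published),
    body: item.content_html ?? item.content_text ?? "",
  }));
}
```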

I like the new Reeder for different reasons. The app’s animations are classic Silvio Rizzi work – fluid and smooth like nothing else on iOS and iPadOS. In my experience, the app has maintained impeccable timeline sync, and just this week, it was updated with powerful new filtering capabilities, enabling the creation of saved searches for any source within the app. (More on this below.)

My problem with timeline apps is that I struggle to understand their pitch as alternatives to browsing Mastodon and Bluesky (supported by both Tapestry and Reeder) when they don’t support key functionalities of those services such as posting, replying, reposting, or marking items as favorites.

Maybe it’s just me, but when I’m using a social media app, I want to have access to its full feature set and be able to respond to people or interact with posts. I want to browse my custom Bluesky feeds or post a Mastodon poll if I want to. Instead, both Tapestry and Reeder act as glorified readers for those social timelines. And I understand that perhaps that’s exactly what some people want! But until these apps can tap into Mastodon and Bluesky (and/or their decentralized protocols) to support interactions in addition to reading, I’d rather just use the main social media apps (or clients like Ivory).1 To an extent, the same applies to Reddit: if neither of these apps allows me to browse an entire subreddit or sort its posts by different criteria, what’s the point?

But: the beauty of the open web and the approach embraced by Tapestry and Reeder is that there are plenty of potential use cases to satisfy everyone. Crucially, this includes people who are not like me. There is no one-size-fits-all approach here because the web isn’t built like that.

So, while I still haven’t decided which of these two apps I’m going to use, I’ve found my own way to take advantage of timeline apps: I like to use them as specialized feeds for timelines that I don’t want to (or can’t) have in my RSS reader or add as lists to Mastodon/Bluesky.

For instance, I created a custom MacStories timeline in Tapestry with feeds for all kinds of places on the web where MacStories publishes content or social media posts. I love how Tapestry brings everything together in a unified, colorful timeline that I can use alongside my RSS and social apps to see all sorts of posts by our company.

The colors!

Reeder’s latest addition is also something I’m considering at the moment. The app can now create saved filters, which are based on multiple filtering conditions. These rules can be stacked to create custom views that aggregate specific subsets of posts from sources that, typically, would be their own silos. Want to create an “AI” feed that cuts through RSS, Bluesky, YouTube, and Reddit to find you the latest AI news or products by keyword? How about a filter to show only YouTube videos that mention Nintendo? All of this (and more) is possible with Reeder’s latest update, with an interface that…I’ll just let the screenshots speak for themselves.
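
Conceptually, a saved filter is just a stack of predicates applied to one merged timeline. Here’s a back-of-the-napkin illustration of how stacked conditions compose – a generic sketch, to be clear, not Reeder’s actual implementation:

```typescript
// Conceptual sketch: saved filters as stacked conditions over merged posts.
interface Post {
  source: "rss" | "bluesky" | "youtube" | "reddit";
  title: string;
  text: string;
}

type Condition = (post: Post) => boolean;

// Stacking conditions with AND semantics: a post must satisfy all of them.
const matchesAll = (conditions: Condition[]): Condition => (post) =>
  conditions.every((condition) => condition(post));

// An "AI" view that cuts across every source by keyword.
const aiFeed = matchesAll([
  (p) => /\bAI\b|LLM/i.test(`${p.title} ${p.text}`),
]);

// Only YouTube videos that mention Nintendo.
const nintendoVideos = matchesAll([
  (p) => p.source === "youtube",
  (p) => /nintendo/i.test(`${p.title} ${p.text}`),
]);

// Applying a saved filter to the merged timeline is then just:
// timeline.filter(nintendoVideos)
```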

Silvio Rizzi’s design taste never disappoints.

Which leads me back to my main point. I feel like thinking about this new generation of apps as social media clients would be wrong and shortsighted; it reduces the scope of what they’re trying to accomplish down to a mere copy of a social media timeline. Instead, I think Tapestry and Reeder are coming at this from two different angles (Tapestry with better developer tools; Reeder with superior user filters), but with the same larger ambition nonetheless: to embrace the open nature of the Web and move past closed platforms that feel increasingly archaic today.

The fact that I can make a timeline out of anything doesn’t mean that Tapestry or Reeder have to be my everything-timelines. It means that the modern web lets me choose what I want to see in these apps. I can’t help but feel that there’s something special about that, and that it’s worth protecting.


  1. Speaking of which: are the folks at Tapbots considering a Bluesky client? ↩︎

Six Colors’ Apple in 2024 Report Card
https://www.macstories.net/stories/six-colors-apple-in-2024-report-card/ – February 4, 2025

Average scores from the 2024 Six Colors report card. Source: Six Colors.

For the past 10 years, Six Colors’ Jason Snell has put together an “Apple report card” – a survey to assess the current state of Apple “as seen through the eyes of writers, editors, developers, podcasters, and other people who spend an awful lot of time thinking about Apple”.

The 2024 edition of the Six Colors Apple Report Card has been published, and you can find an excellent summary of all the submitted comments along with charts featuring average scores for the different categories here.

I’m grateful that Jason invited me to take part again and share my thoughts on Apple’s 2024. As you’ll see from my comments below, last year represented the end of an interesting transition period for me: after years of experiments, I settled on the iPad Pro as my main computer. Despite my personal enthusiasm, however, the overall iPad story remained frustrating with its peculiar mix of phenomenal M4 hardware and stagnant software. The iPhone lineup impressed me with its hardware (across all models), though I’m still wishing for that elusive foldable form factor. I was pleasantly surprised by the AirPods 4, and while Vision Pro initially showed incredible promise, I found myself not using it that much by the end of the year.

I’ve prepared the full text of my responses for the Six Colors report card, which you can find below.

The Mac

4/5

Look, as we’ve established, I can now use my iPad Pro for everything I do and don’t need a Mac in my life. But I think Apple is doing an outstanding job with its Mac lineup, and I’m particularly envious of those who own the new Mac mini, which is small, powerful, and just exceedingly cute. I would give this category 5 stars; I don’t because Apple still insists on not making touchscreen Macs or more interesting and weird form factors.

The iPhone

4/5

It’s been an interesting year in iPhone land for me. After the September event, I purchased an iPhone 16 Pro Max, but my mind kept going to the iPhone 16 Plus. I was fascinated by its color, slimmer form factor, and more affordable overall package. I used the iPhone 16 Plus as my primary phone for two months and loved it, but then something happened: much to my surprise, I realized that I wasn’t taking as many pictures of my dogs, friends, and family as I used to with the iPhone 15 Pro Max.

That’s when it hit me. I thought I wouldn’t need all the features of a “pro” phone – and, honestly, since I’m not a professional cinematographer, I really don’t – but in the end, I was missing the 5x camera too much. In my experience using a 16 Plus, I was able to confirm that, if I wanted, I could live without a ProMotion display. But it was the lack of a third, zoomed camera on the Plus model that ultimately got me. I rely on the 5x lens to take dozens of pictures of my dogs doing something funny or sleeping in a cute way every day, and its absence on the 16 Plus kept me from grabbing my phone out of my pocket to save new memories on a daily basis.

I’m glad I did this experiment because it also left me with a couple of additional thoughts about the iPhone line:

  1. If Apple comes out with a completely redesigned, slimmer “iPhone 17 Air” later this year that doesn’t have a 5x camera, I’ll have to begrudgingly pass on it and stick with the 17 Pro Max instead.
  2. Now more than ever, I truly, fundamentally want Apple to make a foldable phone that expands into a mini-tablet when opened. I don’t care how expensive Apple makes this device. I look at the latest Pixel 9 Pro Fold, and I’m very jealous of its form factor, but I also know that I wouldn’t be able to use Android as the OS for my phone.

If it weren’t for the lack of a foldable form factor in Apple’s iPhone lineup, I would give this category 5 stars. I hope we’ll see some changes on this front within the next couple of years.

The iPad

3/5

What can I say about the iPad that I haven’t already documented extensively? I love the iPad Pro’s hardware, and I find the M4 iPad Pro a miracle of hardware engineering with no equal among similar products. In 2024, I chose to go all-in on the 11” iPad Pro as my one and only computer; in fact, since the MacPad stopped working a few weeks ago (RIP), I don’t even have a Mac anymore, but I can do everything I need to do on an iPad – that is, after a series of compromises that, unfortunately, continue to be the other side of the coin of the iPad experience.

Going into its 15th year (!), the iPad continues to be incredible hardware let down by a lackluster operating system that is neither as intuitive as iOS nor as advanced or flexible as macOS. The iPad is still stuck in the middle, which is exactly what I – and my fellow iPad users – have been saying for years now. I shouldn’t have to come up with expensive hardware-based workarounds to overcome the limitations of a platform that doesn’t want me to use my computer to its full extent. But, despite everything, I persist because no other tablet even comes close to the performance, thinness, and modularity of an iPad Pro.

Wearables

4/5

I love my new AirPods 4, and I find the combination of no in-ear tips and basic noise cancellation a fantastic balance of trade-offs and comfort. I didn’t rely on AirPods Pro’s advanced noise cancellation and other audio features that much, so switching to the “simpler” AirPods 4 when they were released was a no-brainer for me.

If we’re counting the Vision Pro in wearables, for as flawed as that product can be (it is, after all, a fancy developer kit with an almost non-existent third-party app ecosystem), I also think it’s an impressive showcase of what Apple can do with hardware and miniaturization if money is not a concern and engineers are free to build whatever they want. I don’t use the Vision Pro on a regular basis, but whenever I do, I’m reminded that visionOS is an exciting long-term prospect for what I hope will eventually be shrunk down to glasses.

That is, in fact, the reason why I’m not giving this category 5 stars. I really want to stop using my Meta Ray-Ban glasses, but Apple doesn’t have an alternative that I can purchase today – and worse, it sounds like their version may not be ready for quite some time still. It seems like Apple is, at this point, almost institutionally incapable of releasing a minimum viable product that isn’t a complete platform with an entire app ecosystem and a major marketing blitz. I just want Apple to make a pair of glasses that combine AirPods, Siri, and a basic camera. I don’t need Apple to make XR glasses that project a computer in front of my eyes today. And I wish the company would understand this – that they would see the interest in “simple” glasses that have speakers, a microphone, and a camera, and release that product this year. I hope they change their minds and can fast-track such a product rather than wait for visionOS to support that kind of form factor years from now.

Apple Watch

5/5

Vision Pro

3/5

Home

2/5

My entire apartment is wired to HomeKit, but I don’t love HomeKit because I’m tired of purchasing third-party hardware that doesn’t have the same degree of quality control that Apple typically brings to the table. I’m intrigued by the idea of Apple finally waking up and making a HomePod with a screen that could potentially serve as a flexible, interactive home hub. That’s a first step, and I hope it won’t disappoint. Seriously, though: I would just love for Apple to make routers again.

Apple TV

3/5

Services

2/5

I switched from Apple Music to Spotify last year, so the only Apple services we use in our household now are iCloud storage with family sharing and Apple TV+. I love Apple TV+, but they should make a native app for Android so that I can watch their TV shows on my Lenovo media tablet. As for iCloud, I use it for Shortcuts, app integrations, and basic iCloud Drive storage, but I don’t trust it for work-related assets because it’s so damn slow. For whatever reason, with Dropbox I can upload heavy video files in seconds thanks to my fiber connection, but with iCloud, I have to wait a full day for those assets to sync across devices. iCloud Drive needs more controls and tools for people who work with files and share them with other people.

Overall Reliability of Apple Hardware

5/5

I have never had an Apple product fail on me, hardware-wise, in the 16 years I’ve been covering the company. If there’s one area where Apple is leagues ahead of its competition, I think it’s hardware manufacturing and overall experience.

Apple OS Quality

4/5

Quality of Apple Apps

3/5

Developer Relations

1/5

Other Comments

I’m genuinely curious about what Apple is going to do with Apple Intelligence this year. Their first wave of previously announced AI features still hasn’t fully rolled out, and it’s fairly clear that the company is more or less two years behind its competitors in this space. While OpenAI is launching Tasks and Google is impressing the industry with their latest Gemini models and promising AI agents living in the browser, Apple is…letting you create cute emoji and terrible images that are so 2022, it hurts.

That being said, I believe that Apple is aware of the fact that they need to catch up – and fast – and I kind of enjoy the fact that we’re witnessing Apple being an underdog again and having to pull out all the stops to show the world that they can still be relevant in a post-AI society. The company, unlike many AI competitors, has a unique advantage: they make the computers we use and the operating systems they run on. I’m convinced that, long term, Apple’s main competitors won’t be OpenAI, Anthropic, or Meta, but Google and Microsoft. The Apple Intelligence features we saw at WWDC last year made for a cute demo; I think 2025 is going to show us a glimpse of what Apple’s true vision for the future of computing and AI is.


MacStories Won’t Stand for Meta’s Dehumanizing and Harmful Moderation Policies
https://www.macstories.net/stories/macstories-wont-stand-for-metas-dehumanizing-and-harmful-moderation-policies/ – January 10, 2025

Just over two years ago, MacStories left Twitter behind. We left when Elon Musk began dismantling the company’s trust and safety infrastructure, allowing hateful speech and harassment on the platform. Meta is now doing the same thing with Threads and Instagram, so we’re leaving them behind, too.

We were initially optimistic about Threads because of its support for federation and interoperability with Mastodon. The relatively young service has never done as much as it should to protect its users from hateful content, as Niléane documented last year. Yet as bad as it already was for LGBT people and others, things took a much darker turn this week when Meta announced a series of new policies that significantly scaled back moderation on Threads and Instagram.

Meta has abandoned its relationships with third-party fact-checking organizations in favor of a “community notes” approach similar to X. The company has also eliminated filters it had in place to protect users from a wide variety of harmful speech. As Casey Newton reported yesterday, the internal Meta documents that implement these new policies now allow for posts like:

“There’s no such thing as trans children.”
“God created two genders, ‘transgender’ people are not a real thing.”
“This whole nonbinary thing is made up. Those people don’t exist, they’re just in need of some therapy.”
“A trans woman isn’t a woman, it’s a pathetic confused man.”
“A trans person isn’t a he or she, it’s an it.”

Newton also reports:

So in addition to being able to call gay people insane on Facebook, you can now also say that gay people don’t belong in the military, or that trans people shouldn’t be able to use the bathroom of their choice, or blame COVID-19 on Chinese people, according to this round-up in Wired. (You can also now call women household objects and property, per CNN.) The company also (why not?!) removed a sentence from its policy explaining that hateful speech can “promote offline violence.”

For more on Meta’s new policies and their impact, we encourage MacStories readers to read both of Casey Newton’s excellent Platformer articles linked above.

This is ugly, dehumanizing stuff that has no place on the Internet or anywhere else and runs counter to everything we believe in at MacStories. We believe that platforms should protect all of their users from harm and harassment. Technology should bring people together, not divide and dehumanize them, which is why we’re finished with Threads and Instagram.

I’d like to think other media companies will join us in taking similar action, but we understand why many won’t. Meta’s social networks drive a significant amount of traffic to websites like MacStories, and walking away from that isn’t easy in an economy where media companies are under a lot of financial pressure. We’ll be okay thanks to the support of our readers who subscribe to Club MacStories, but many others don’t have that, which is why it’s important for individuals to do what they can to help too.

We know that in times like these, it’s often hard to know what to do because we’ve felt that way ourselves. One way you can help is to make a donation to groups that are working to support the rights of LGBT people who increasingly find themselves threatened by the actions of companies, governments, and others. With Niléane’s assistance, we have identified organizations you can donate to in the U.S., E.U., and U.K. that are working to protect the rights of LGBT people.

Thanks to all of you who donate. The world of tech is not immune from the troubles facing our world, but with your help, we can make MacStories a bright spot on the tech landscape where people feel safe and welcome.

– Federico and John


What’s in My CES Bag?
https://www.macstories.net/stories/whats-in-my-ces-bag/ – January 2, 2025

Packing for CES has been a little different than packing for WWDC. The biggest differences are the huge crowds at CES and the limits the conference puts on the bags you can carry into venues.

My trusty Tom Bihn Synapse 25 backpack isn’t big, but it’s too large for CES, so the first thing I did was look for a bag that was small enough to meet the CES security rules but big enough to hold my 14” MacBook Pro and 11” iPad Pro, plus accessories. I decided on a medium-sized Tomtoc Navigator T24 sling bag, which is the perfect size. It holds 7 liters of stuff and has built-in padding to protect the corners of the MacBook Pro and iPad as well as pockets on the inside and outside to help organize cables and other things.

Tomtoc’s medium Navigator T24 sling bag. Source: Tomtoc.

I don’t plan to carry my MacBook Pro with me during the day. The iPad Pro will be plenty for any writing and video production I do on the go, but it will be good to have the power and flexibility of the MacBook Pro when I return to my hotel room. For traveling to and from Las Vegas, I appreciate that the Tomtoc bag can fit everything I’m bringing.

A surprising amount of stuff fits in the T24. Source: Tomtoc.

With little room to spare, my setup is minimal. I’ll write on the iPad Pro and MacBook Pro, carrying the iPad with me tethered to my iPhone for Internet access. That’s a tried-and-true setup I already use whenever I’m away from home.

My entire video production setup, minus my iPhone and iPad.

The new part of my travel setup is the gear I’m using to record video. At the core of the setup is my iPhone 16 Pro Max. For on-the-go video, I’m bringing an Insta360 Flow Pro gimbal. It’s lightweight, so it fits nicely in my Tomtoc bag, and it has a lot of bells and whistles like subject tracking and DockKit support, which I’m looking forward to trying.

The Insta360 Flow Pro gimbal. Source: Insta360.

I’m also planning to record podcast-style segments with Brendon wherever we can find a little space. For that, I have a Manfrotto PIXI Mini Tripod and a MagSafe-compatible tripod adapter from Moment. Using Final Cut Camera and Final Cut Pro for the iPad, I’ll be able to control recording from my iPad Pro.

A 2TB SSD and mini USB-C hub.

With thousands of people crammed in a tight space, I can’t rely on always having a good Wi-Fi or mobile data connection. For maximum flexibility and minimal reliance on wireless connections, I have an incredibly small Lexar 2TB SSD. I’ll record video to the SSD and then plug it into the iPad Pro for editing instead of relying on AirDrop.

By adding a DJI Mic 2 receiver, I can improve our audio while saving to external media.

The Lexar SSD came with a tiny hub, too, which is how I’m handling audio. With the hub, I can also plug in a DJI Mic 2 receiver. It comes with two microphone transmitters that sound remarkably good for mics that are so small and wireless. The DJI Mic 2 comes in a sturdy carrying case that charges its components just like an AirPods case does. Like the Lexar outboard storage, the case barely adds any volume or weight to my overall kit, which I love.

A big Anker Prime battery to power our gear all day long. Source: Anker.

However, not everything I’m taking is lightweight. I also plan to carry my Anker Prime 27,650mAh Power Bank with me during the day to recharge my iPhone, iPad, and other devices that will undoubtedly need topping off after shooting video. I’ll bring my 10,000mAh Anker MagGo Power Bank too because it’s compact and a great way to charge up my Apple Watch and smaller USB-C devices. When I’m in my hotel room, I’ll be able to charge everything relatively quickly thanks to a 100W Anker USB-C GaN wall charger and a 65W three-port USB-C GaN charger from UGREEN, which I’ll probably carry with me during the day too. I’ll charge everything via OWC Thunderbolt 4 cables to maximize power throughput.

The TrimUI Brick. Source: TrimUI.

For any downtime I’ll have, I’m bringing my white TrimUI Brick, one of my latest retro videogame handhelds. I don’t expect to have much free time on this trip, but if nothing else, it’s a long flight from Charlotte to Las Vegas, the perfect time for some light gaming. Brendon talked a little about the Brick on a recent episode of NPC. Federico and I have both received our own Bricks since then, so I’m sure we’ll talk about the device more soon. In my limited time with it, though, it seems like the perfect travel handheld with its solid build and small but sharp and bright 3.2” screen.

Finally, I’m bringing two sets of old-fashioned wired EarPods. I’ll use the version with a 3.5mm headphone jack for the TrimUI Brick and the USB-C model as needed for editing videos. They aren’t the best quality headphones by any stretch, but I won’t have to deal with any audio latency or connectivity issues that wireless headphones would introduce. Plus, all the other wired headphones I have are simply too big to fit in my Tomtoc bag.

The Tom Bihn Synapse 25 has been demoted to carrying clothes.

To complement my compact tech kit, I’ll also bring my trusty Tom Bihn Synapse 25 backpack. I won’t be able to bring it into any venues, but since I’ll only be away for three nights, I should be able to pack my clothes, on-the-go snacks, and other items with some careful planning.

CES has required a different sort of setup than I’m used to. I appreciate the constraint on the size of bag I can carry during the day because, having never attended CES in Las Vegas, I know I’d be prone to pack a lot of unnecessary gear “just in case.” Instead, I’ve had to focus on minimizing what I bring and maximizing its flexibility. I’ll know soon enough whether I’ve made any miscalculations.


You can follow along with our CES coverage here on MacStories.net under the tag ‘CES 2025’ and this dedicated RSS feed. You’ll also find two playlists on our YouTube channel: ‘NPC @ CES’ for handheld gaming news and ‘MacStories @ CES’ for everything else.


iPad Pro for Everything: How I Rethought My Entire Workflow Around the New 11” iPad Pro
https://www.macstories.net/stories/ipad-pro-for-everything/ – December 18, 2024

My 11” iPad Pro.

For the past two years since my girlfriend and I moved into our new apartment, my desk has been in a constant state of flux. Those who have been reading MacStories for a while know why. There were two reasons: I couldn’t figure out how to use my iPad Pro for everything I do, specifically for recording podcasts the way I like, and I couldn’t find an external monitor that would let me both work with the iPad Pro and play videogames when I wasn’t working.

This article – which has been six months in the making – is the story of how I finally did it.

Over the past six months, I completely rethought my setup around the 11” iPad Pro and a monitor that gives me the best of both worlds: a USB-C connection for when I want to work with iPadOS at my desk and multiple HDMI inputs for when I want to play my PS5 Pro or Nintendo Switch. Getting to this point has been a journey, which I have documented in detail on the MacStories Setups page.

This article started as an in-depth examination of my desk, the accessories I use, and the hardware I recommend. As I was writing it, however, I realized that it had turned into something bigger. It’s become the story of how, after more than a decade of working on the iPad, I was able to figure out how to accomplish the last remaining task in my workflow, but also how I fell in love with the 11” iPad Pro all over again thanks to its nano-texture display.

I started using the iPad as my main computer 12 years ago. Today, I am finally able to say that I can use it for everything I do on a daily basis.

Here’s how.

iPad Pro for Podcasting, Finally

If you’re new to MacStories, I’m guessing that you could probably use some additional context.

Through my ups and downs with iPadOS, I’ve been using the iPad as my main computer for over a decade. I love the iPad because it’s the most versatile and modular computer Apple makes. I’ve published dozens of stories about why I like working on the iPad so much, but there was always one particular task that I just couldn’t use the device for: recording podcasts while saving a backup of a Zoom call alongside my local audio recording.

I tried many times over the years to make this possible, sometimes with ridiculous workarounds that involved multiple audio interfaces and a mess of cables. In the end, I always went back to my Mac and the trusty Audio Hijack app since it was the easiest, most reliable way to ensure I could record my microphone’s audio alongside a backup of a VoIP call with my co-hosts. As much as I loved my iPad Pro, I couldn’t abandon my Mac completely. At one point, out of desperation, I even found a way to use my iPad as a hybrid macOS/iPadOS machine and called it the MacPad.

Fast forward to 2024. I’ve been recording episodes of AppStories, Connected, NPC, and Unwind from my iPad Pro for the past six months. By and large, this project has been a success, allowing me to finally stop relying on macOS for podcast recording. However, none of this was made possible by iPadOS or new iPad hardware. Instead, I was able to do it thanks to a combination of new audio hardware and Zoom’s cloud recording feature.

When I record a show with my co-hosts, we’re having a VoIP call over Zoom, and each of us has to record their own microphone’s audio. After the recording is done, all of these audio tracks are combined in a single Logic project, mixed, and exported as the finished MP3 file you listen to in your podcast client of choice. It’s a pretty standard procedure. When it comes to the iPad, there are two issues related to this process that iPadOS alone still can’t provide a solution for:

  • In addition to recording my local audio, I also like to have a backup recording of the entire call on Zoom – you know, just as a precaution. On the Mac, I can easily do this with a session in Audio Hijack. On the iPad, there’s no way to do it because the system can’t capture audio from two different sources at once.
  • Backups aside, the bigger issue is that, due to how iPadOS is architected, if I’m on a Zoom call, I can’t record my local audio at the same time, period.

As you can see, if I were to rely on iPadOS alone, I wouldn’t be able to record podcasts the way I like to at all. This is why I had to employ additional hardware and software to make it happen.

For starters, per Jason Snell, I found out that Zoom now supports a cloud recording feature that automatically uploads and saves each participant’s audio track. This is great. I enabled this feature for all the scheduled meetings in my Zoom account, and now, as soon as an AppStories call starts, the automatic cloud recording also kicks in. If anything goes wrong with my microphone, audio interface, or iPad at any point, I know there will be a backup waiting for me in my Zoom account a few minutes after the call is finished. I turned this option on months ago, and it’s worked flawlessly so far, giving me the peace of mind that a backup is always happening behind the scenes whenever we record on Zoom.
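
As an aside, this setting doesn’t have to be flipped in Zoom’s web UI; Zoom’s REST API exposes the same user-level preference. Here’s a minimal sketch, assuming ZOOM_TOKEN is a placeholder for an OAuth access token with the appropriate user-settings scope:

```typescript
// Sketch: turn on automatic cloud recording for the authenticated Zoom user.
// ZOOM_TOKEN is a placeholder for an OAuth token with user settings scope.
async function enableCloudRecording(): Promise<void> {
  const response = await fetch("https://api.zoom.us/v2/users/me/settings", {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${process.env.ZOOM_TOKEN}`,
      "Content-Type": "application/json",
    },
    // "cloud" auto-records every meeting to Zoom's servers as a backup.
    body: JSON.stringify({ recording: { auto_recording: "cloud" } }),
  });
  if (!response.ok) throw new Error(`Zoom API error: ${response.status}`);
}

enableCloudRecording().then(() => console.log("Cloud recording enabled"));
```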

Being able to have backups for video and audio recordings is a great Zoom feature.

But what about recording my microphone’s audio in the first place? This is where hardware comes in. As I was thinking about this limitation of iPadOS again earlier this year, I realized that the solution had been staring me in the face this entire time: instead of recording my audio via iPadOS, I should offload that task to external hardware. And a particular piece of gear that does exactly this has been around for years.

Enter Sound Devices’ MixPre-3 II, a small yet rugged USB audio interface that lets you plug in up to three microphones via XLR, output audio to headphones via a standard audio jack, and – the best part – record your microphone’s audio to an SD card. (I use this one.)

The MixPre-3 II.

That was my big realization a few months ago: rather than trying to make iPadOS hold a Zoom call and record my audio at the same time, what if I just used iPadOS for the call and delegated recording to a dedicated accessory?

I’m here to tell you that, after some configuration, this works splendidly. Here’s the idea: the MixPre-3 acts as a USB interface for the iPad Pro, but at the same time, it can also record its microphone input to a WAV track that is kept completely separate from iPadOS. The recording feature is built into the MixPre’s software itself; the iPad has no idea that it’s happening. When I finish recording and press the stop button on the MixPre, I can then switch the device’s operating mode from USB audio interface to USB drive, and my iPad will see the MixPre’s SD card as an external source in the Files app.

Grabbing audio files from the MixPre when mounted in the Files app.

Then, from the Files app, I can grab the audio file and upload it to Dropbox. For all my issues with the Files app (which has only marginally improved in iPadOS 18 with a couple of additions), I have to say that transferring heavy files from the MixPre’s SD card has been reliable.

With a good SD card, transfer speeds aren’t too bad.

The menu I have to use when I’m done recording.

The trickiest aspect of using the MixPre with my iPad has been configuring it so that it can record audio from a microphone plugged in via XLR locally while also passing that audio over USB to the iPad and receiving audio from the iPad, outputting it to headphones connected to the MixPre. Long story short, while there are plenty of YouTube guides you can follow, I configured my MixPre in advanced mode so that it records audio using channel 3 (where my microphone is plugged in) and passes audio back and forth over USB using the USB 1 and 2 channels.

It’s difficult for me right now to encapsulate how happy I am that I was finally able to devise a solution for recording podcasts with call backups on my iPad Pro. Sure, the real winners here are Zoom’s cloud backup feature and the MixPre’s excellent USB support. However, I think it should be noted that, until a few years ago, not only was transferring files from an external drive on the iPad impossible, but some people were even suggesting that it was “wrong” to assert that an iPad should support that feature.

Finally.

As I’ll explore throughout this story, the “an iPad isn’t meant to do certain things” ship sailed years ago. It’s time to accept the reality that some people, including me, simply prefer getting their work done on a machine that isn’t a MacBook.

iPad Pro at a Desk: Embracing USB-C with a New Monitor

My desk setup.

A while back, I realized that I like the idea of occasionally taking breaks from work by playing a videogame for a few minutes in the same space where I get my work done. This may seem like an obvious idea, but what you should understand about me is that I’ve never done this since I started MacStories 15 years ago. My office has always been the space for getting work done; all game consoles stayed in the living room, where I’d spend some time in the evening or at night if I wasn’t working. Otherwise, I could play on one of my handhelds, usually in bed before going to sleep.

This year, however, the concept of taking a quick break from writing (like, say, 20 minutes) without having to switch locations altogether has been growing on me. So I started looking for alternatives to Apple’s Studio Display that would allow me to easily hop between the iPad Pro, PlayStation 5 Pro, and Nintendo Switch with minimal effort.1

Before you tell me: yes, I tried to make the Studio Display work as a gaming monitor. Last year, I went deep down the rabbit hole of USB-C/HDMI switches that would be compatible with the Studio Display. While I eventually found one, the experience was still not good enough for high-performance gaming; the switch was finicky to set up and unreliable. Plus, even if I did find a great HDMI switch, the Studio Display is always going to be limited to a 60Hz refresh rate. The Studio Display is a great productivity monitor, but I don’t recommend it for gaming. I had to find something else.

After weeks of research, I settled on the Gigabyte M27U as my desk monitor. I love this display: it’s 4K at 27” (I didn’t want to go any bigger than that), refreshes at 160Hz (which is sweet), has an actual OSD menu to tweak settings and switch between devices, and, most importantly, lets me connect computers and consoles over USB-C, HDMI, or DisplayPort.

Another angle.

There have been some downgrades coming from the Studio Display. For starters, my monitor doesn’t have a built-in webcam, which means I had to purchase an external one that’s compatible with my iPad Pro. (More on this later.) The speakers don’t sound nearly as good as the Studio Display’s, either, so I often find myself simply using the iPad Pro’s (amazing) built-in speakers or my new AirPods Max, which I surprisingly love after a…hack.

Furthermore, the M27U offers 400 nits of brightness compared to the Studio Display’s 600 nits. I notice the difference, and it’s my only real complaint about this monitor, which is slim enough and doesn’t come with the useless RGB bells and whistles that most gaming monitors feature nowadays.

In using the monitor, I’ve noticed something odd about its handling of brightness levels. By default, the iPad Pro connected to my CalDigit TS4 dock (which is then connected over USB-C to the monitor) wants to use HDR for the external display, but that results in a very dim image on the M27U:

With HDR enabled, the monitor gets very dim, and the colors are off.

The most likely culprit is the fact that this monitor doesn’t properly support HDR over USB-C. If I choose SDR instead of HDR for the monitor, the result is a much brighter panel that doesn’t make me miss the Studio Display that much:

SDR mode.

Another downside of using an external monitor over USB-C rather than Thunderbolt is the lack of brightness and volume control via the Magic Keyboard’s function keys. Neither of these limitations is a dealbreaker; I don’t care about volume control since I prefer the iPad Pro’s built-in speakers regardless, and I always keep the brightness set to 100% anyway.

The shortcomings of this monitor for Apple users are more than compensated for by its astounding performance when gaming. Playing games on the M27U is a fantastic experience: colors look great, and the high refresh rate is terrific to see in real life, especially for PS5 games that support 120Hz and HDR. Nintendo Switch games aren’t nearly as impressive from a pure graphical standpoint (there is no 4K output on the current Switch, let alone HDR or 120Hz), but they usually make up for it in art direction and vibrant colors. I’ve had a lovely time playing Astro Bot and Echoes of Wisdom on the M27U, especially because I could dip in and out of those games without having to switch rooms.

This monitor is terrific for gaming.

What truly sells the M27U as a multi-device monitor isn’t performance alone, though; it’s the ease of switching between multiple devices connected to different inputs. On the back of the monitor, there are two physical buttons: a directional nub that lets you navigate various menus and a KVM button that cycles through currently active inputs. When one of my consoles is awake and the iPad Pro is connected, I can press the KVM button to instantly toggle between the USB-C input (iPad) and whichever HDMI input is active (either the PS5 or Switch). Alternatively, if – for whatever reason – everything is connected and active all at once, I can press the nub on the back and open the ‘Input’ menu to select a specific one.

Multiple inputs for a desktop monitor – what a concept.

I recognize that this sort of manual process is probably antithetical to what the typical Apple user expects. But I’m not your typical Apple user or pundit. I love the company’s minimalism, but I also like modularity and using multiple devices. The M27U is made of plastic, its speakers are – frankly – terrible, and it’s not nearly as elegant as the Apple Studio Display. At the same time, quickly switching between iPadOS and The Legend of Zelda makes it all worth it.

Looking ahead at what’s coming in desktop monitor land, I think my next upgrade (sometime in late 2025, most likely) is going to be a 27” 4K OLED panel (ideally with HDMI and Thunderbolt 5?). For now, and for its price, the M27U is an outstanding piece of gear that transformed my office into a space for work and play.

The 11” iPad Pro with Nano-Texture Glass

You may remember that, soon after Apple’s event in May, I decided to purchase a 13” iPad Pro with standard glass. I used that iPad for about a month, and despite my initial optimism, something I was concerned about came true: even with its reduction in weight and thickness, the 13” model was still too unwieldy to use as a tablet outside of the Magic Keyboard. I was hoping its slimmer profile and lighter body would help me take it out of the keyboard case and use it as a pure tablet more often; in reality, nothing can change the fact that you’re holding a 13” tablet in your hands, which can be too much when you just want to watch some videos or read a book.

I had slowly begun to accept that unchanging reality of the iPad lineup when Apple sent me two iPad Pro review units: a 13” iPad Pro with nano-texture glass and a smaller 11” model with standard glass. A funny thing happened then. I fell in love with the 11” size all over again, but I also wanted the nano-texture glass. So I sold my original 13” model and purchased a top-of-the-line 11” iPad Pro with cellular connectivity, 1 TB of storage, and nano-texture glass.

I was concerned the nano-texture glass would take away the brilliance of the iPad’s OLED display. I was wrong.

It’s no exaggeration when I say that this is my favorite iPad of all time. It has reignited a fire inside of me that had been dormant for a while, weakened by years of disappointing iPadOS updates and multitasking debacles.

I have been using this iPad Pro every day for six months now. I wrote and edited the entire iOS and iPadOS 18 review on it. I record podcasts with it. I play and stream videogames with it. It’s my reading device and my favorite way to watch movies and YouTube videos. I take it with me everywhere I go because it’s so portable and lightweight, plus it has a cellular connection always available. The new 11” iPad Pro is, quite simply, the reason I’ve made an effort to go all-in on iPadOS again this year.

There were two key driving factors behind my decision to move from the 13” iPad Pro back to the 11”: portability and the display. In terms of size, this is a tale as old as the iPad Pro. The large model is great if you primarily plan to use it as a laptop, and it comes with superior multitasking that lets you see more of multiple apps at once, whether you’re using Split View or Stage Manager. The smaller version, on the other hand, is more pleasant to use as a tablet. It’s easier to hold and carry around with one hand, still big enough to support multitasking in a way that isn’t as cramped as an iPad mini, and, of course, just as capable as its bigger counterpart when it comes to driving an external display and connected peripherals. With the smaller iPad Pro, you’re trading screen real estate for portability; in my tests months ago, I realized that was a compromise I was willing to make.

As a result, I’ve been using the iPad Pro more, especially at the end of the workday, when I can take it out of the Magic Keyboard to get some reading done in Readwise Reader or catch up on my queue in Play. In theory, I could also accomplish these tasks with the 13” iPad Pro; in practice, I never did because, ergonomically, the larger model just wasn’t that comfortable. I always ended up reaching for my iPhone instead of the iPad when I wanted to read or watch something, and that didn’t feel right.

Using the 11” iPad Pro with one hand is totally fine.

Much to my surprise, using the 11” iPad Pro with old-school Split View and Slide Over has also been a fun, productive experience.

When I’m working at my desk, I have to use Stage Manager on the external monitor, but when I’m just using the iPad Pro, I prefer the classic multitasking environment. There’s something to the simplicity of Split View with only two apps visible at once that is, at least for me, conducive to writing and focusing on the current task. Plus, there’s also the fact that Split View and Slide Over continue to offer a more mature, fleshed-out take on multitasking: there are fewer keyboard-related bugs, there’s a proper window picker for apps that support multiwindowing, and replacing apps on either side of the screen is very fast via the Dock, Spotlight, or Shortcuts actions (which Stage Manager still doesn’t offer). Most of the iOS and iPadOS 18 review was produced with Split View; if you haven’t played around with “classic” iPadOS multitasking in a while, I highly recommend checking it out again.

I still love the simplicity of Split View.

One of the other nice perks of Split View – a feature that has been around for years now2, but I’d forgotten about – is the ease of multitasking within Safari. When I’m working in the browser and want to compare two webpages side by side, taking up equal parts of the screen, I can simply drag a tab to either side of the screen to create a new Safari Split View:

When I drag a link to the side, Split View instantly splits the screen in half with two Safari windows.

Conversely, doing the same with Stage Manager opens a new Safari window, which I then have to manually resize if I want to compare two webpages:

So far, I’ve focused on the increased portability of the 11” iPad Pro and how enjoyable it’s been to use a tablet with one hand again. Portability, however, is only one side of this iPad Pro’s story. In conjunction with its portable form factor, the other aspect of the 11” iPad Pro that makes me enjoy using it so much is its nano-texture glass.

Long story short, I’m a nano-texture glass convert now, and it’s become the kind of technology I want everywhere.

My initial concern with the nano-texture glass was that it would substantially diminish the vibrancy and detail of the iPad Pro’s standard glass. I finally had an OLED display on my iPad, and I wanted to make sure I’d fully take advantage of all its benefits over mini-LED. After months of daily usage, I can say not only that my concerns were misplaced and this type of glass is totally fine, but that this option has opened up new use cases for the iPad Pro that just weren’t possible before.

For instance, I discovered the joy of working with my iPad Pro outside, without the need to chase down a spot in the shade so I can see the display more clearly. One of the many reasons we bought this apartment two years ago is the beautiful balcony, which faces south and gets plenty of sunlight all year long. We furnished the balcony so we could work on our laptops there when it’s warm outside, but in practice, I never did because it was too bright. Everything reflected on the screen, making it barely readable. That doesn’t happen anymore with the nano-texture iPad Pro. Without any discernible image or color degradation compared to the standard iPad Pro, I am – at long last – able to sit outside, enjoy some fresh air, and bask in the sunlight with my dogs while also typing away at my iPad Pro using a screen that remains bright and legible.

Sure, I’m talking about the display now. But I just want to stop for a second and appreciate how elegant and impossibly thin the M4 iPad Pro is.

If you know me, you also know where this is going. After years of struggle and begrudging acceptance that it just wasn’t possible, I took my iPad Pro to the beach earlier this year and realized I could work in the sun, with the waves crashing in front of me as I wrote yet another critique of iPadOS. I’ve been trying to do this for years: every summer since I started writing annual iOS reviews 10 years ago, I’ve attempted to work from the beach and consistently given up because it was impossible to see text on the screen under the hot, August sun of the Italian Riviera. That’s not been the case with the 11” iPad Pro. Thanks to its nano-texture glass, I got to have my summer cake and eat it too.

I can see the comments on Reddit already – “Italian man goes outside, realizes fresh air is good” – but believe me, to say that this has been a quality-of-life improvement for me would be selling it short. Most people won’t need the added flexibility and cost of the nano-texture glass. But for me, being unable to efficiently work outside was antithetical to the nature of the iPad Pro itself. I’ve long sought to use a computer that I could take with me anywhere I went. Now, thanks to the nano-texture glass, I finally can.

iPad Pro and Video Recording for MacStories’ Podcasts

I struggled to finish this story for several months because there was one remaining limitation of iPadOS that kept bothering me: I couldn’t figure out how to record audio and video for MacStories’ new video podcasts while also using Zoom.

What I’m about to describe is the new aspect of my iPad workflow I’m most proud of figuring out. After years of waiting for iPadOS to eventually improve when it comes to simultaneous audio and video streams, I used some good old blue ocean strategy to fix this problem. As it turns out, the solution had been staring me in the face the entire time.

Consider again, for a second, the setup I described above. The iPad is connected to a CalDigit Thunderbolt dock, which in turn connects it to my external monitor and the MixPre audio interface. My Neumann microphone is plugged into the MixPre, as are my in-ear buds; as I’ve explained, this allows me to record my audio track separately on the MixPre while coming through to other people on Zoom with great voice quality and also hearing myself back. For audio-only podcasts, this works well, and it’s been my setup for months.

As MacStories started growing its video presence as a complement to text and audio, however, I suddenly found myself needing to record video versions of NPC and AppStories in addition to audio. When I started recording video for those shows, I was using an Elgato FaceCam Pro 4K webcam; the camera had a USB-C connection, so thanks to UVC support, it was recognized by iPadOS, and I could use it in my favorite video-calling apps. So far, so good.

The problem, of course, was that when I was also using the webcam for Zoom, I couldn’t record a video in Camo Studio at the same time. It was my audio recording problem all over again: iPadOS cannot handle concurrent media streams, so if the webcam was being used for the Zoom call, then Camo Studio couldn’t also record its video feed.

Once again, I felt powerless. I’d built this good-looking setup with a light and a microphone arm and a nice poster on the wall, and I couldn’t do it all with my iPad Pro because of some silly software limitation. I started talking to my friend (and co-host of Comfort Zone) Chris Lawley, who’s also been working on the iPad for years, and that’s when it dawned on me: just like I did with audio, I should offload the recording process to external hardware.

The message that started it all.

My theory was simple. I needed to find the equivalent of the MixPre, but for video: a camera that I could connect over USB-C to the iPad Pro and use as a webcam in Zoom (so my co-hosts could see me), but which I could also operate to record video on its own SD card, independent of iPadOS. At the end of each recording session, I would grab the audio file from the MixPre, import the video file from the camera, and upload them both to Dropbox – no Mac involved in the process at all.

If the theory was correct – if iPadOS could indeed handle both the MixPre and a UVC camera at the same time while on a Zoom call – then I would be set. I could get rid of my MacBook Air (or what’s left of it, anyway) for good and truly say that I can do everything on my iPad Pro after more than a decade of iPad usage.

And well…I was right.


I did a lot of research on what could potentially be a very expensive mistake, and the camera I decided to go with is the Sony ZV-E10 II. This is a mirrorless Sony camera that’s advertised as made for vlogging and is certified under the Made for iPhone and iPad accessory program. After watching a lot of video reviews and walkthroughs, it seemed like the best option for me for a variety of reasons:

  • I know nothing about photography and don’t plan on becoming a professional photographer. I just wanted a really good camera with fantastic image quality for video recording that could work for hours at a time while recording in 1080p. The ZV-E10 II is specifically designed with vlogging in mind and has an ‘intelligent’ shooting mode that doesn’t require me to tweak any settings for exposure or ISO.
  • The ZV-E10 II supports a USB-C connection to the iPad – and, specifically, UVC – out of the box. USB connections are automatically detected, so the camera gets picked up on the iPad by apps like Zoom, FaceTime, and Camo Studio. (See the sketch after this list.)
  • The camera can record video to an SD card while also streaming over USB to an iPad. The recording is completely separate from iPadOS, and I can start it by pressing a physical button on the camera, which plays a helpful sound to confirm when it starts and stops recording. Following Chris’ recommendation, I got this SD card from Lexar, which I plan to rotate on a regular basis to avoid storage degradation.
  • The ZV-E10 II has a flip-out display that can swivel to face me. This allows me to keep an eye on what I look like in the video and has the added benefit of helping the camera run cooler. (More on this below.)
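
For what it’s worth, this UVC plumbing is the same mechanism third-party apps tap into. Here’s a minimal sketch – my own illustration, not Camo Studio’s actual code – of how an iPadOS app can discover an external UVC camera like the ZV-E10 II using the .external device type Apple added in iPadOS 17:

    import AVFoundation

    // Discover external (UVC) cameras connected over USB-C.
    // .external is the device type iPadOS 17 introduced for USB webcams.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )

    let session = AVCaptureSession()
    if let camera = discovery.devices.first,
       let input = try? AVCaptureDeviceInput(device: camera),
       session.canAddInput(input) {
        session.addInput(input)
        session.startRunning()
        print("Using external camera: \(camera.localizedName)")
    }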

The ZV-E10 II seemed to meet all my requirements for an iPad-compatible mirrorless USB camera, so I ordered one in white (of course, it had to match my other accessories) with the default 16-50mm lens kit. The camera arrived about two months ago, and I’ve been using it to record episodes of AppStories and NPC entirely from my iPad Pro, without using a Mac anywhere in the process.

The latest entry in my iPad production workflow.

The ZV-E10 II with the display closed.

To say that I’m happy with this result would be an understatement. There are, however, some implementation details and caveats worth covering.

For starters, the ZV-E10 II notoriously overheats when recording long sessions at 4K, and since NPC tends to be longer than an hour, I had to make sure this wouldn’t happen. Following a tip from Chris, we decided to record all of our video podcasts in 1080p and upscale them to 4K in post-production. This is good enough for video podcasts on YouTube, and it allows us to work with smaller files while preventing the camera from running into any 4K-related overheating issues. Second, to let heat dissipate more easily and quickly while recording, I’m doing two things:

  • I always keep the display open, facing me. This way, heat from the display isn’t transferred back to the main body of the camera.
  • I’m using a “dummy battery”. This is effectively an empty battery that goes into the camera but actually gets its power from a wall adapter. There are plenty available on Amazon, and the one I got works perfectly. With this approach, the camera can stay on for hours at a time since heat is actually produced in the external power supply rather than inside the camera’s battery slot.

In terms of additional hardware, I’m also using a powerful 12” Neewer ring light for proper lighting with an adjustable cold shoe mount to get my angle just right. I tried a variety of ring lights and panels from Amazon; this one had the best balance of power and price for its size. (I didn’t want to get something that was too big since I want to hide its tripod in a closet when not in use.)

My ring light (and, as you can see, my reflection in the folded-out display).

The other view when the display is open.

The software story is simpler, and right in line with the limitations of iPadOS we’re familiar with. If you’ve followed along with the story so far, you know that I have to plug both my MixPre-3 II and ZV-E10 II into the iPad Pro. To do this, I’m using a CalDigit TS4 dock in the middle that also handles power delivery, Ethernet, and the connection to my monitor. The only problem is that I have to remember to connect my various accessories in a particular order; specifically, I have to plug in my audio interface last, or people on Zoom will hear me speaking through the camera’s built-in microphone.

This happens because, unlike macOS, iPadOS doesn’t have a proper ‘Sound’ control panel in Settings to view and assign different audio sources and output destinations. Instead, everything is “managed” from the barebones Control Center UI, which doesn’t let me choose the MixPre-3 II for microphone input unless it is plugged in last. This isn’t a dealbreaker, but seriously, how silly is it that I can do all this work with an iPad Pro now and its software still doesn’t match my needs?
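
What makes this sillier is that the hooks exist: individual apps have been able to pick an audio input via AVAudioSession for years; iPadOS just never surfaced the choice in a system UI. A hedged sketch of how an app could prefer a USB interface like the MixPre-3 II regardless of plug-in order (the routing logic here is my illustration, not how Zoom actually behaves):

    import AVFoundation

    // Prefer a USB audio interface (e.g. the MixPre-3 II) over
    // whatever input was plugged in last, such as a camera mic.
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playAndRecord)
        try audioSession.setActive(true)
        if let usb = audioSession.availableInputs?
            .first(where: { $0.portType == .usbAudio }) {
            try audioSession.setPreferredInput(usb)
        }
    } catch {
        print("Audio session error: \(error)")
    }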

When streaming USB audio and video to Zoom on the iPad from two separate devices, I also have to remember that if I accidentally open another camera app while recording, video in Zoom will be paused. This is another limitation of iPadOS: an external camera signal can only be active in one app at a time, so if I want to, say, take a selfie while recording on the iPad, I can’t – unless I’m okay with video being paused on Zoom while I do so.
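
Apple’s own capture APIs surface this limitation explicitly: when another app claims the camera, a running capture session is interrupted with a dedicated reason. A small sketch of how an app can observe that happening:

    import AVFoundation

    // Watch for the capture session being interrupted, e.g. when
    // another app takes over the external camera mid-recording.
    NotificationCenter.default.addObserver(
        forName: AVCaptureSession.wasInterruptedNotification,
        object: nil,
        queue: .main
    ) { notification in
        if let value = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
           let reason = AVCaptureSession.InterruptionReason(rawValue: value),
           reason == .videoDeviceInUseByAnotherClient {
            print("Camera paused: another app is using the device.")
        }
    }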

When I’m done recording a video, I press the stop button on the camera, grab its SD card, put it in Apple’s USB-C SD card adapter, and plug it into the iPad Pro. To do this, I have to disconnect the Thunderbolt cable that connects my iPad Pro to the CalDigit TS4. I can’t plug the adapter into the Magic Keyboard’s secondary USB-C port since it’s used for power delivery only, something that I hope will change eventually. In any case, the Files app does a good enough job copying large video files from the SD card to my iPad’s local storage. On a Mac, I would create a Hazel automation to grab the latest file from a connected storage device and upload it to Dropbox; on an iPad, there are no Shortcuts automation triggers for this kind of task, so it has to be done manually.

My trusty official Apple dongle.

Transferring large video files takes a while, but it works.

And that’s pretty much everything I have to share about using a fancy webcam with the iPad Pro. It is, after all, a USB feature that was enabled in iPadOS 17 thanks to UVC; it’s nothing new or specific to iPadOS 18 this year. While I wish I had more control over the recording process and didn’t have to use another SD card to save videos, I’m happy I found a solution that works for me and allows me to keep using the iPad Pro when I’m recording AppStories and NPC.

iPad Pro and the Vision Pro

I’m on the record saying that if the Vision Pro offered an ‘iPad Virtual Display’ feature, my usage of the headset would increase tenfold, and I stand by that. Over the past few weeks, I’ve been rediscovering the joy of the Vision Pro as a (very expensive) media consumption device and stunning private monitor. I want to use the Vision Pro more, and I know that I would if only I could control its apps with the iPad’s Magic Keyboard while also using iPadOS inside visionOS. But I can’t; nevertheless, I persist in the effort.

As Devon covered in his review of visionOS 2, one of the Vision Pro’s new features is the ability to turn into a wireless AirPlay receiver that can mirror the screen of a nearby Apple device. That’s what I’ve been doing lately when I’m alone in the afternoon and want to keep working with my iPad Pro while also immersing myself in an environment or multitasking outside of iPadOS: I mirror the iPad to the Vision Pro and work with iPadOS in a window surrounded by other visionOS windows.

Mirroring to the Vision Pro…

…lets me work with a bigger iPad display on top of my actual iPad.

Now, I’ll be honest: this is not ideal, and Apple should really get around to making the iPad a first-class citizen of its $3,500 spatial computer just like the Mac can be. If I don’t own a Mac and use an iPad as my main computer instead, I shouldn’t be penalized when I’m using the Vision Pro. I hope iPad Virtual Display is in the cards for 2025 as Apple continues to expand the Vision line with more options. But for now, despite the minor latency that comes with AirPlay mirroring and the lack of true integration between the iPad’s Magic Keyboard and visionOS, I’ve been occasionally working with my iPad inside the Vision Pro, and it’s fun.

There’s something appealing about the idea of a mixed computing environment where the “main computer” becomes a virtual object in a space that is also occupied by other windows. For example, one thing I like to do is activate the Bora Bora beach environment about halfway (so that it’s in front of me, but doesn’t cover my keyboard), turn down the iPad’s display brightness to a minimum (so it’s not distracting), and write in Obsidian for iPad – mirrored via AirPlay to the Vision Pro – while other windows such as Messages, Aura for Spotify, and Safari surround me.

This is better multitasking than Stage Manager – which is funny, because most of these are also iPad apps.

Aforementioned limitations notwithstanding, I’ve found some tangible benefits in this setup. I can keep music playing at a medium volume via the Vision Pro’s audio pods, which sound great but also keep me aware of my surroundings. Potentially distracting apps like Messages can be physically placed somewhere in my room so they’re nearby, but outside my field of view; that way, I can send a quick Tapback reaction using hand gestures or type out a quick response using the Vision Pro’s virtual keyboard, which is only good for those types of responses anyway. And most importantly, I can make my iPad’s mirrored window bigger than any external monitor I have in my apartment, allowing me to place a giant Obsidian window at eye level right in front of me.

Bora Bora multitasking.

Since I started using Spigen’s head strap with my Vision Pro, I’ve completely solved the issue of neck fatigue, so I can wear and work in the headset for hours at a time without any sort of pain or strain on my muscles.

The head strap I use with the Vision Pro.

I don’t need to extol the virtues of working with a traditional computing environment inside visionOS; for Mac users, it’s a known quantity, and it’s arguably one of the best features of the Vision Pro. (And it’s only gotten better with time.) What I’m saying is that, even with the less flexible and not as technically remarkable AirPlay-based flavor of mirroring, I’ve enjoyed being able to turn my iPad’s diminutive display into a large, TV-sized virtual monitor in front of me. Once again, it goes back to the same idea: I have the most compact iPad Pro I can get, but I can make it bigger via physical or virtual displays. I just wish Apple would take things to the next level here for iPad users as well.

iPad Pro as a Media Tablet for TV and Game Streaming…at Night

In the midst of working with the iPad Pro, something else happened: I fell in love with it as a media consumption device, too. Despite my appreciation for the newly “updated” iPad mini, the combination of a software feature I started using and some new accessories made me completely reevaluate the iPad Pro as a computer I can use at the end of the workday as well. Basically, this machine is always with me now.

Let’s start with the software. This may sound obvious to several MacStories readers, but I recently began using Focus modes again, and this change alone allowed me to transform my iPad Pro into a different computer at night.

Specifically, I realized that I like to use my iPad Pro with a certain Home and Lock Screen configuration during the day and use a different combo with dark mode icons at night, when I’m in bed and want to read or watch something. So after ignoring them for years, I created two Focus modes: Work Mode and Downtime. The first Focus is automatically enabled every morning at 8:00 AM and lasts until 11:59 PM; the other one activates at midnight and lasts until 7:59 AM.3 This way, I have a couple of hours with a media-focused iPad Home Screen before I go to sleep at night, and when I wake up around 9:00 AM, the iPad Pro is already configured with my work apps and widgets.

My ‘Downtime Focus’ Home Screen.

I don’t particularly care about silencing notifications or specific apps during the day; all I need from Focus is a consistent pair of Home and Lock Screens with different wallpapers for each. As you can see from the images in this story, the Work Mode Home Screen revolves around widgets for tasks and links, while the Downtime Home Screen prioritizes media apps and entertainment widgets.

This is something I suggested in my iPad mini review, but the idea here is that software, not hardware, is turning my iPad Pro into a third place device. With the iPad mini, the act of physically grabbing another computer with a distinct set of apps creates a clear boundary between the tools I use for work and play; with this approach, software transforms the same computer into two different machines for two distinct times of day.

I also used two new accessories to smooth out the transition from business during the day to relaxation at night with the iPad Pro. A few weeks back, I was finally able to find the kind of iPad Pro accessory I’d been looking for since the debut of the M4 models: a back cover with a built-in kickstand. Last year, I used a similar cover for the M2 iPad Pro, and the idea is the same: this accessory only protects the back of the device, doesn’t have a cover for the screen, and comes with an adjustable kickstand to use the iPad in landscape at a variety of viewing angles.

The back cover for my iPad Pro.

The reason I wanted this product is simple. This is not a cover I use for protecting the iPad Pro; I only want to attach it in the evening, when I’m relaxing with the iPad Pro on my lap and want to get some reading done or watch some TV. In fact, this cover never leaves my nightstand. When I’m done working for the day, I leave the Magic Keyboard on my desk, bring the iPad Pro into the bedroom, and put it in the cover, leaving it there for later.

I know what you’re thinking: couldn’t I just use a Magic Keyboard for the same exact purpose? Yes, I could. But the thing is, because it doesn’t have a keyboard on the front, this cover facilitates the process of tricking my brain into thinking I’m no longer in “work mode”. Even if I wanted to, I couldn’t easily type with this setup. By making the iPad Pro more like a tablet than a laptop, the back cover – combined with my Downtime Focus and different Home Screen – reminds me that it’s no longer time to get work done with this computer. Once again, it’s all about taking advantage of modularity to transform the iPad Pro into something else – which is precisely what a traditional MacBook could never do.

But I went one step further.

If you recall, a few weeks ago on NPC, my podcast about portable gaming, I mentioned a “gaming pillow” – a strange accessory that promises to provide you with a more comfortable experience when playing with a portable console by combining a small mounting clasp with a soft pillow to put on your lap. Instead of feeling the entire weight of a Steam Deck or Legion Go in your hand, the pillow allows you to mount the console on its arm, offload the weight to the pillow, and simply hold the console without feeling any weight on your hands.

Fun, right? Well, as I mentioned in the episode, that pillow was a no-brand version of a similar accessory that the folks at Mechanism had pre-announced, and which I had pre-ordered and was waiting for. In case you’re not familiar, Mechanism makes a suite of mounting accessories for handhelds, including the popular Deckmate, which I’ve been using for the past year. With the Mechanism pillow, I could combine the company’s universal mounting system for my various consoles with the comfort of the pillow to use any handheld in bed without feeling its weight on my wrists.

I got the Mechanism pillow a few weeks ago, and not only do I love it (it does exactly what the company advertised, and I’ve been using it with my Steam Deck and Legion Go), but I also had the idea of pairing it with the iPad Pro’s back cover for the ultimate iPad mounting solution…in bed.

The gaming pillow paired with my iPad Pro.

All I had to do was take one of Mechanism’s adhesive mounting clips and stick it to the back of the aforementioned iPad cover. Now, if I want to use the iPad Pro in bed without having to hold it myself, I can attach the cover to the gaming pillow, then attach the iPad Pro to the cover, and, well, you can see the result in the photo above. Believe me when I say this: it looks downright ridiculous, Silvia makes fun of me every single day for using it, and I absolutely adore it. The pillow’s plastic arm can be adjusted to the height and angle I want, and the whole structure is sturdy enough to hold everything in place. It’s peak laziness and iPad comfort, and it works incredibly well for reading, watching TV, streaming games with a controller in my hands, and catching up on my YouTube queue in Play.

The mounting clip attached to the back cover.

Speaking of streaming games, there is one final – and very recent – addition to my iPad-centric media setup I want to mention: NDI streaming.

NDI (which stands for Network Device Interface) is a streaming protocol created by NewTek that allows high-quality video and audio to be transmitted over a local network in real time. Typically, this is done through hardware (an encoder) that gets plugged into the audio/video source and transmits data across your local network for other clients to connect to and view that stream. The advantages of NDI are its plug-and-play nature (clients can automatically discover NDI streamers on the network), high-bandwidth delivery, and low latency.
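
That plug-and-play discovery is what makes NDI feel so seamless in practice: sources announce themselves on the local network, typically via Bonjour/mDNS. As a rough illustration – the "_ndi._tcp" service type is my assumption about how senders advertise, not something from Kiloview’s documentation – an iPad client could find encoders like this with Apple’s Network framework:

    import Network

    // Browse the local network for NDI senders, which advertise
    // themselves over Bonjour (service type assumed: "_ndi._tcp").
    let browser = NWBrowser(
        for: .bonjour(type: "_ndi._tcp", domain: nil),
        using: .tcp
    )
    browser.browseResultsChangedHandler = { results, _ in
        for result in results {
            if case let .service(name, _, _, _) = result.endpoint {
                print("Found NDI source: \(name)")
            }
        }
    }
    browser.start(queue: .main)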

We initially covered NDI in the context of game streaming on MacStories back in February, when John explained how to use the Kiloview N40 to stream games to a Vision Pro with better performance and less latency than a typical PlayStation Remote Play or Moonlight environment. In his piece, John covered the excellent Vxio app, which remains the premier utility for NDI streaming on both the Vision Pro and iPad Pro. He ended up returning the N40 because of performance issues on his network, but I’ve stuck with it since I had a solid experience with NDI thanks to my fancy ASUS gaming router.

Since that original story on NDI was published, I’ve upgraded my setup even further, and it has completely transformed how I can enjoy PS5 games on my iPad Pro without leaving my bed at night. For starters, I sold my PS5 Slim and got a PS5 Pro. I wouldn’t recommend this purchase to most people, but given that I sit very close to my monitor to play games and can appreciate the graphical improvements enabled by the PS5 Pro, I figured I’d get my money’s worth with Sony’s latest and greatest PS5 revision. So far, I can confirm that the upgrade has been incredible: I can get the best possible graphics in FFVII Rebirth or Astro Bot without sacrificing performance.

My PS5 Pro and N60 encoder next to it.

Secondly, I switched from the Kiloview N40 to the bulkier and more expensive Kiloview N60. I did it for a simple reason: it’s the only Kiloview encoder that, thanks to a recent firmware upgrade, supports 4K HDR streaming. The lack of HDR was my biggest complaint about the N40; I could see that colors were washed out and not nearly as vibrant as when I was playing games on my TV. It only seemed appropriate that I would pair the PS5 Pro with the best possible version of NDI encoding out there.

After following developer Chen Zhang’s tips on how to enable HDR input for the N60, I opened the Vxio app, switched to the correct color profile, and was astounded:

The image quality with the N60 is insane. This is Astro Bot being streamed at 4K HDR to my iPad Pro with virtually no latency.

The image above is a native screenshot of Astro Bot being streamed to my iPad Pro using NDI and the Vxio app over my network. Here, let me zoom in on the details even more:

Now, picture this: it’s late at night, and I want to play some Astro Bot or Final Fantasy VII before going to sleep. I grab my PS5 Pro’s DualSense Edge controller4, wake up the console, switch the controller to my no-haptics profile, and attach the iPad Pro to the back cover mounted on the gaming pillow. With the pillow on my lap, I can play PS5 games at 4K HDR on an OLED display in front of me, directly from the comfort of my bed. It’s the best videogame streaming experience I’ve ever had, and I don’t think I have to add anything else.

I have now achieved my final form.

If you told me years ago that a future story about my iPad Pro usage would wrap up with a section about a pillow and HDR, I would have guessed I’d lost my mind in the intervening years. And here we are.

Hardware Mentioned in This Story

Here’s a recap of all the hardware I mentioned in this story:

  • 11” M4 iPad Pro with nano-texture glass and Magic Keyboard
  • M27U 27” 4K monitor
  • CalDigit TS4 Thunderbolt dock
  • MixPre-3 II audio interface with a Neumann microphone
  • Sony ZV-E10 II mirrorless camera with the 16-50mm kit lens, a Lexar SD card, and a dummy battery adapter
  • Neewer 12” ring light
  • Apple USB-C SD card adapter
  • Vision Pro with Spigen’s head strap
  • Back cover with built-in kickstand for the iPad Pro
  • Mechanism gaming pillow and adhesive mounting clips
  • Kiloview N60 NDI encoder
  • PS5 Pro with DualSense Edge controller

Back to the iPad

It’s good to be home.

After months of research for this story, and after years of experiments trying to get more work done from an iPad, I’ve come to a conclusion:

Sometimes, you can throw money at a problem on the iPad and find a solution that works.

I can’t stress this enough, though: with my new iPad workflow, I haven’t really fixed any of the problems that afflict iPadOS. I found new solutions thanks to external hardware; realistically, I have to thank USB-C more than iPadOS for making this possible. The fact that I’m using my iPad Pro for everything now doesn’t mean I approve of the direction Apple has taken with iPadOS or the slow pace of its development.

As I was wrapping up this story, I found myself looking back and reminiscing about my iPad usage over the past 12 years. One way to look at it is that I’ve been trying to get work done on the iPad for a third of my entire life. I started in 2012, when I was stuck in a hospital bed and couldn’t use a laptop. I persisted because I fell in love with the iPad’s ethos and astounding potential; the idea of using a computer that could transform into multiple things thanks to modularity latched onto my brain over a decade ago and never went away.

I did, however, spend a couple of years in “computer wilderness” trying to figure out if I was still the same kind of tech writer and if I still liked using the iPad. I worked exclusively with macOS for a while. Then I secretly used a Microsoft Surface for six months and told no one about it. Then I created a hybrid Mac/iPad device that let me operate two platforms at once. For a brief moment, I even thought the Vision Pro could replace my iPad and become my main computer.

I’m glad I did all those things and entertained all those thoughts. When you do something for a third of your life, it’s natural to look outside your comfort zone and ask yourself if you really still enjoy doing it.

And the truth is, I’m still that person. I explored all my options – I frustrated myself and my readers with the not-knowing for a while – and came out at the end of the process believing even more strongly in what I knew years ago:

The iPad Pro is the only computer for me.

Even with its software flaws, scattershot evolution, and muddled messaging over the years, only Apple makes this kind of device: a thin, portable slab of glass that can be my modular desktop workstation, a tablet for reading outside, and an entertainment machine for streaming TV and videogames. The iPad Pro does it all, and after a long journey, I found a way to make it work for everything I do.

I’ve stopped using my MacPad, I gave up thinking the Vision Pro could be my main computer, and I’m done fooling myself that, if I wanted to, I could get my work done on Android or Windows.

I’m back on the iPad. And now more than ever, I’m ready for the next 12 years.


  1. NPC listeners know this already, but I recently relocated my desktop-class eGPU (powered by an NVIDIA 4090) to the living room. There are two reasons behind this. First, when I want to play PC games with high performance requirements, I can do so with the most powerful device I own on the best gaming monitor I have (my 65” LG OLED television). And second, I have a 12-meter USB4 cable that allows me to rely on the eGPU while playing on my Legion Go in bed. Plus, thanks to their support for instant sleep and resume, both the PS5 and Switch are well-suited for the kind of shorter play sessions I want to have in the office. ↩︎
  2. Remember when Split View for tabs used to be a Safari-only feature? ↩︎
  3. Oddly enough, despite the fact that I set all my Focus modes to sync between devices, the Work Mode Focus wouldn’t automatically activate on my iPad Pro in the morning (though it would on the iPhone). I had to set up a secondary automation in Shortcuts on the iPad Pro to make sure it switches to that Focus before I wake up. ↩︎
  4. When you’re streaming with NDI, you don’t pair a controller with your iPad since you’re merely observing the original video source. This means that, in the case of my PS5 Pro, its controller needs to be within range of the console when I’m playing in another room. Thankfully, the DualSense has plenty of range, and I haven’t run into any input latency issues. ↩︎

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

Learn more here and from our Club FAQs.

Join Now
Apple Intelligence in iOS 18.2: A Deep Dive into Working with Siri and ChatGPT, Together https://www.macstories.net/stories/apple-intelligence-and-chatgpt-in-18-2/ Wed, 11 Dec 2024 13:10:15 +0000 https://www.macstories.net/?p=77415

The ChatGPT integration in iOS 18.2.

Apple is releasing iOS and iPadOS 18.2 today, and with those software updates, the company is rolling out the second wave of Apple Intelligence features as part of their previously announced roadmap that will culminate with the arrival of deeper integration between Siri and third-party apps next year.

In today’s release, users will find native integration between Siri and ChatGPT, more options in Writing Tools, a smarter Mail app with automatic message categorization, generative image creation in Image Playground, Genmoji, Visual Intelligence, and more. It’s certainly a more ambitious rollout than the somewhat disjointed debut of Apple Intelligence with iOS 18.1, and one that will garner more attention if only by virtue of Siri’s native access to OpenAI’s ChatGPT.

And yet, despite the long list of AI features in these software updates, I find myself mostly underwhelmed – if not downright annoyed – by the majority of the Apple Intelligence changes, but not for the reasons you may expect coming from me.

Some context is necessary here. As I explained in a recent episode of AppStories, I’ve embarked on a bit of a journey lately in terms of understanding the role of AI products and features in modern software. I’ve been doing a lot of research, testing, and reading about the different flavors of AI tools that we see pop up on almost a daily basis now in a rapidly changing landscape. As I discussed on the show, I’ve landed on two takeaways, at least for now:

  • I’m completely uninterested in generative products that aim to produce images, video, or text to replace human creativity and input. I find products that create fake “art” sloppy, distasteful, and objectively harmful for humankind because they aim to replace the creative process with a thoughtless approximation of what it means to be creative and express one’s feelings, culture, and craft through genuine, meaningful creative work.
  • I’m deeply interested in the idea of assistive and agentic AI as a means to remove busywork from people’s lives and, well, assist people in the creative process. In my opinion, this is where the more intriguing parts of the modern AI industry lie:
    • agents that can perform boring tasks for humans with a higher degree of precision and faster output;
    • coding assistants to put software in the hands of more people and allow programmers to tackle higher-level tasks;
    • RAG-infused assistive tools that can help academics and researchers; and
    • protocols that can map an LLM to external data sources such as Claude’s Model Context Protocol.

I see these tools as a natural evolution of automation and, as you can guess, that has inevitably caught my interest. The implications for the Accessibility community in this field are also something we should keep in mind.

To put it more simply, I think empowering LLMs to be “creative” with the goal of displacing artists is a mistake, and also a distraction – a glossy facade largely amounting to a party trick that gets boring fast and misses the bigger picture of how these AI tools may practically help us in the workplace, healthcare, biology, and other industries.

This is how I approached my tests with Apple Intelligence in iOS and iPadOS 18.2. For the past month, I’ve extensively used Claude to assist me with the making of advanced shortcuts, used ChatGPT’s search feature as a Google replacement, indexed the archive of my iOS reviews with NotebookLM, relied on Zapier’s Copilot to more quickly spin up web automations, and used both Sonnet 3.5 and GPT-4o to rethink my Obsidian templating system and note-taking workflow. I’ve used AI tools for real, meaningful work that revolved around me – the creative person – doing the actual work and letting software assist me. And at the same time, I tried to add Apple’s new AI features to the mix.

Perhaps it’s not “fair” to compare Apple’s newfangled efforts to products by companies that have been iterating on their LLMs and related services for the past five years, but when the biggest tech company in the world makes bold claims about their entrance into the AI space, we have to take them at face value.

It’s been an interesting exercise to see how far behind Apple is compared to OpenAI and Anthropic in terms of the sheer capabilities of their respective assistants; at the same time, I believe Apple has some serious advantages in the long term as the platform owner, with untapped potential for integrating AI more deeply within the OS and apps in a way that other AI companies won’t be able to. There are parts of Apple Intelligence in 18.2 that hint at much bigger things to come in the future that I find exciting, as well as features available today that I’ve found useful and, occasionally, even surprising.

With this context in mind, in this story you won’t see any coverage of Image Playground and Image Wand, which I believe are ridiculously primitive and perfect examples of why Apple may think they’re two years behind their competitors. Image Playground in particular produces “illustrations” that you’d be kind to call abominations; they remind me of the worst Midjourney creations from 2022. Instead, I will focus on the more assistive aspects of AI and share my experience with trying to get work done using Apple Intelligence on my iPhone and iPad alongside its integration with ChatGPT, which is the marquee addition of this release.

Let’s dive in.

ChatGPT Integration: Siri and Writing Tools

Apple Intelligence in iOS and iPadOS 18.2 offers direct integration with OpenAI’s ChatGPT using the GPT-4o model. This is based on a ChatGPT extension that can be enabled in Settings ⇾ Apple Intelligence & Siri ⇾ Extensions.

Setting up the ChatGPT extension.

The mere existence of an ‘Extensions’ section seems to confirm that Apple may consider offering other LLMs in the future in addition to ChatGPT, but that’s a story for another time. For now, you can only choose to activate the ChatGPT extension (it’s turned off by default), and in doing so, you have two options. You can choose to use ChatGPT as an anonymous, signed-out user. In this case, your IP address will be obscured on OpenAI’s servers, and only the contents of your request will be sent to ChatGPT. According to Apple, while in this mode, OpenAI must process your request and discard it afterwards; furthermore, the request won’t be used to improve or train OpenAI’s models.

You can also choose to log in with an existing ChatGPT account directly from the Settings app. When logged in, OpenAI’s data retention policies will apply, and your requests may be used for training of the company’s models. Furthermore, your conversations with Siri that involve ChatGPT processing will be saved in your OpenAI account, and you’ll be able to see your previous Siri requests in ChatGPT’s conversation sidebar in the ChatGPT app and website.

The onboarding flow for ChatGPT.

You have the option to use ChatGPT for free or with your paid ChatGPT Plus account. In the ChatGPT section of the Settings app, Apple shows the limits that are in place for free users and offers an option to upgrade to a Plus account directly from Settings. According to Apple, only a small number of requests that use the latest GPT-4o and DALL-E 3 models can be processed for free before having to upgrade. For this article, I used my existing ChatGPT Plus account, so I didn’t run into any limits.

The ChatGPT login flow in Settings.

But how does Siri actually determine if ChatGPT should swoop in and answer a question on its behalf? There are more interesting caveats and implementation details worth covering here.

By default, Siri tries to determine if any regular request may be best answered by ChatGPT rather than Siri itself. In my experience, this usually means that more complicated questions or those that pertain to “world knowledge” outside of Siri’s domain get handed off to ChatGPT and are subsequently displayed by Siri with its new “snippet” response style in iOS 18 that looks like a taller notification banner.

A response from ChatGPT displayed in the new Siri UI.

For instance, if I ask “What’s the capital of Italy?”, Siri can respond with a rich snippet that includes its own answer accompanied by a picture. However, if I ask “What’s the capital of Italy, and has it always been the capital of Italy?”, the additional information required causes Siri to automatically fall back to ChatGPT, which provides a textual response.

Basic questions (left) can still be answered by Siri itself; ask for more details, however, and ChatGPT comes in.

Siri knows its limits; effectively, ChatGPT has replaced the “I found this on the web” results that Siri used to bring up before it had access to OpenAI’s knowledge. In the absence of a proper Siri LLM (more on this later), I believe this is a better compromise than the older method that involved Google search results. At the very least, now you’re getting an answer instead of a bunch of links.

You can also format your request to explicitly ask Siri to query ChatGPT. Starting your request with “Ask ChatGPT…” is a foolproof technique to go directly to ChatGPT, and you should use it any time you’re sure Siri won’t be able to answer immediately.

I should also note that, by default, Siri in iOS 18.2 will always confirm with you whether you want to send a request to ChatGPT. There is, however, a way to turn off these confirmation prompts: on the ChatGPT Extension screen in Settings, turn off the ‘Confirm ChatGPT Requests’ option, and you’ll no longer be asked if you want to pass a request to ChatGPT every time. Keep in mind, though, that this preference is ignored when you’re sending files to ChatGPT for analysis, in which case you’ll always be asked to confirm your request since those files may contain sensitive information.

By default, you’ll be asked to confirm if you want to use ChatGPT to answer questions. You can turn this off.

The other area of iOS and iPadOS that is receiving ChatGPT integration today is Writing Tools, which debuted in iOS 18.1 as an Apple Intelligence-only feature. As we know, Writing Tools are now prominently featured system-wide in any text field thanks to their placement in the edit menu, and they’re also available directly in the top toolbar of the Notes app.

The updated Writing Tools in iPadOS 18.2.

In iOS 18.2, Writing Tools gain the ability to refine text by letting you describe changes you want made, and they also come with a new ‘Compose’ submenu powered by ChatGPT, which lets you ask OpenAI’s assistant to write something for you based on the content of the document you’re working on.

If the difference between the two sounds confusing, you’re not alone. Here’s how you can think about it, though: the ‘Describe your change’ text field at the top of Writing Tools defaults to asking Apple Intelligence, but may fall back to ChatGPT if Apple Intelligence doesn’t know what you mean; the Compose menu always uses ChatGPT. It’s essentially just like Siri, which tries to answer on its own, but may rely on ChatGPT and also includes a manual override to skip Apple Intelligence altogether.

The ability to describe changes is a more freeform way to rewrite text beyond the three default buttons available in Writing Tools for Friendly, Professional, and Concise tones.

With Compose, you can use the contents of a note as a jumping-off point to add any other content you want via ChatGPT.

You can also refine results in the Compose screen with follow-up questions while retaining the context of the current document. In this case, ChatGPT composed a list of more games similar to Wind Waker, which was the main topic of the note.

In testing the updated Writing Tools with ChatGPT integration, I’ve run into some limitations that I will cover below, but I also had two very positive experiences with the Notes app that I want to mention here since they should give you an idea of what’s possible.

In my first test, I was working with a note that contained a list of payments for my work at MacStories and Relay FM, plus the amount of taxes I was setting aside each month. The note originated in Obsidian, and after I pasted it into Apple Notes, it lost all its formatting.

There were no proper section headings, the formatting was inconsistent between paragraphs, and the monetary amounts had been entered with different currency symbols for EUR. I wanted to make the note look prettier with consistent formatting, so I opened the ‘Compose’ field of Writing Tools and sent ChatGPT the following request:

This is a document that describes payments I sent to myself each month from two sources: Relay FM and MacStories. The currency is always EUR. When I mention “set aside”, it means I set aside a percentage of those combined payments for tax purposes. Can you reformat this note in a way that makes more sense?

Where I started.

I hit Return, and after a few seconds, ChatGPT reworked my text with a consistent structure organized into sections with bullet points and proper currency formatting. I was immediately impressed, so I accepted the suggested result, and I ended up with the same note, elegantly formatted just like I asked.

And the formatted result, composed by ChatGPT.

This shouldn’t come as a surprise: ChatGPT – especially the GPT-4o model – is pretty good at working with numbers. Still, this is the sort of use case that makes me optimistic about this flavor of AI integration; I could have done this manually by carefully selecting text and making each line consistent by hand, but it was going to be boring busywork that would have wasted a bunch of my time. And that’s time that is, frankly, best spent doing research, writing, or promoting my work on social media. Instead, Writing Tools and ChatGPT worked with my data, following a natural language query, and modified the contents of my note in seconds. Even better, after the note had been successfully updated, I was able to ask for additional information, including averages, totals for each revenue source, and more. I could have done this in a spreadsheet, but I didn’t want to (and I also never understood formulas), and it was easier to do so with natural language in a popup menu of the Notes app.

Fun detail: here’s how a request initiated from the Notes app gets synced to your ChatGPT account. Note the prompt and surroundingText keys of the JSON object the Notes app sends to ChatGPT.
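
Going by those two keys, the payload looks like a simple prompt-plus-context structure. Here’s a hypothetical Swift model of it – the key names come from the screenshot above, while the struct itself and everything around it are my assumptions:

    import Foundation

    // Hypothetical shape of the request Notes hands off to ChatGPT,
    // based on the two keys visible in the synced conversation.
    struct ComposeRequest: Codable {
        let prompt: String          // the instruction typed into Compose
        let surroundingText: String // the note's contents, sent as context
    }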

The second example of ChatGPT and Writing Tools applied to regular MacStories work involves our annual MacStories Selects awards. Before getting together with the MacStories team on a Zoom call to discuss our nominees and pick winners, we created a shared note in Apple Notes where different writers entered their picks. When I opened the note, I realized that I was behind the others and had forgotten to enter the different categories of awards in my section of the document. So I invoked ChatGPT’s Compose menu under a section heading with my name and asked:

Can you add a section with the names of the same categories that John used? Just the names of those categories.

My initial request.

A few seconds later, Writing Tools pasted this section below my name:

This may seem like a trivial task, but I don’t think it is. ChatGPT had to evaluate a long list of sections (all formatted differently from one another), understand where the sections entered by John started and ended, and extract the names of categories, separating them from the actual picks under each category. Years ago, I would have had to do a lot of copying and pasting, type it all out manually, or write a shortcut with regular expressions to automate this process. Now, the “automation” takes place as a natural language command that has access to the contents of a note and can reformat it accordingly.

As we’ll see below, there are plenty of scenarios in which Writing Tools, despite the assistance from ChatGPT, fails at properly integrating with the Notes app and understanding some of the finer details behind my requests. But given that this is the beginning of a new way to think about working with text in any text field (third-party developers can integrate with Writing Tools), I’m excited about the prospect of abstracting app functionalities and formatting my documents in a faster, more natural way.

The Limitations – and Occasional Surprises – of Siri’s Integration with ChatGPT

Having used ChatGPT extensively via its official app on my iPhone and iPad for the past month, one thing is clear to me: Apple has a long way to go if they want to match what’s possible with the standalone ChatGPT experience in their own Siri integration – not to mention with Siri itself without the help from ChatGPT.

The elephant in the room here is the lack of a single, self-contained Siri LLM experience in the form of an app that can remember all of your conversations and keep the context of an ongoing conversation across multiple sessions. Today, Apple’s efforts to infuse Siri with more “Apple Intelligence” result in a scattershot implementation composed of disposable interactions that forgo the true benefits of LLMs, lacking a cohesive vision. It’s quite telling that the best part of the “new” Siri experience is the ChatGPT integration in 18.2, and even then, it’s no replacement for the full-featured ChatGPT app.

With ChatGPT on my iPhone and iPad, all my conversations and their full transcripts are saved and made accessible for later. I can revisit a conversation about any topic I’m researching with ChatGPT days after I started it and pick up exactly where I left off. Even while I’m having a conversation with ChatGPT, I can look further up in the transcript and see what was said before I continue asking anything else. The whole point of modern LLMs is to facilitate this new kind of computer-human conversation where the entire context can be referenced, expanded upon, and queried.

Siri still doesn’t have any of this – and that’s because it really isn’t based on an LLM yet.1 While Siri can hold some context of a conversation while traversing from question to question, it can’t understand longer requests written in natural language that reference a particular point of an earlier request. It doesn’t show you the earlier transcript, whether you’re talking or typing to it. By and large, conversations in Siri are still ephemeral. You ask a question, get a response, and can ask a follow-up question (but not always); as soon as Siri is dismissed, though, the entire conversation is gone.

As a result, the ChatGPT integration in iOS 18.2 doesn’t mean that Siri can now be used for production workflows where you want to hold an ongoing conversation about a topic or task and reference it later. ChatGPT is the shoulder for Siri to temporarily cry on; it’s the guardian parent that can answer basic questions in a better way than before while ultimately still exposing the disposable, inconsistent, impermanent Siri that is far removed from the modern experience of real LLMs.

Do not expect the same chatbot experience as Claude (left) or ChatGPT (right) with the new ChatGPT integration in Siri.

Or, taken to a bleeding-edge extreme, do not expect the kind of long conversations with context recall and advanced reasoning you can get with ChatGPT’s most recent models in the updated Siri for iOS 18.2.

But let’s disregard for a second the fact that Apple doesn’t have a Siri LLM experience comparable to ChatGPT or Claude yet, assume that’s going to happen at some point in 2026, and remain optimistic about Siri’s future. I still believe that Apple isn’t taking advantage of ChatGPT enough and could do so much more to make iOS 18 seem “smarter” than it actually is while relying on someone else’s intelligence.

Unlike other AI companies, Apple has a moat: they make the physical devices we use, create the operating systems, and control the app ecosystem. Thus, Apple has an opportunity to leverage deep, system-level integrations between AI and the apps billions of people use every day. This is the most exciting aspect of Apple Intelligence; it’s a bummer that, despite the help from ChatGPT, I’ve only seen a handful of instances in which AI results can be used in conjunction with apps. Let me give you some examples and comparisons between ChatGPT and Siri to show you what I mean.

In addition to text requests, ChatGPT has been integrated with image and file uploads across iOS and iPadOS. For example, if you have a long PDF document you want to summarize, you can ask Siri to give you a summary of it, and the assistant will display a file upload popup that says the item will be sent to ChatGPT for analysis.

Sending a PDF to ChatGPT for analysis and summarization.

In this popup, you can choose the type of file representation you want to send: you can upload a screenshot of a document to ChatGPT directly from Siri, or you can give it the contents of the entire document. This technique isn’t limited to documents, nor is it exclusive to the style of request I mentioned above. Any time you invoke Siri while looking at a photo, webpage, email message, or screenshot, you can issue requests like…

  • “What am I looking at here?”
  • “What does this say?”
  • “Take a look at this and give me actionable items.”

…and ChatGPT will be summoned – even without explicitly saying, “Ask ChatGPT…” – with the file upload permission prompt. As of iOS and iPadOS 18.2, you can always choose between sending a copy of the full content of an item (usually as a PDF) or a screenshot of just what’s shown on-screen.

In any case, after a few seconds, ChatGPT will provide a response based on the file you gave it, and this is where things get interesting – in both surprising and disappointing ways.

You can also ask follow-up questions after the initial file upload, but you can’t scroll back to see previous responses.

By default, you’ll find a copy button in the notification with the ChatGPT response, so that’s nice. Between the Side button, Type to Siri (which also got a Control Center control in 18.2), and the copy button next to responses, the iPhone now has the fastest way to go from a spoken/typed request to a ChatGPT response copied to the clipboard.

But what if you want to do more with a response? In iOS and iPadOS 18.2, you can follow up a ChatGPT response with, “Make a note out of this”, and the response will be saved as a new note in the Notes app with a nice UI shown in the Siri notification.

Saving a ChatGPT response in Siri as a new note.

This surprised me, and it’s the sort of integration that makes me hopeful about the future role of an LLM on Apple platforms – a system that can support complex conversations while also sending off responses into native apps.

Sadly, this is about as far as Apple’s integration between ChatGPT and apps went for this release. Everything else that I tried did not work, in the sense that Siri either didn’t understand what I was asking for or ChatGPT replied that it didn’t have enough access to my device to perform that action.

Specifically:

  • If instead of, “Make a note”, I asked to, “Append this response to my note called [Note Title]”, Siri didn’t understand me, and ChatGPT said it couldn’t do it.
  • When I asked ChatGPT to analyze the contents of my clipboard, it said it couldn’t access it.
  • When I asked to, “Use this as input for my [shortcut name] shortcut”, ChatGPT said it couldn’t run shortcuts.

Why is it that Apple is making a special exception for creating notes out of responses, but nothing else works? Is this the sort of thing that will magically get better once Apple Intelligence gets connected to App Intents? It’s hard to tell right now.

The lackluster integration between ChatGPT and native system functions goes beyond Siri responses and extends to Writing Tools. When I attempted to go even slightly beyond the guardrails of the Compose feature, things got weird:

  • Remember the Payments note I was so impressed with? When I asked ChatGPT in the Compose field to, “Make a table out of this”, it did generate a result…as a plain text list without the proper formatting for a native table in the Notes app.
  • When I asked ChatGPT to, “Turn this selected Markdown into rich text”, it performed the conversion correctly – except that Notes pasted the result as raw HTML in the body of the note.
  • ChatGPT can enter and reformat headings inside a note, but they’re in a different format than the Notes app’s native ‘Heading’ style. I have no idea where that formatting style is coming from.

When I asked Apple Intelligence to convert Markdown to rich text, it asked me to do it with ChatGPT instead.

But when I asked ChatGPT, it composed raw HTML.

Clearly, Apple has some work to do if they want to match user requests with the native styling and objects supported by the Notes app. But that’s not the only area where I’ve noticed a disparity between Siri and ChatGPT’s capabilities, resulting in a strange mix of interactions when the two are combined.

One of my favorite features of ChatGPT’s website and app is the ability to store bits of data in a personal memory that can be recalled at any time. Memories can be used to provide further context to the LLM in future requests as well as to jot down something that you want to remember later. Alas, ChatGPT accessed via Siri can’t retrieve the user’s personal memories, despite the ability to log into your ChatGPT account and save conversations you have with Siri. When asked to access my memory, ChatGPT via Siri responds as follows:

I’m here to assist you by responding to your questions and requests, but I don’t have the ability to access any memory or personal data. I operate only within the context of our current conversation.

That’s too bad, and it only underscores the fact that Apple is limited to an à la carte assistant that doesn’t really behave like an LLM (because it can’t).

The most ironic part of the Siri-ChatGPT relationship, however, is that Siri is not multilingual, but ChatGPT is, so you can use OpenAI’s assistant to fill a massive hole in Siri’s functionality via some clever prompting.

My Siri is set to English, but if I ask it in Italian, “Chiedi a ChatGPT” (“Ask ChatGPT”), followed by an Italian request, “Siri” will respond in Italian since ChatGPT – in addition to different modalities – also supports hopping between languages in the same conversation. Even if I take an Italian PDF document and tell Siri in English to, “Ask ChatGPT to summarize this in its original language”, that’s going to work.

On its own, Siri is not bilingual…


…but with ChatGPT, it can be.

Speaking as a bilingual person, this is terrific – but at the same time, it underlines how deeply ChatGPT puts Siri to shame when it comes to being more accessible for international users. What’s even funnier is that Siri tries to tell me I’m wrong when I’m typing in Italian in its English text field (and that’s in spite of the new bilingual keyboard in iOS 18), but when the request is sent off to ChatGPT, it doesn’t care.

I want to wrap up this section with an example of what I mean by assistive AI with regard to productivity and why I now believe so strongly in the potential to connect LLMs with apps.

I’ve been trying Todoist again lately, and I discovered the existence of a TodoistGPT extension for ChatGPT that lets you interact with the task manager using ChatGPT’s natural language processing. So I had an idea: what if I took a screenshot of a list in the Reminders app and asked ChatGPT to identify the tasks in it and recreate them with the same properties in Todoist?

I asked:

 This is a screenshot of a work project in the Reminders app. Can you identify the two remaining tasks in it, along with their due dates and, if applicable, repeat patterns?

ChatGPT identified them correctly, parsing the necessary fields for title, due date, and repeat pattern. I then followed up by asking:

Can you add these to my Work Review project?

And, sure enough, the tasks found in the image were recreated as new tasks in my Todoist account.
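
Under the hood, an extension like TodoistGPT is essentially translating natural language into calls against Todoist’s public REST API. As a rough sketch (my reconstruction, not TodoistGPT’s actual code), recreating one of those parsed tasks in Swift looks something like this; the token and task text are placeholders:

```swift
import Foundation

// A sketch of creating a Todoist task via the REST API — roughly what a
// conversational integration has to do once it has parsed a task.
// Minimal error handling; the API token is a placeholder.
struct NewTask: Encodable {
    let content: String
    let dueString: String?

    enum CodingKeys: String, CodingKey {
        case content
        case dueString = "due_string" // accepts natural language like "every Friday"
    }
}

func createTodoistTask(_ task: NewTask, apiToken: String) async throws {
    var request = URLRequest(url: URL(string: "https://api.todoist.com/rest/v2/tasks")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiToken)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(task)

    let (_, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, (200...299).contains(http.statusCode) else {
        throw URLError(.badServerResponse)
    }
}

// Hypothetical usage with one of the parsed reminders:
// try await createTodoistTask(
//     NewTask(content: "Prepare weekly review", dueString: "every Friday"),
//     apiToken: "YOUR_TODOIST_TOKEN"
// )
```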

In ChatGPT, I was able to use its vision capabilities to extract tasks from a screenshot, then invoke a custom GPT to recreate them with the same properties in Todoist.

The tasks in Todoist.

Right now, Siri can’t do this. Even though the ChatGPT integration can recognize the same tasks, asking Siri a follow-up question to add those tasks to Reminders in a different list will fail.

Meanwhile, ChatGPT can perform the same image analysis via Siri, but the resulting text is not actionable at all.

Think about this idea for a second: in theory, the web-based integration I just described is similar to the scenario Apple is proposing with App Intents and third-party apps in Apple Intelligence. Apple has the unique opportunity to leverage the millions of apps on the App Store – and the thousands that will adopt App Intents in the short term – to quickly spin up an ecosystem of third-party integrations for Apple Intelligence via the apps people already use on their phones.
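
To make that parallel concrete, here’s a minimal sketch of the kind of action a task manager could expose through App Intents – the intent, its parameters, and the tiny store are hypothetical, not any shipping app’s schema:

```swift
import AppIntents

// A hypothetical App Intent a task manager could expose to Siri and
// Apple Intelligence. Everything here is illustrative.
struct AddTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Task"
    static var description = IntentDescription("Adds a task to a list.")

    @Parameter(title: "Title")
    var taskTitle: String

    @Parameter(title: "List")
    var listName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would persist the task to its database here.
        TaskStore.shared.add(taskTitle, to: listName)
        return .result(dialog: "Added “\(taskTitle)” to \(listName).")
    }
}

// Minimal in-memory store so the sketch is self-contained.
final class TaskStore {
    static let shared = TaskStore()
    private(set) var lists: [String: [String]] = [:]

    func add(_ task: String, to list: String) {
        lists[list, default: []].append(task)
    }
}
```

Once an app declares intents like this, the system – not the developer – handles surfacing them to Siri, Shortcuts, and Spotlight.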

How will that work without a proper Siri LLM? How flexible will the app domains supported at launch be in practice? It’s hard to tell now, but it’s also the field of Apple Intelligence that – unlike gross and grotesque image generation features – has my attention.

Visual Intelligence

The other area of iOS that now features ChatGPT integration is Visual Intelligence. Originally announced in September, Visual Intelligence is a new Camera Control mode and, as such, exclusive to the new iPhone 16 family of devices.

The new Visual Intelligence camera mode of iOS 18.2.

With Visual Intelligence, you can point your iPhone’s camera at something and get information about what’s in frame from either ChatGPT or Google search – the first case of two search providers embedded within the same Apple Intelligence functionality of iOS. Visual Intelligence is not a real-time camera view that can overlay information on top of a live camera feed; instead, it freezes the frame and sends a picture to ChatGPT or Google, without saving that image to your photo library.

The interactions of Visual Intelligence are fascinating, and an area where I think Apple did a good job picking a series of reasonable defaults. You activate Visual Intelligence by long-pressing on Camera Control, which reveals a new animation that combines the glow effect of the new Siri with the faux depressed button state first seen with the Action and volume buttons in iOS 18. It looks really nice. After you hold down for a second, you’ll feel some haptic feedback, and the camera view of Visual Intelligence will open in the foreground.

The Visual Intelligence animation.

Once you’re in camera mode, you have two options: either you manually press the shutter button to freeze the frame and then choose between ChatGPT and Google, or you press one of those search providers first, and the frame will be frozen automatically.

Google search results in Visual Intelligence.

Google is the easier integration to explain here. It’s basically reverse image search built into the iPhone’s camera and globally available via Camera Control. I can’t tell you how many times my girlfriend and I rely on Google Lens to look up outfits we see on TV, furniture we see in magazines, or bottles of wine, so having this built into iOS without having to use Google’s iPhone app is extra nice. Results appear in a popup inside Visual Intelligence, and you can pick one to open it in Safari. As far as integrating Google’s reverse image search with the operating system goes, Apple has pretty much nailed the interaction here.

ChatGPT has been equally well integrated with the Visual Intelligence experience. By default, when you press the ‘Ask’ button, ChatGPT will instantly analyze the picture and describe what you’re looking at, so you have a starting point for the conversation. The whole point of this feature, in fact, is to be able to inquire about additional details or use the picture as visual context for a request you have.

My NPC co-hosts still don’t know anything about this new handheld, and ChatGPT’s response is correct.

You can also ask follow-up questions to ChatGPT in Visual Intelligence.

I’ll give you an example. A few days ago, Silvia and I noticed that the heated towel rail in our bathroom was making a low hissing noise. There were clearly valves we were supposed to operate to let air out of the system, but I wanted to be sure because I’m not a plumber. So I invoked Visual Intelligence, took a picture, and asked ChatGPT – in Italian – how I was supposed to let the air out. Within seconds, I got the confirmation I was looking for: I needed to turn the valve in the upper left corner.

This was useful.

I can think of plenty of other scenarios in everyday life where the ability to ask questions about what I’m looking at may be useful. Whether you’re looking up instructions to operate different types of equipment, dealing with recipes, learning more about landmarks, or translating signs and menus in a different country, there are clear, tangible benefits when it comes to augmenting vision with the conversational knowledge of an LLM.

By default, ChatGPT doesn’t have access to web search in Visual Intelligence. If you want to continue a request by looking up web results, you’ll have to use the ChatGPT app.

Right now, all Apple Intelligence queries to ChatGPT are routed to the GPT-4o model; I can imagine that, with the o1 model now supporting image uploads, Apple may soon offer the option to enable slower but more accurate visual responses powered by advanced reasoning. In my tests, GPT-4o has been good enough to address the things I was showing it via Visual Intelligence. It’s a feature I plan to use often – certainly more than the other (confusing) options of Camera Control.

The Future of a Siri LLM

Sure, Siri.

Looking ahead to the next year, it seems clear that Apple will continue taking a staged approach to evolving Apple Intelligence in their bid to catch up with OpenAI, Anthropic, Google, and Meta.

Within the iOS 18 cycle, we’ll see Siri expand its on-screen vision capabilities and gain the ability to draw on users’ personal context; then, Apple Intelligence will be integrated with commands from third-party apps based on schemas and App Intents. According to rumors, this will culminate in the announcement of a second-generation Siri LLM at WWDC 2025 – a more ChatGPT-like assistant capable of holding longer conversations and perhaps storing them for future access in a standalone app – followed by a release in the spring of 2026.

Taking all this into account, it’s evident that, as things stand today, Apple is two years behind their competitors in the AI chatbot space. Training large language models is a time-consuming, expensive task that is ballooning in cost and, according to some, leading to diminishing returns as a byproduct of scaling laws.

Today, Apple is stuck between the proverbial rock and hard place. ChatGPT is the fastest-growing software product in modern history, Meta’s bet on open-source AI is resulting in an explosion of models that can be trained and integrated into hardware accessories, agents, and apps with a low barrier to entry, and Google – facing an existential threat to search at the hands of LLM-powered web search – is going all-in on AI features for Android and Pixel phones. Like it or not, the vast majority of consumers now expect AI features on their devices; whether Apple was caught flat-footed here or not, the company today simply doesn’t have the technology to offer an experience comparable to ChatGPT, Llama-based models, Claude, or Gemini, that’s entirely powered by Siri.

So, for now, Apple is following the classic “if you can’t beat them, join them” playbook. ChatGPT and other chatbots will supplement Siri with additional knowledge; meanwhile, Apple will continue to release specialized models optimized for specific iOS features, such as Image Wand in Notes, Clean Up in Photos, summarization in Writing Tools, inbox categorization in Mail, and so forth.

All this raises a couple of questions. Will Apple’s piecemeal AI strategy be effective in slowing down the narrative that they are behind other companies, showing their customers that iPhones are, in fact, powered by AI? And if Apple won’t have a Siri LLM until 2026, where will ChatGPT and the rest of the industry be by then?

Given the pace of AI tools’ evolution in 2024 alone, it’s easy to look at Apple’s position and think that, no matter their efforts and the amount of capital thrown at the problem, they’re doomed. And this is where – despite my belief that Apple is indeed at least two years behind – I disagree.

You see, there’s another question that demands to be asked: will OpenAI, Anthropic, or Meta have a mobile operating system or lineup of computers with different form factors in two years? I don’t think they will, and that buys Apple some time to catch up.

In the business and enterprise space, it’s likely that OpenAI, Microsoft, and Google will become more and more entrenched between now and 2026 as corporations begin gravitating toward agentic AI and rethink their software tooling around AI. But modern Apple has never been an enterprise-focused company. Apple is focused on personal technology and selling computers of different sizes and forms to, well, people. And I’m willing to bet that, two years from now, people will still want to go to a store and buy themselves a nice laptop or phone.

Despite their slow progress, this is Apple’s moat. The company’s real opportunity in the AI space shouldn’t be to merely match the features and performance of chatbots; their unique advantage is the ability to rethink the operating systems of the computers we use around AI.

Don’t be fooled by the gaudy, archaic, and tone-deaf distractions of Image Playground and Image Wand. Apple’s true opening is in the potential of breaking free from the chatbot UI, building an assistive AI that works alongside us and the apps we use every day to make us more productive, more connected, and, as always, more creative.

That’s the artificial intelligence I hope Apple is building. And that’s the future I’d like to cover on MacStories.


  1. Apple does have some foundation models in iOS 18, but in the company’s own words, “The foundation models built into Apple Intelligence have been fine-tuned for user experiences such as writing and refining text, prioritizing and summarizing notifications, creating playful images for conversations with family and friends, and taking in-app actions to simplify interactions across apps.” ↩︎

MacStories Selects 2024: Recognizing the Best Apps of the Year https://www.macstories.net/stories/macstories-selects-2024-recognizing-the-best-apps-of-the-year/ Mon, 09 Dec 2024 15:43:28 +0000 https://www.macstories.net/?p=77422

John: 2024 was a big year for apps, but it was also different from most. More often than not, app innovation is driven by new Apple APIs; that wasn’t the case this year. Instead, it was other trends that shaped the apps we love.

Artificial intelligence played a big role, with some apps adopting it in clever ways to reduce user friction while other developers reacted to it by adopting a more human-centric, creative approach. The rapidly evolving social media landscape played a part, too, with new ways to communicate and manage our timelines emerging.

However, the biggest driver of change in the world of apps this year was government regulation led by the European Commission. The full effects of the Digital Markets Act and the U.S. Department of Justice’s antitrust action against Apple have yet to play out, but nothing since the introduction of the App Store has shaken up the status quo like governments in the EU, U.S., and elsewhere did in 2024.

Not all regulatory effects were what developers wanted – or even positive – but as 2024 winds down, it’s undeniable that apps that weren’t possible before regulation are now available worldwide. Plus, developers in certain parts of the world have more options than before, which we at MacStories are happy to see as fans of apps and their makers. Let’s hope the opening up of the App Store continues and spreads geographically in 2025 and beyond.

Change lingers in the air, which makes me excited for the apps that 2025 will bring, but before we shut off the lights on 2024, it’s time to pause as we do each year to reflect on the many apps we tried in the year gone by and recognize the best among them.

Like last year, the MacStories team picked the best apps in seven categories:

  • Best New App
  • Best App Update
  • Best New Feature
  • Best Watch App
  • Best Mac App
  • Best Design
  • App of the Year

Club MacStories members were part of the selection process, too, picking the winner of the MacStories Selects Readers’ Choice Award. And as we’ve done the past few years, we named a Lifetime Achievement Award winner that has stood the test of time and had an outsized impact on the world of apps. This year’s winner, which joins past winners Pixelmator, PCalc, and Drafts, is the subject of a special story Niléane wrote for the occasion.

As usual, Federico and I also recorded a special episode of AppStories covering all the winners and runners-up. It’s a terrific way to learn more about this year’s apps. Plus, it’s on our YouTube channel this year, giving you a chance to actually see the awards as we cover them.

You can also listen to the episode below.


And with that, it’s my pleasure to unveil the 2024 MacStories Selects Awards.


Best New App

Croissant

John: The splintering of social media began long before 2024, but it accelerated this year. For better or worse, the iPhone is inextricably linked to social media – especially Twitter, which grew up in tandem with the device. Twitter’s decline has been well-documented as the service has shed waves of users, advertisers, and employees. Today, its successor, X, is unrecognizable compared to Twitter’s heyday.

What’s been different in 2024 is the rise of Twitter alternatives. The process began in earnest last year, but as Threads opened up to more of the world and added features, Bluesky gained traction among a wider audience, and Mastodon continued to dominate the tech sphere, social media became hard. Sure, you could pick one service and run with it, and many people did. However, I know that a lot of the MacStories audience didn’t just pick one; instead, we nerds split our time and attention between two and, often, three services.

That’s where Croissant comes into the picture. It’s the kind of focused utility that we love at MacStories. It doesn’t try to consolidate your social feeds, connect you with friends, or do anything else to step in between you and your social media. Instead, Croissant focuses on one thing: cross-posting to Mastodon, Bluesky, and Threads, the three social media services that we and many of our readers use most.

Croissant was launched by Ben McCarthy and Aaron Vegh in October as an iPhone-only app. In the two months since its debut, though, Croissant has added iPad and Mac versions, along with other refinements.

The app’s design is excellent. The primary focus is on the compose view, which lets you craft the perfect hot take before launching it into the world. Then when you do, the app displays progress bars to confirm you’ve successfully posted to each service. And as Brendon Bigley recently explained on AppStories, Croissant has the added benefit of enabling you to post without being tempted to catch up on your timeline.

Just as clever as the app itself is its business model. There are a lot of services out there for managing social media accounts, but they’re designed for business users and priced accordingly. Croissant doesn’t do as much as a service like Buffer can for a social media manager. However, it does plenty for most individuals and small businesses at a price they can afford, which I appreciate because I’ve only added more accounts since reviewing the app when it launched.

Croissant was an easy pick for Best New App. Everyone on the MacStories team spends a lot of time on social media, but there’s more to it than that. Croissant represents the kind of app that’s a sweet spot for MacStories: it’s the perfect mix of utility, automation, and design that you can tell is made with a lot of care.

Best New App Runner-Up

Simple Scan

Niléane: Simple Scan by Greg Pierce was introduced earlier this year and has been a great showcase of how a straightforward utility can take shape on iOS and iPadOS. The app’s premise is simple: it allows you to scan documents and immediately save them to a desired location without having to fiddle with Apple’s native Files app. Unlike most scanning utilities in the App Store that create a full library of your scanned documents and store them in the cloud, Simple Scan prides itself on staying native, first and foremost.

To start scanning a document, simply launch the app, select a destination (Email, Messages, Photos, Files, or the share sheet), and scan using Apple’s built-in scanning feature, which allows you to drag points to the corners of your document for precise cropping. Simple Scan only includes two additional features: enabling or disabling OCR for PDF documents and creating presets for custom destinations. As a result, the app is fast and beautifully uncluttered, and it never tries to steer you away from the destination you specifically picked.

Simple Scan has become a utility that I use multiple times a week for both work and personal documents. For this reason alone, I value it as one of my best app purchases of the year.

Best App Update

Sofa 4

Federico: We live in a golden age of media. No matter where you look, there’s an abundance of TV shows, movies, music, books, videogames, and podcasts to check out that – let’s be fair – most people will likely never get to in their lifetimes. Not even the people who review different types of media for a living can possibly consume all the options at their disposal these days. We have to make conscious decisions about the entertainment we choose to stream, buy, or rent. If you’re anything like me and have often felt paralyzed by, say, the movies you want to watch or the games in your backlog, you could use a powerful tool like Sofa to keep track of it all.

Developed by Shawn Hickman, Sofa isn’t new to MacStories readers; it’s a downtime tracking app that we’ve been covering ever since it first released in 2018. What made the launch of Sofa 4 special for us earlier this year – and the reason we’re giving this update a MacStories Selects Award – is its balance of elegant design, native Apple platform features, and incredibly powerful customization unlocked by Hickman’s latest additions to the app. In a year where customization is more important than ever on our iOS and iPadOS devices, Sofa 4 feels like a perfect fit.

As John explained in his review, Sofa 4 still lets you organize media you’re tracking with lists like before, but now you can also define custom “ingredients” (think of them as filtering criteria) to assemble smart lists of specific media items that match those filters. Ingredients can be all kinds of properties: date ranges, statuses, URLs, checkboxes, and more. You’re free to create as many ingredients as you want, and you can even choose to apply some of them only to specific item categories.

Ingredients and smart lists have revolutionized my usage of Sofa, which consequently has become the only app I use to keep track of my videogame collection and the games I want to play for my podcasts Unwind and NPC. I’ve gone all-in on Sofa with smart lists that keep track of games in my backlog, games that I’ve started but paused, games that I’m playing, and games that I’m considering as Game of the Year candidates. I’ve even created custom lists to track older games released more than 10 years ago that I want to play on emulation handhelds today.

What’s great about this system is that you don’t have to use custom ingredients to enjoy Sofa, but if you want to, the power is there for you to customize it and build your own unique media tracking app. No other utility on the App Store gets close to the level of personalization and granular control offered by Sofa 4, which does it all with an elegant, native design and also happens to support modern functionalities like widgets, Shortcuts actions, and more.

John and I have professed our love for apps that support smart lists and filters several times over the years on AppStories, and I now consider Sofa 4 the gold standard for how these features should be implemented in an iOS/iPadOS app without confusing users or overcomplicating the UI. And the best part is that this is just how I have been using Sofa. Other MacStories readers may have completely different use cases for the app, and that’s exactly the point of user customization: the same app can adapt to a variety of needs while always staying true to itself.

Sofa 4 is a remarkable update, envisioned by a developer who’s constantly refining their creation with passion, skill, and thoughtfulness. For all these reasons, Sofa 4 is the Best App Update of 2024.

Best App Update Runners-Up

Remind Me Faster 5.2

Federico: It can be challenging for “simple” utilities to keep evolving over time with new features while staying true to their essence, but that’s exactly what’s been happening with Remind Me Faster, the excellent Reminders companion app from indie developer Nick Leith. Remind Me Faster has been around for years and does one thing extremely well: it lets you create reminders faster than you normally can with Apple’s own Reminders app.

At a glance, Remind Me Faster is an empty scratchpad that lets you jot down a task and press a send button to save it. The app’s true power lies beneath the surface, and that’s where Nick Leith has been innovating again this year with version 5.2. Remind Me Faster now features an intuitive gesture that involves pulling down above the keyboard to quickly change your destination list in Reminders. Combined with the app’s existing support for natural language parsing, this change alone makes Remind Me Faster a very compelling solution to go from an empty text field to a task with a due date in a specific Reminders list within seconds. But there’s more. Remind Me Faster 5.2 also adds support for iOS 18’s controls, which means you can now launch the app even more quickly via Control Center, the Lock Screen, or the Action button.
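
For context, iOS 18 controls are built with WidgetKit. Here’s a minimal, hypothetical sketch – not Remind Me Faster’s actual code – of a control that launches an app straight into task entry:

```swift
import WidgetKit
import SwiftUI
import AppIntents

// A hypothetical Control Center control that opens an app for quick
// capture. The kind string and intent are placeholders.
struct QuickCaptureControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.quickcapture") {
            ControlWidgetButton(action: OpenCaptureIntent()) {
                Label("New Reminder", systemImage: "plus.circle")
            }
        }
    }
}

struct OpenCaptureIntent: AppIntent {
    static var title: LocalizedStringResource = "New Reminder"
    // Foregrounds the app when the control is tapped.
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        .result()
    }
}
```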

Remind Me Faster’s staying power – I’ve had this app on my Home Screen for years at this point – is a testament to its developer’s judicious iteration and thoughtful embrace of modern iOS technologies that never compromise what makes the app great. If you’re a heavy Reminders user, you owe it to yourself to give Remind Me Faster a try.

Federico: We also want to highlight version 2.0 of GoodLinks as a runner-up in this category because it’s another example of an app that’s been actively developed for a long time, and it’s still growing. Earlier this year, indie developer Ngoc Luu added the ability to highlight passages of text in GoodLinks and (optionally) attach notes to them.

Now, the ability to highlight text while you’re reading in a read-later app isn’t new, but GoodLinks is the first app we’ve seen that combines this functionality with deep Shortcuts integration to process and automate your highlights. Like other parts of GoodLinks, there are native Shortcuts actions to find and open your highlights, get them as rich text or Markdown, and edit them. Not only is this a good idea from a data portability perspective (you can export and take your highlights with you as, say, a CSV or Markdown file), but it also creates countless possibilities in terms of connecting GoodLinks to note-taking apps, journaling apps, and more. Want to turn a highlight that struck you as interesting into a new entry in the Journal app? Thanks to Shortcuts, you can.
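
For the curious, exposing data like highlights to Shortcuts goes through App Intents entities. Here’s a hypothetical sketch – not GoodLinks’ actual implementation – of what a highlight entity and its query might look like:

```swift
import AppIntents

// A hypothetical "highlight" entity a read-later app could expose to
// Shortcuts. The shape and sample data are illustrative.
struct HighlightEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Highlight")
    static var defaultQuery = HighlightQuery()

    var id: UUID
    var text: String
    var note: String?

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(text)")
    }
}

struct HighlightQuery: EntityQuery {
    // Stand-in data so the sketch is self-contained; a real app would
    // read from its database.
    static let sample = [
        HighlightEntity(id: UUID(), text: "A passage worth keeping", note: nil)
    ]

    func entities(for identifiers: [UUID]) async throws -> [HighlightEntity] {
        Self.sample.filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [HighlightEntity] {
        Self.sample
    }
}
```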

Today’s read-later landscape is largely dominated by services designed for “power readers” that aim to integrate AI summarizations and similar tools into their reading workflows. There is a place for those tools, but at the same time, it’s refreshing to see meaningful, continued iteration on a product like GoodLinks, a read-later app designed from the ground up for Apple’s platforms that fully embraces Shortcuts. And who knows? Maybe all this work with Shortcuts actions will also pay off next year with App Intents in Apple Intelligence.

Best New Feature

Play’s Channels Inbox

Federico: Rarely has a single app feature fundamentally changed my entertainment setup as much as Play’s Channels Inbox did this year. Play debuted in February 2022 as a “watch later” app to save YouTube videos for later with an experience native to Apple platforms. The app was off to an excellent start – not a surprise given that it was created by Marcos Tanaka, the author of MusicHarbor and MusicBox (two previous MacStories Selects Award recipients). I started using Play immediately, but a lot of the time, I was still opening the YouTube app to browse its front page and find new videos from my favorite channels.

At the end of last year, Tanaka added the Channels Inbox, a new feature that allows you to see all the videos from your favorite channels in one place and quickly add them to your queue in Play. So while, technically speaking, the Channels Inbox was released in 2023 (although after the cutoff date for when we pick MacStories Selects winners), it was in 2024 that the functionality came into its own as a full-featured alternative to the YouTube app. As a result of the updates released this year, I’ve stopped browsing in the YouTube app altogether.

For starters, you can now import the channels you’re already subscribed to on YouTube using your Google Takeout file. This considerably speeds up the transition from YouTube to Play and ensures you won’t have to re-add all your channels manually. Additionally, the Channels Inbox has been improved with a variety of big and small additions: you can now select and act on multiple videos at once, organize channels in folders, follow YouTube playlists, customize the sorting order of the inbox, share videos directly from the inbox, tag videos when saving them for later, and choose to display video descriptions in the inbox. Effectively, Marcos Tanaka built a native, high-performance alternative YouTube interface to browse new videos from particular channels without any algorithm or janky YouTube UI. It’s excellent.

The constant iteration on the Channels Inbox is a shining example of the innovation third-party developers for Apple platforms bring to the table. There is no equivalent to Play on other platforms – certainly not with this degree of customization, performance, and integration with Shortcuts, controls, and other automation features. Play’s Channels Inbox stands alone as a multi-platform (it’s even available on visionOS and tvOS), well-designed alternative to YouTube’s algorithmic homepage that lets you check out and save new videos as if they were articles in an RSS reader.

All of this was created by a single developer who’s using YouTube’s native APIs and integrating them into a cohesive experience that scales from Apple’s smallest phone display to the biggest virtual monitor you can use with a Vision Pro. Thanks to the Channels Inbox, I’ve watched more YouTube this year alone than in previous years combined, and as a result, I’ve enjoyed the work of my favorite video creators more. I can’t think of any other feature available on as many Apple devices that would be more deserving of this year’s Best New Feature Award than Play’s outstanding Channels Inbox.

Best New Feature Runners-Up

Sequel’s Magic Lookup

John: Sequel is another favorite of the MacStories team and our readers. The app came into its own with version 2.0 last year, winning the MacStories Selects Best App Update Award for 2023. Since then, developer Roman Lefebvre hasn’t slowed down, releasing several substantial updates.

One of my favorites was Sequel 2.3 and its addition of Magic Lookup, one of two runners-up in this year’s Best New Feature category. Magic Lookup is a great example of a feature that uses artificial intelligence to reduce user friction; it works by pulling references to media from articles on the web so you can add them to your queue for later.

Magic Lookup doesn’t generate content or step between websites and their readers. In fact, its effect is the opposite. Instead of leaving a movie review or other article to look up whatever you’re reading about and save it to your queue, Sequel takes care of that from the share sheet, analyzing what you’re reading about and saving it so you can get back to reading. If you’re wondering what sort of AI coverage you’ll see on MacStories in 2025, it’s this: tools that reduce friction and tedium and aren’t built at the expense of writers, artists, musicians, and other creators. I give Roman a lot of credit for coming up with such a useful, thoughtful implementation of AI amidst all the hype and half-baked implementations we saw this year.

Halide’s Process Zero

John: Halide’s Process Zero is a good bookend to Sequel. Process Zero records images straight from the iPhone’s camera sensor without any processing, leaving it to photographers to edit photos to their liking. Computational photography has become ubiquitous on smartphones, and while the iPhone’s processing is fine in many contexts, that isn’t always the case.

I love Process Zero because of the choice the team at Lux has given back to photographers. Instead of your iPhone making decisions for you, the feature puts you in full control. What I said in August remains just as true today:

I also appreciate that the Halide team is taking a human-focused approach to photography at a time when so many developers and AI companies seem all too willing to cast aside photographers in favor of algorithms and generative AI. Process Zero’s approach to photography isn’t for everyone, and I expect most of the time, it won’t be for me either. However, I’m glad it’s an option because, in the hands of a skilled photographer, it’s a great tool.

Having options is good. Process Zero opens up new avenues of creativity beyond the defaults chosen by Apple. Even if it’s only used by relatively few photographers in comparison to the full iPhone install base, everyone is better off that Process Zero is available. If you haven’t tried the feature yet, I highly recommend taking it for a spin.

Best Watch App

Chronicling

Jonathan: Chronicling was one of our runners-up for Best New App last year, but developer Rebecca Owen hasn’t rested on her laurels since then. As a highly flexible tool for tracking just about anything, the core iOS app has benefited from several solid upgrades in 2024. One great thing about Rebecca’s approach to the app, though, is her decision to make its watchOS sibling more than the marginally useful sidekick that so many other watchOS apps are.

Earlier in the year, Chronicling added selective category syncing to the Apple Watch so that if you’re tracking a considerable number of things, you don’t need to clog up the Apple Watch’s much smaller interface with every single one. At the same time, options to modify event counts and add notes from the watch were introduced as well. These capabilities go far beyond what many other Apple Watch companion apps allow; one might expect a simple, binary button to log an event, but Chronicling offers much more.

When watchOS 11 arrived, the app immediately introduced support for two new APIs: Interactive Widgets and Double Tap. Now, you can log an event directly from the Smart Stack or use Double Tap within the app to do the same. Both implementations are excellent and make Chronicling even easier to use right from your wrist.

So often, Apple Watch apps feel like afterthoughts when compared to their iOS counterparts. But putting effort into an app’s watchOS version can enhance its use across every platform and benefit its users. Chronicling is a shining example of that, which is why we’re delighted to name it the Best Watch App of 2024.

Best Watch App Runners-Up

Peak

Jonathan: Peak made enormous strides in 2024, and nowhere more so than on the Apple Watch, where it wasn’t previously available. Earlier in the year, the iOS version added support for sleep and cardiac health data, laying the groundwork for Peak 3.0, which introduced the app’s first watchOS version. It allows you to easily check many goals and metrics right from your wrist, far more than those offered by Apple’s Activity app.

Peak on Apple Watch also offers widgets, which became interactive with the introduction of watchOS 11. In addition to around 30 – that’s right, 30 – widget variations, Peak has a very clever option to refresh your widgets whenever you tap on them. Data on the Apple Watch isn’t always perfectly in sync with your iPhone, so this is a smart way to brute-force syncing and ensure you are always looking at the correct data.

Combine all of this with a bold, clean design that can show you trends and historical data from up to a month in the past (another thing the Activity app doesn’t allow), and it’s clear developer Harshil Shah has created an excellent Apple Watch app.

Zenitizer

John: It’s taken years, but we’re finally to the point where, instead of struggling to find any great watchOS app for the MacStories Selects Awards, we have multiple excellent choices. That’s why this year we have two runners-up, including Zenitizer, a meditation app from Manuel Kehl that’s been on our radar since I first reviewed it in 2023. As I said about version 1.2 of the app, Zenitizer:

manages to include the app’s full feature set in a Watch app that doesn’t feel cramped or confusing to navigate. Thanks to the iCloud sync introduced with version 1.2 this week, every routine that’s built-in or that you’ve created yourself is available on the Watch, but you can build routines from scratch on the Apple Watch, too, which I love.

Even with today’s more powerful Apple Watches, that’s no simple feat. The watch’s tiny screen makes it a substantial design challenge, but by using four views and leveraging iCloud sync, Zenitizer not only makes the app just as capable on the watch as it is on the iPhone, it also retains the app’s soothing, organic aesthetic, setting it apart from its peers.

Best Mac App

Moom 4

Niléane: In 2024, well-known window management utility Moom received a major 4.0 upgrade more than 12 years after the initial release of Moom 3. This new version is a significant step forward for the app. Moom still lets you tile windows simply by dragging them to either side or any of the four corners of your screen. It also still lets you save specific window layouts so you can easily restore them using a custom keyboard shortcut. But Moom 4 goes further than that.

The app introduced the option to fully customize the command palette that appears when hovering over the green button in a window’s title bar. That feature combined with the new ability to create custom commands and chain them together makes Moom 4 more powerful than ever. While I used to turn off the palette in Moom 3 and solely rely on keyboard shortcuts and predefined layouts, the ability to customize the palette has turned me into a big fan of the feature.

One of Moom’s other main features is its window resizing grid, which carries over from previous versions. When enabled, you can use it to quickly draw a rectangle onscreen to immediately move and resize a specific window. The grid’s cell sizes can be customized in the app’s settings, and there’s also an option to add gaps between and around tiled windows. The smaller the cells, the more flexibility you have when tiling windows on the fly.

Alongside the new customizable palette and the ability to save and restore layouts, this flexibility makes Moom indispensable for many users who need to juggle windows on their Macs every day.

Moom 4 lets you create, chain, save, and restore custom window layouts.

That being said, when I reviewed the app this year, one specific feature stood out to me above all the others: Hover. It lets you move and resize windows on the fly using your cursor while holding down specific modifier keys, without the need to click and hold on a window’s title bar or reach for one of its four corners. I’ve used the Globe key for this purpose ever since, and I simply could not imagine going back. To move a window onscreen, all I have to do now is place my cursor anywhere over a window, press the Globe key, and drag the cursor to reposition the window. Hover truly feels like it should be a native feature in macOS.

This is a recurring impression I get from Moom 4. As you come to rely on them, every single one of its advanced features begins to feel like it should just be a part of the OS. Impressively, this remains true even this year, after Apple added a native window tiling feature to macOS Sequoia. Apple’s implementation works well for most people, but it won’t suffice for anyone wanting to go a little further with their window management workflows.

Not only is Moom 4 easier to set up than you may think, but it is also powerful and native-feeling in a way that makes it an essential utility in any Mac power user’s toolbox. Without any doubt, it deserves to be named the Best Mac App of the year.

Best Mac App Runner-Up

Bezel

Jonathan: Bezel is one of those simple, easy-to-use apps that feels like it should have been made years ago. The one-line summary of Bezel is that it mirrors your iPhone screen to your Mac. But it actually does much more than that, elevating the app from a basic tool to a fully functioning utility with genuine, everyday use cases.

In addition to mirroring, Bezel offers many different ways to capture the screen of the mirrored device. You can place the screenshot in the frame of your iPhone and add padding around the frame with any pattern or color you want, or make the padding transparent. All of this together yields unique results, enabling many real-world applications. You can layer your phone’s screen onto other videos, show what you are doing on your screen during a big presentation, take screenshots with a frame for a how-to guide, and much more. Bezel also includes excellent keyboard shortcuts for almost every action within the app, as well as the ability to resize its window to a custom size or even to the device’s actual pixel size.

Although Bezel was a solid app when it was first released, the enhancements added by developer Mathijs Kadijk throughout the year have been increasingly impressive. I can’t wait to see what else is to come for the app in 2025.

Best Design

Shareshot

John: I remember when I first heard about Shareshot from Marc Palmer, who was a little self-conscious about the app because of its overlap with Federico’s Apple Frames shortcut. The similarities are obvious, but what stood out to me even in the early betas was that Shareshot is the logical extension of Apple Frames, doing things in a native app that aren’t possible with a shortcut.

The advantages of using a native app for tastefully compositing screenshots go way beyond its feature set, too. One of the difficulties with Shortcuts that Federico and I have discussed on AppStories in the past is its inability to generate anything but the most basic interface elements, which makes it hard to offer users choices. That flexibility sets Shareshot apart, not only from Apple Frames but also from the screenshot apps that came before it.

There have been other apps that allow users to combine screenshots with hardware frames, but I’ve never stuck with them because they were all clunky in some way. In contrast, Shareshot makes it dead simple to create a good-looking framed screenshot every time. That starts with the fact that it will detect a screenshot on the clipboard or suggest the most recent screenshot in your photo library. A segmented control at the top of the app makes it easy to switch between various background options, while the Edit button reveals an extensive but easy-to-understand set of controls for changing the background of your image, drop shadows, padding, and other details. The fact that the entire editing process happens in a single view is a testament to the amount of thought and care that has gone into Shareshot’s design.

The holidays are when I tear down many of my systems and rethink how I get my work done. I’m making a wide variety of changes big and small this year, but one thing that isn’t changing is my use of Shareshot. There’s still a place for the Apple Frames shortcut in my screenshot toolbox, but I appreciate how easy Shareshot has made it for me to add more screenshot variety to my reviews this year, so I’ll happily continue to subscribe to it.

Best Design Runner-Up

Reeder

Jonathan: 2024 has been one of the most significant years for Reeder and its developer, Silvio Rizzi, since its introduction 15 years ago. The app has been a mainstay of many users’ RSS setups for years, but this year, Silvio decided to take a bold step and completely remake it. This was not just a fresh coat of paint for Reeder; it was an entirely fresh rethink. As a result, the original Reeder app is still around, but it has been renamed Reeder Classic.

The new Reeder certainly divided opinions when it was released: it wasn’t for everyone, but plenty of people loved it. However, the one thing almost everyone can agree on is the app’s excellent design. Silvio has the receipts to prove his design skills in the original Reeder as well as the recipe app Mela, so maybe it shouldn’t be a surprise.

The app is an all-in-one inbox that combines your RSS feeds, podcasts, social media accounts, and more into a single timeline. That’s a lot to take in, but Reeder makes navigating it much easier with a clear, attractive layout, adjustable text support, and lovely, simple animations.

As Steve Jobs said in 2003, “It’s not just what it looks like and feels like. Design is how it works.” Reeder’s design enables the app to work well, in addition to being a joy to look at.

Readers’ Choice

Play

Jonathan: Play 2.0, by prolific developer Marcos Tanaka, was released two days before the end of November last year – too late to be considered for MacStories Selects in 2023. But this year’s Readers’ Choice Award, chosen exclusively by Club MacStories members, reflects not only that significant upgrade but the continuous, meaningful updates the app has received over the course of 2024.

Following Play’s 2.0 release and the brilliant Channels Inbox feature, Marcos really dug in to add many more features to the app, making it the primary way many people digest their YouTube content. The ability to follow playlists opened up new ways to add videos to the app. I use this feature to follow certain playlists from a channel rather than the channel as a whole. Following your favorite channels in Play became a lot easier with the addition of support for Google Takeout, Google’s method for exporting your account data. Now, you can import all your channels at once or select the ones you want to follow from a list rather than adding them manually via their URLs.

Other new features added this year include timestamp detection, support for Spotlight Search, translations of video descriptions, and much more.

Play has come a long way from its origin as a simple utility for storing YouTube links, and the almost biweekly updates have no doubt resonated with members. Marcos is a developer who listens to his users and quickly implements new ideas and feature requests. We’re delighted that members have awarded Play the Readers’ Choice Award for 2024 and can’t wait to see what new and exciting things come to the app in 2025.

Readers’ Choice Runner-Up

Sequel

Jonathan: The fact that Sequel came in second to Play by a single-digit number of votes says a lot about the app. Like Marcos Tanaka with Play, developer Roman Lefebvre continues to listen to users and introduce meaningful updates to Sequel with pleasing regularity.

This year, Roman redesigned the app for iPadOS, making better use of screen real estate. He also updated the app’s design on macOS and made it fully compatible with Apple silicon.

But it wasn’t all about cosmetic changes this year. The app also introduced personal notes (Markdown-compatible notes that can be attached to any media item), the ability to rate episodes, and an option to change titles’ completion dates. On the system side, Sequel now offers Shortcuts support and integration with Spotlight Search and Control Center. Roman even painstakingly reviewed each of the app’s alternate “themed” icons to create dark and tinted versions for iOS 18.

Finally, Sequel introduced an ingenious new feature called Magic Lookup. It’s our runner-up for the Best New Feature this year and is undoubtedly one of the many reasons why users and Club MacStories members love this app.

App of the Year

Delta

Federico: Few devices shaped my interest in technology growing up as much as the original Nintendo DS and its touchscreen input did all the way back in 2004. Before I ever laid my hands on an iPod touch (which I got before the iPhone), before the iPad, and well before I’d try my first VR headset 10 years ago, it was an imported Nintendo DS and a Japanese copy of Super Mario 64 DS that showed me how, in the future, technology could be so much more than clicking buttons or moving analog sticks.

I remember the moment vividly: the owner of my trusted family-run videogame store in my hometown of Viterbo called me one afternoon 20 years ago, telling me to go down to the store because he had something to show me. He’d somehow managed to import a few Nintendo DS units from Japan, and he wanted me to play around with the remake of Super Mario 64 for a few hours. As soon as I put the stylus down on the console’s resistive touch screen, my mind was blown. I no longer had to control Mario with a stick that replicated 3D movements; my finger was the input method. I was instantly hooked, and I walked out the door that day with a Japanese Nintendo DS and a copy of Super Mario 64 DS. I didn’t understand any of the on-screen text, but it didn’t matter. Touch input spoke for itself.

Earlier this year, I found that cartridge of Super Mario 64 DS at my mom’s house in Viterbo and took it home with me here in Rome. For the past few months, I’ve been able to continue my playthrough of Super Mario 64 DS (with my original save) because I ripped the physical game, converted it to a digital file, and installed it on my iPhone thanks to Delta, an emulator for old Nintendo consoles.

To say that Delta, developed by Riley Testut, took over the App Store this year would be an understatement. Delta isn’t a new app. In fact, it’s been around for a few years (and it is itself the evolution of an older GBA emulator originally created by Testut), and it was previously available via AltStore, the alternative app marketplace that allowed third-party apps not fit for the App Store to be distributed outside of it. Earlier this year, however, something changed: likely in response to regulatory pressure in the European Union with the arrival of the DMA and AltStore becoming an official, native third-party marketplace, Apple finally relented and changed its worldwide App Store guidelines to allow game emulators of retro consoles to be made available for everyone. And so, not only did Delta find a new life outside the confines of AltStore with global distribution on the App Store, but it became the poster child of a new kind of App Store, one where retrogaming can thrive and the legacy of classic games can live on thanks to a variety of emulators for all kinds of retro consoles.

Delta for iPhone running our special MacStories skins. You can buy them [here](https://store.macstories.net/ds-skins).

Looking back at the past year of app coverage and the conversations surrounding apps for Apple platforms, it’s undeniable that Delta was, for months, the conversation – and for good reason. In a sea of badly designed, sometimes shady emulators filled with ads and other junk or marred by performance issues, Delta stands alone as a high-performance, native emulator for iPhone and iPad that feels like the quintessential blend of an Apple experience and Nintendo-focused retro gaming.

It’s not just that Delta is a capable emulator that brings modern quality-of-life enhancements to older games, although the app certainly is that. It can emulate consoles such as Super Nintendo, multiple generations of Game Boy, and Nintendo DS without breaking a sweat, augmenting those experiences with new features like save states, custom artwork, and cross-device sync. What makes Delta special, though, is the fact that, besides being an excellent emulator, it’s also a great, modern iOS app. Delta fully supports game controllers, allowing you to play older games with any controller supported by the iPhone and iPad, including Joy-Cons. (There’s something poetic about that.) It lets you create custom touch gestures to access different functions such as fast forward and quick save. It supports URL schemes, so you can put together custom launchers to start playing specific games from your Home Screen. It also supports the Files app, context menu previews, custom app icons, and a lot more.
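
For the curious, the URL scheme mechanism behind those custom launchers is simple to illustrate. Below is a minimal Swift sketch, with one big caveat: the `delta://game/` path is a hypothetical placeholder – I haven’t verified Delta’s actual deep link format, so check the app’s documentation for the real syntax. The only real API used here is UIKit’s standard method for opening URL schemes.

```swift
import UIKit

// A minimal sketch of launching a game via a URL scheme.
// NOTE: "delta://game/" is a hypothetical placeholder, not Delta's
// verified deep link format; consult the app's documentation.
func launchGame(withIdentifier identifier: String) {
    guard let url = URL(string: "delta://game/\(identifier)") else { return }
    // open(_:) hands the URL to whichever app registered the scheme.
    UIApplication.shared.open(url) { success in
        if !success {
            print("Couldn't open \(url); is the emulator installed?")
        }
    }
}
```

In practice, most people build these launchers with the Shortcuts app’s “Open URLs” action rather than writing code, but the mechanism is the same.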

Most importantly, however, Delta helped us remember the magic of the original Nintendo DS thanks to its support for touch controls and haptic feedback on iPhone. What ultimately makes the app special, in my opinion, is the fact that you don’t even need a controller to play old games; by using touch controls alone, you can have the ultimate Nintendo retro console always in your pocket, readily available whenever you think you have a couple minutes to play some Mario Kart: Super Circuit or Pokémon HeartGold. Delta supports customizable console skins for all its supported platforms, allowing you to approximate the experience of playing on a “physical” Game Boy with virtualized, customizable on-screen controls.

In fact, we liked the idea of custom skins in Delta so much that we even commissioned our own versions for the Nintendo DS. They’re available for purchase here, and you can read more about them here.

The MacStories DS skins.

That’s another thing that struck me about Delta’s debut on the App Store this year: its popularity spawned an entirely new market of designers (like Sean Fletcher, who designed our skins) for custom themes, as well as creators on Etsy who launched side businesses dedicated to making iPhone controller attachments specifically designed for people who want to use Joy-Cons with Delta.

In the 15 years I’ve been covering indie apps on MacStories, I don’t recall a single example of an app that had the same political, economic, and cultural impact that Delta did in 2024. Delta is a symbol of perseverance in the face of hostility from Apple’s older App Store guidelines, an example of the fact that competition in app marketplaces is the rising tide that lifts all boats, and, ultimately, just a really good app that lets people have fun and rediscover their most precious gaming memories in order to relive them today.

We knew months ago that we could only pick one App of the Year for MacStories Selects in 2024, and it had to be Delta. Twenty years later, the magic of the Nintendo DS lives on.

Learn more about Delta:

App of the Year Runner-Up

Raycast

Niléane: This year, once again, Raycast cemented itself as a great Mac app. In addition to being an excellent app launcher, Raycast is special because of the number of features that it offers in a single package. It is a powerful clipboard manager, emoji picker, AI toolkit, note-taking app, and more, all at the same time.

But it doesn’t stop there. As I’ve highlighted multiple times on MacStories, Raycast is also a platform for third-party extensions and automations. The Raycast Store is packed with free, ultra-specific extensions that allow you to complement your workflows. From translation features to GitHub utilities to web development tools, Raycast truly shines as a Swiss Army knife when you start to take advantage of its thriving ecosystem.

Still, Raycast’s main strength remains that it’s incredibly fast. It lets you launch apps, run shortcuts, and manage your windows with straightforward commands, and it is extremely flexible in the way that it allows you to assign keyboard shortcuts and aliases to any command. Users have leveraged this to create powerful automations, as I have by combining it with BetterTouchTool. None of this would work well if Raycast started to slow down the more you use it. Fortunately, this never seems to happen; the app is fast no matter how many extensions you install and use on a daily basis.

Raycast has kept evolving its built-in features, too. Last month, it introduced a fully revamped note-taking feature, which has already cut into my usage of Apple Notes on the Mac. The company also announced upcoming versions for Windows and iOS, which I’m hopeful will only be positive for the future of this great Mac app.

Learn more about Raycast:


Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

Learn more here and from our Club FAQs.

Join Now
The MacStories Selects 2024 Lifetime Achievement Award https://www.macstories.net/stories/the-macstories-selects-2024-lifetime-achievement-award/ Mon, 09 Dec 2024 15:42:19 +0000 https://www.macstories.net/?p=77419

Transit

Earlier this year, I took the time to step back from the tech news cycle and reflect on one of my favorite iPhone apps of all time, Transit. For the past decade and more — Transit first launched in 2012 — the app has been a powerful way to plan trips and look up waiting times when traveling around your hometown using public transportation. But the team behind Transit has never stopped enhancing and improving the app. Today, Transit remains one of the best transit apps on the iPhone, and it’s not even close; not only that, but the app has also slowly but surely cemented itself as a staple of UI design in this category.

I started relying on Transit in 2014, when I first arrived in France. At the time, I had never experienced a massive public transit network like the one in Paris, and I specifically remember how overwhelming it all felt. Finding Transit in the App Store truly felt like a godsend for 18-year-old me.

When it was first released, the app focused on one key feature: as soon as you tapped its icon on the Home Screen, it would immediately give you real-time waiting times for bus and train stops all around you, wherever you were in the city. Unlike with its competitors, you didn’t need to tap around the UI to find the stop or train line you were looking for; in all likelihood, the information you were looking for was already there, right on the app’s main screen. Instead of trying to play the retention game and keep you in the app as long as it could, Transit was designed to be launched and dismissed again just a few seconds later, as soon as you got a glance at the waiting times on its main screen.

This basic foundation immediately made Transit relevant at any time of the day. It also explains why, over the past ten years, I’ve never once removed the app’s icon from my Home Screen. In fact, it’s hard for me to imagine my Home Screen setup without Transit.

Transit lets you compare itineraries on a timeline and presents you with a detailed breakdown of each itinerary.

Today, in addition to checking waiting times, the app also lets you plan itineraries and compare trips, and it can track your vehicle to alert you when you’re about to reach your destination so you don’t miss your stop — all of this across 741 cities and regions in 23 countries. If this sounds like a lot, just know that at every step along the way, the app is always graced with a thoughtful design that never makes any part of it feel overwhelming. Every single data point has been carefully placed in the interface and is introduced with beautiful and subtle animations.

When I try to think of apps that started strong so many years ago and only evolved to become stronger, there are just a few names that come to mind. Transit is one of those names.

Transit’s Live Activities are a perfect use case for the feature. They let you keep track of your trip and alert you when you’re about to reach your destination.

One of the reasons the app has been able to achieve this is its ability to gracefully adopt Apple’s new system APIs in iOS every year. This was especially true in iOS 16 with Live Activities, which allow you to track your trip and keep an eye on your next stop right from your iPhone’s Lock Screen. And just last month, the Transit team went beyond our expectations, revealing an impressive new way to track your train when it’s underground and you don’t have a GPS signal. The app now utilizes the iPhone’s built-in accelerometer and analyzes its patterns to identify when your vehicle is in motion and every time it reaches a new station. I’ve been able to try this new feature over the past month in the tunnels of the Paris Métro, and I’m happy to report that this wizardry actually works.
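
Transit hasn’t published how its prediction model works, so here’s only a toy sketch of the general technique – emphatically not Transit’s implementation. It uses Apple’s Core Motion framework to classify a vehicle as moving or stopped based on the variance of recent accelerometer readings; the sampling rate, window size, and threshold are made-up tuning values.

```swift
import CoreMotion
import Foundation

// Toy motion-state detector: a moving train produces noticeably higher
// short-term acceleration variance than one stopped at a station.
final class MotionStateDetector {
    private let motionManager = CMMotionManager()
    private var recentMagnitudes: [Double] = []
    private let windowSize = 50            // ~5 seconds of samples at 10 Hz
    private let varianceThreshold = 0.0005 // arbitrary tuning value

    func start(onChange: @escaping (Bool) -> Void) {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.1 // 10 Hz
        var wasMoving = false
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Gravity contributes ~1 g to the magnitude at all times,
            // so we track the *variance* rather than the absolute value.
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            self.recentMagnitudes.append(magnitude)
            if self.recentMagnitudes.count > self.windowSize {
                self.recentMagnitudes.removeFirst()
            }
            let n = Double(self.recentMagnitudes.count)
            let mean = self.recentMagnitudes.reduce(0, +) / n
            let variance = self.recentMagnitudes
                .map { ($0 - mean) * ($0 - mean) }
                .reduce(0, +) / n
            let isMoving = variance > self.varianceThreshold
            if isMoving != wasMoving {
                wasMoving = isMoving
                // A moving → stopped transition could count as
                // "arrived at the next station" in a model like Transit's.
                onChange(isMoving)
            }
        }
    }

    func stop() { motionManager.stopAccelerometerUpdates() }
}
```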

In November 2024, Transit added a prediction model that allows it to track your train underground without GPS, using only the iPhone’s accelerometer.

The Transit team’s ability to innovate and expand to more regions around the world, all while keeping the app focused on the main feature set that it launched with 12 years ago, is remarkable. For that, and the app’s ever-beautiful design, Transit deserves to be recognized with this year’s MacStories Selects Lifetime Achievement Award.

Learn more about Transit:


The MacStories Holiday Gift Guide for the Apple Nerd in Your Life https://www.macstories.net/stories/the-macstories-holiday-gift-guide-for-the-apple-nerd-in-your-life/ Tue, 26 Nov 2024 16:58:35 +0000 https://www.macstories.net/?p=77328 With Black Friday sales in full swing and the holidays around the corner, we here at MacStories thought we’d each share gift ideas for the Apple nerd in your life. Some of these items are currently on sale, so be sure to get your shopping started and check them out soon.

Federico

UGREEN 300W 48,000mAh Battery

I love this big, chunky battery with a handle.

As I recently mentioned on Unwind and NPC, I’ve been really into the idea of gadgets that are “portable, but for the home” this year. These are accessories that are portable in the sense that they can be moved around, but you wouldn’t commute or travel with them. In this case, I was looking for a powerful battery I could place on my living room table to charge multiple devices at once, such as Silvia’s MacBook Pro and my iPad Pro, or my Legion Go and iPhone. The internal capacity of this battery ensures it can stay on for hours when charging a single device like a Steam Deck, too.

The battery comes with a front-facing display with details about its charge and in/out wattages, and it even offers an LED light on the side for illuminating your environment. Plus, if you have a 140W USB-C charger, filling it up completely doesn’t take too long. This has to be one of my favorite tech purchases this year, and I can’t recommend it enough.

UGREEN Nexode Pro 100W Charger

Speaking of UGREEN, I also like their latest 100W GaN charger. Part of the company’s Nexode line, this is a compact USB-C wall charger that can output up to 100W via its first USB-C port when used by itself. This one is actually bag- and travel-friendly, and it’s become my new default for fast-charging the iPhone and iPad Pro.

Baseus 240W USB-C Cable

Do I have any devices that can charge at 240W over USB-C yet? Absolutely not. But has that stopped me from seeking a USB-C cable that theoretically supports that charging speed? You know me. I like this cable! The nylon texture is nice, the connector feels sturdy, and it’s what I’ve been using on my nightstand for the past few months.

Satechi Mobile XR Hub

I love that Satechi made an accessory for people like me, who occasionally like to use an iPad Pro or Steam Deck with XR glasses. Basically, this compact adapter allows you to output video up to 4K over USB-C, receive USB-C charging up to 100W, and have a 3.5mm audio jack port for headphones. This is perfect for a USB-C-based handheld or tablet, and it’s replaced my previous flimsy Viture dongle for whenever I want to put on my XREAL Air glasses to play some Steam games.

Spigen Head Strap for Apple Vision Pro

This is the Vision Pro accessory Apple should have made: it’s a top strap that you combine with the Solo Knit Band on the Vision Pro to relieve pressure and weight from the top of your head. I’ve been using it for the past month and cannot recommend it enough. The strap is comfortable, it’s easy to adjust (I’ve only done it once and never touched it again), and it attaches perfectly to the headset.

TineeOwl Kickstand Cover for iPad Pro

A few weeks back, I was finally able to find the iPad Pro accessory I’d been looking for since the debut of the M4 models: a back cover with a built-in kickstand. I used a similar cover for the M2 iPad Pro last year, and the idea is the same. This accessory only protects the back of the device, doesn’t have a cover for the screen, and comes with an adjustable kickstand to use the iPad in landscape at a variety of viewing angles.

The reason I wanted this product is simple. This is not a cover I use for protecting the iPad Pro; I only want to attach it in the evening, when I’m relaxing with the device on my lap and want to get some reading done or watch some TV. In fact, this cover never leaves my nightstand. When I’m done working for the day, I leave the Magic Keyboard on my desk, bring the iPad Pro into the bedroom, and put it in the cover, leaving it there for later.

John

Anker Prime Power Bank and Base

I’ve had this chunky Anker battery and its base for about a year now, and I love it. At 27,650mAh, it’s big, but still airline-friendly. The two USB-C ports can output 140W of power when used alone, while the USB-A port can deliver 35W. Using two or three of the battery’s ports at the same time splits the power among them, but unlike many power banks, at least one of Anker’s ports always delivers 140W, which is one of my favorite features.

Anker has included a screen on the front of the battery so you can quickly see how much charge it has left and how much power is being delivered by its ports, as well as how long it will take to finish recharging. The same information can also be accessed from Anker’s iPhone app.

The base, sold separately, sits on a corner of my desk where it serves double duty. The Prime Power Bank connects to the base magnetically for recharging at 100W via the base’s pins, topping it off quickly. The touch that I appreciate most, though, is that there are two USB-C ports and one USB-A port on the side of the base, which can split the base’s 100W output if multiple devices are connected.

LOC8 MagSafe Finder Wallet and Stand

This is a pick I stole from Federico, who mentioned it several months ago on Connected. I’ve been using an Apple leather MagSafe Wallet for a long time, and there are two things I don’t like about it. The first is that it only holds three cards; I don’t like to carry a lot of cards in my wallet, and I don’t use cash, but space for three still isn’t enough. The second is that it can’t prop up my iPhone. The LOC8 wallet addresses both: its room for a fourth card goes a long way if you’ve ever been frustrated by Apple’s MagSafe wallets, and it opens up so it can be used as a kickstand, too. And, of course, the LOC8 also has Find My support built in.

Miyoo Mini+

There are a lot of great gaming handhelds out there now, but of all the ones we’ve covered on NPC: Next Portable Console, the Miyoo Mini+ is my favorite to give to someone as a gift if they haven’t tried game emulation before. The overall build quality of the Mini+ is better than its competitors, and the 3.5” screen is big enough without compromising its pocketability. Plus, there’s a vibrant community of developers creating alternative operating systems for the Miyoo Mini+, which is perfect for anyone who likes to tinker with their devices without getting too deep in the weeds. The Mini+ can’t handle the current generation of console games, but if you know someone who loves the early classics like the NES, any device from the Game Boy lineup, SNES, PS1, and others, the Miyoo Mini+ is an excellent choice.

Foto&Tech Multipurpose Extra Thick Elastic Cable Tie and Organizer

These elastic ties are my stocking stuffer pick. Each tie is a simple loop of elastic with a toggle clasp attached, allowing you to wrap it around just about anything and cinch the elastic tight. They’re the sort of thing you don’t know you need until you start using them – at which point, if you’re like me, you’ll find more and more uses for them all the time. By now, I think I’ve bought four packs to bundle cables together, manage cables under my desk, securely attach power supplies to other items, and more. It’s not a flashy gift, but they come in a variety of colors, so you can pick one that matches your gift recipient’s style.

Jonathan

Klein Tools Ratcheting Modular Data Cable Crimper and Network LAN Cable Tester

Getting great Internet throughout your home is essential. The best way to achieve this is by setting up multiple wireless access points (WAPs) throughout your house and connecting them to your router via Ethernet. That way, the speed from each WAP will be as good as possible.

Running Ethernet throughout your home will take some time, but the result is very satisfying. One particularly satisfying part of the process is cutting your Ethernet cables and attaching connectors to their ends. However, since you want these connections to be around for a long time, you must ensure it’s done perfectly. This is where the cable crimper and cable tester from Klein Tools come in. The cable crimper cuts and strips the cable, then crimps the connector onto each end, all using the same squeezing action. Then, you can connect the cable tester to each end of the cable to check that the connection is strong and that you haven’t accidentally cut through a wire.

I used these models for running Ethernet in my house, and they were an irreplaceable part of the job. If you’re thinking of doing the same around your home, these are must-haves.

Anker Two-in-One USB-C to USB-C Cable

I’m sure most of you reading this have many cables throughout your homes, ready to charge all the various devices you and your family own. In our constant state of charging, we often need to plug in multiple devices at once, or charge a device at the same time as another family member does theirs.

You could, of course, use two plugs with two cables or a plug with two ports and two cables, but that gets messy fast. One solution is a charging cable that splits in two towards the end. There are tons of cheap options that do this, but it’s better to get one from a respected brand that delivers plenty of power to both devices, so you’re not significantly slowing down charging speeds. Anker fits that bill with their lovely, braided two-in-one USB-C to USB-C cable. It can carry a charge of up to 140W, enough to power even a MacBook Pro.

JOBY GripTight ONE Micro Stand

We’ve all been there: you’re out with friends or family, and you want to take a picture of the whole group. Sure, one member of the group could take the picture, but then they wouldn’t be in it. You could get one person to take a selfie, but then there would be one massive head at the front of the picture. You could ask a passerby, but they could always run off with your phone – or worse, take a bad picture.

Using a tripod is the right answer, but what if you don’t want to lug one around all day? I found the ideal solution for this a couple of years ago with this micro – and it is micro – stand from JOBY. When folded up, this stand genuinely fits in your tiny jeans pocket (you know the one), so even if you don’t use it, it’s barely taking up space on your person. If you do need to use it, you’ll get lots of envious comments and a great group picture.

Satechi USB-C Apple Watch Charger

If you’re someone who likes to sleep with their Apple Watch on, finding time during the day to charge it up can be a whole planning process in and of itself. When you’re at home, there are plenty of chances while you’re working or watching TV, but if you’re out and about all day, it’s another story. That’s where this Apple Watch charger from Satechi comes in handy. It’s essentially the puck end of the regular charger, but with no cable. This allows you to plug it into your laptop’s USB-C port or a battery pack you’re carrying with you. It’s so tiny that you can grab it whenever you like or, as I do, carry it permanently in a side pocket of your bag.

KU XIU Foldable Three-in-One Charging Station

There is a seemingly endless array of multi-device portable chargers out there. Two-in-one, three-in-one, and even four-in-one variations are available if you’re looking. This one from KU XIU is a great choice, and there’s something particular about it that I love.

You see, when I travel with my wife, we both need to charge our iPhones overnight. We could bring two cables or two chargers, but that adds to the mess. What this three-in-one charger has – along with a place to magnetically charge your iPhone at up to 15W and your Apple Watch at up to 5W – is a spot for charging your AirPods wirelessly at 5W. This isn’t advertised at all, but with that spot, you can also charge a second iPhone.

It charges slowly, but overnight, that doesn’t matter. My wife and I can carry this one foldable charger and one USB-C cable when we travel, and we’re all sorted. Charging stations for couples aren’t a thing, and I think they should be, but in the meantime, this device fits the bill just right.

Anker 15W 6,600mAh Battery Pack

Some products just do what they say on the tin; this is one of those. Anker has a reputation for making excellent charging solutions, and the company’s MagSafe-compatible power banks are, in my opinion, the best in the business.

If you’re out all day, carrying a battery pack is always advisable, just in case your iPhone gets low on power. Anker has a wide range of options in this field, and this is a great one. It holds almost a full iPhone Pro Max charge, which is more than enough for one day, and you can also plug accessories into the USB-C port to charge them. The battery pack even features a tiltable stand to view your iPhone in StandBy mode while working at a coffee shop or another temporary location.

Nettbe Reusable Velcro Cable Ties

I have about 30 cables of various kinds in my tech supplies drawer. Without proper organization, it would be a nightmare to find anything. The key to my organization is using these simple, strong, and unassuming cable ties. Quite simply, they are double-sided hook and loop strips with a hole cut in them. This allows you to loop the end through the hole and pull it tightly around a cable before wrapping it fully to secure the cable even more. I love these.

Devon

Nanoleaf Essentials Smart LED Color-Changing Light Bulb

Smart lighting is a great way to get started with a smart home setup or easily enhance an existing one. These bulbs plug into standard light sockets, support Matter, and connect to HomeKit via Thread. From the Home app, you can adjust the brightness, temperature, and color of each bulb, or you can group them with the company’s other lighting products in the Nanoleaf app to create coordinated multicolor scenes.

I’ve put one of these in every lamp in my house. Having granular control over each lighting fixture has not only been handy, but a lot of fun, too. They’re relatively inexpensive and also come in a four-pack. Though they don’t require a proprietary hub, you will need some sort of Thread border router – like a HomePod mini, second-generation HomePod, Apple TV (Wi-Fi + Ethernet), or other fixture from Nanoleaf with border router functionality – to connect them.
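
For tinkerers, the same granular control is also available programmatically. Here’s a brief, hedged sketch using Apple’s HomeKit framework to dim a bulb – it assumes an app with the HomeKit entitlement, and “Bedroom Lamp” is a placeholder for whatever you named your accessory.

```swift
import HomeKit

// Sketch: dim a named HomeKit bulb to 40% brightness.
final class LampController: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self // homes load asynchronously
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.homes.first,
              let lamp = home.accessories.first(where: { $0.name == "Bedroom Lamp" })
        else { return }
        // Search the accessory's services for the standard brightness characteristic.
        let brightness = lamp.services
            .flatMap { $0.characteristics }
            .first { $0.characteristicType == HMCharacteristicTypeBrightness }
        brightness?.writeValue(40) { error in
            if let error { print("Failed to set brightness: \(error)") }
        }
    }
}
```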

AstroAI Tire Inflator Portable Air Compressor

Ever since I bought this tiny tire inflator based on a recommendation from Quinn Nelson (aka Snazzy Labs) for a similar product a few years ago, I’ve considered it a must-have for any car owner. When I see an indicator that one of my car’s tires is low on pressure, I don’t have to go looking for a shop or gas station with an air compressor. I simply pull this out of my trunk, hook it up to one of my car’s power outlets and to the tire, and turn it on. It’s saved me on multiple occasions.

I won’t be caught driving without one. I’ve even put one in my wife’s car and given it out as a gift a time or two. (Don’t tell my family, but I’m bringing one to Dirty Santa this year.) It’s an affordable tool that you won’t need often – but when you do, you’ll be very thankful to have it.

Anker MagGo Magnetic Charging Station

This tiny, quirky charging station makes me happy for a few reasons. First of all, it looks like a HomePod mini and takes up only slightly more space than the smart speaker. Secondly, it’s perfect for charging your iPhone wirelessly at 15W, either in vertical orientation for seeing your Lock Screen widgets or in horizontal orientation for StandBy. Finally, it features a ton of power options in such a small package, including three AC plugs, two USB-C ports, and two USB-A ports. And when you’re using a single USB-C port, the device can output up to 67W. For desk situations where power is limited, this is a great way to expand your options without sacrificing much space.

Sigmund

CanaKit Raspberry Pi 5 Starter Kit PRO

Raspberry Pi originally made its name in education but quickly gained a reputation as the best budget-conscious, low-powered tinkering computer ever created. With over 60 million units now sold, there’s a huge community out there creating projects that include everything from AI-powered robots and travel streaming rigs to recreating the Macintosh 128K and connecting a Macintosh SE to the Internet. I’m using my Raspberry Pi to run Home Assistant alongside Apple Home, supercharging my Apple TVs in a variety of ways: based on playback state, the currently active app, and more, it can trigger lights, blinds, and even shortcuts on my iOS devices.
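
As a taste of how that kind of bridging works: Home Assistant exposes a documented REST API, so anything that can make an HTTP request – a shortcut, a script, an app – can drive it. Here’s a minimal Swift sketch that calls the standard `light/turn_on` service; the host name, access token, and entity ID are placeholders for your own setup.

```swift
import Foundation

// Minimal call to Home Assistant's REST API.
// Placeholders: host, long-lived access token, and entity ID.
func turnOnLight(entityID: String) async throws {
    let url = URL(string: "http://homeassistant.local:8123/api/services/light/turn_on")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("Bearer YOUR_LONG_LIVED_ACCESS_TOKEN", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: ["entity_id": entityID])
    let (_, response) = try await URLSession.shared.data(for: request)
    guard (response as? HTTPURLResponse)?.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
}

// Usage: try await turnOnLight(entityID: "light.living_room")
```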

One For All Apple TV Replacement Remote

Before Apple’s highly regarded Siri Remote redesign in 2021, Universal Electronics collaborated with Apple on an alternative remote to appease the many cable company executives who wanted something more conventional to hand to customers transitioning from traditional cable and satellite boxes to the Apple TV. Now with Siri onboard, UEI’s second-generation Apple TV Remote became available for purchase earlier this year, sporting clicky navigation buttons; enhanced controls for live TV, including a dedicated program guide and channel up/down buttons; and backlit keys activated by an accelerometer and ambient light sensor. It’s long been my go-to for late-night Apple TV binges in the dark, and it comes highly recommended.

EZCOO 8K HDMI 2.1 Splitter

EZCOO’s top-of-the-line 8K HDMI 2.1 splitter is something of a video marvel. Not only does it split a video signal between two displays and bypass High-bandwidth Digital Content Protection (HDCP) to display on older televisions, but it can also downscale 4K content to 1080p and send Dolby Vision’s superior video package to HDR10 displays thanks to its onboard Extended Display Identification Data (EDID) toggles that trick the HDMI source into sending signals it otherwise wouldn’t. From sending Dolby Vision to my projector to mirroring my Apple TV screen – via iPad and a capture device – to Apple Vision Pro for weekly FaceTime watch parties, this is by far my favourite video purchase of the year.

Cook’n’Escape Reusable Titanium Straw

My Apple Vision Pro journey over the past 10 months saw me move away from using the light seal over the summer, only to embrace it again once the rollout of Apple Immersive Video titles started to pick up. The one downfall of wearing Apple Vision Pro the way Apple envisioned it to be used is that it’s almost impossible to drink out of a mug or cup without fear of damaging its glass front or spilling your drink into your lap on a flight. That’s why I keep at least one reusable metal straw in my Apple Vision Pro travel case at all times – and why you probably should too.

Severance Season One Original Soundtrack on Vinyl (Innie Edition and Outie Edition)

While we still await official Severance merchandise from IMG and Fifth Season, last year saw Mondo Music release Theodore Shapiro’s incredible score on 12” vinyl in two limited editions. The “Outie” edition is limited to an initial pressing of 5,000 units on 140g white vinyl and comes in a sleeve with artwork from Greg Ruth and a 3/4 sleeve jacket. The more costly “Innie” edition – also limited to 5,000 – is pressed on 140g black vinyl and housed in a Lumon Industries folder that also includes a music dance experience card, record safety card, Eagan bingo sheet, Lumon disco bag inner sleeve, map of the severed floor drawn by Petey, and Lumon employee cards. For more insight into the soundtrack, you can listen to my conversation with composer Theodore Shapiro on a very special episode of Magic Rays of Light from 2022.


What It’s Like to Hear Better with AirPods Pro 2 https://www.macstories.net/stories/what-its-like-to-hear-better-with-airpods-pro-2/ Tue, 29 Oct 2024 12:40:12 +0000 https://www.macstories.net/?p=77074

I’ll admit, I was a little apprehensive about taking Apple’s hearing test. I’ve spent my fair share of time at loud concerts and shouting above the din in crowded bars, so I was fairly certain the test would show my hearing isn’t what it once was. The question was, “How bad is it?” With the release of iOS 18.1 and an update to the AirPods’ firmware, I set out to find out.

The AirPods Pro 2 didn’t get a hardware update this fall, but they may as well have, because the new hearing features they received via a software update add a whole new dimension to them. We’ve written about the hearing features before. They include three components:

  • hearing protection, which lowers the volume of sounds that could potentially damage your hearing;
  • a hearing test to check whether you have hearing loss in either ear; and
  • a hearing assistance feature to boost frequencies where you’ve lost hearing.

Despite covering these features previously, I didn’t have a good feel for what they were like until I tried them myself, so I thought I’d share the process here and encourage others who can to give these new hearing features a try, too, because they work remarkably well.

Knowing that my hearing isn’t great made diving into Apple’s new hearing features both easier and harder in some ways. On one hand, I was eager to see if AirPods Pro could really make me hear better. On the other hand, I was concerned my hearing might be worse than I imagined and require something more.

Fun facts about my audio environment.

To get over the hump of my apprehension, I built up to the hearing test by opening the Health app and poking around in the Hearing section a bit. I didn’t learn a lot, except that it turns out the AirPods’ hearing protection feature first kicked in Sunday, lowering a sound by a few decibels. It may have happened when I was banging dishes around as I unloaded the dishwasher, but I’m not sure. I also discovered that October has been a little quieter for me than September and that this year has been a little quieter than last. Both are good things, I suppose.

Setting up a hearing test.

With that out of the way, I sat at the kitchen table with my iPhone and opened the hearing test in the Health app. The test walks you through checking that your environment is quiet and that your AirPods Pro 2 fit well before testing. You even have a chance to listen to an example tone before you start so you know what to expect.

More setup.

Ready to go.

You can tell that a lot of thought went into the process of preparing someone for the test. The steps were simple and clear, putting me at ease with what to expect by the time the actual test started. While you’re testing, your iPhone is automatically switched to Do Not Disturb mode, too, so you won’t be interrupted.

When the time came to begin testing one ear and then the other, I focused and listened carefully, trying not to psyche myself into hearing sounds that weren’t there. The test doesn’t take long, but it feels like a long time as you sit in silence tapping a circle each time you hear the test’s intermittent chimes.

When my left ear’s test was complete, the Health app told me that it had failed because it was interrupted by environmental sounds. Sure enough, a truck rolled down the street beneath my window near the end of the test, and while it wasn’t that loud, it was enough. So I retreated to my bedroom, away from the street and another floor off ground level.

Test complete.

That did the trick. I completed the test for both of my ears and found that, while I have some hearing loss as I suspected, it is less than I imagined. According to the test, I have mild loss in my left ear and little to no loss in my right ear. The app then prompted me to set up the hearing assistance feature, which amounted to tapping a button and reading about how the feature might take a while to get used to.

I spent most of my day at home alone, so I didn’t think I’d have a real opportunity to test the effect of hearing assistance until later. But then, I stood up, and boy, was I wrong. I had been sitting on my bed during the test, and when I got up, I noticed the difference the moment my feet touched the floor. As I walked across the room, every footstep was amplified and better defined.

Everything I did sounded richer and fuller. I sat at my desk and started typing, and I could hear the keys on the keyboard clacking more than ever. I made lunch, and the sounds of the packaging of the food I pulled out of the refrigerator were more noticeable. Then I started up a podcast episode that I had paused earlier in the day and immediately turned the volume down a couple of notches because the hearing assistance feature helps you hear media better too.

Managing the hearing assistance feature from Control Center.

That’s cool and all, but if you’re wondering what the feature is like out in the world, the answer is that the difference is equally profound. To wrap this story up, I walked down to my local coffee shop to finish writing and editing. As I approached the front door of the shop, my instinct was to pull out my AirPods and drop them in their case before I placed my order, but I resisted. Instead, I walked up to the counter and ordered a drink from a soft-spoken barista. Not only could I hear the barista loud and clear, but I overheard a couple sitting at a table nearby, and when I headed outside to work for a bit, I could clearly hear nearby conversations, all against the backdrop of kids playing noisily at an adjacent park.

My brief experience with hearing assistance has been wild and surprising in a very good way. I’m fortunate that my hearing loss is mild. At the same time, though, the difference from using hearing assistance is so noticeable that I don’t want to take my AirPods out of my ears. I’m sure the novelty of hearing assistance will wear off and become the ‘new normal,’ but I won’t forget the experience of those first hours using it. It was remarkable.

Yesterday, Apple published a fascinating press release about how the company developed these hearing features. In it, Apple notes that 1.5 billion people experience some degree of hearing loss. That’s a lot of people, and many of them are untreated like me. My hearing isn’t so bad that I’ve ever felt like I should go to an audiologist, but by the same token, here I am writing this while listening to music and soaking in the sounds of my local coffee shop on a beautiful fall day, and it’s all crystal clear and great.

Give the hearing test a try. You won’t regret it.


iPad mini Review: The Third Place https://www.macstories.net/stories/ipad-mini-review-the-third-place/ Tue, 22 Oct 2024 13:00:14 +0000 https://www.macstories.net/?p=76970

The new iPad mini.

My first reaction when I picked up the new iPad mini last Thursday morning was that it felt heavier than my 11” iPad Pro. Obviously, that was not the case – it’s nearly 150 grams lighter, in fact. But after several months of intense usage of the new, incredibly thin iPad Pro, the different weight distribution and the thicker form factor of the iPad mini got me for a second. Despite being “new”, compared to the latest-generation iPad Pro, the iPad mini felt old.

The second thing I noticed is that, color aside, the new iPad mini looks and feels exactly like the sixth-generation model I reviewed here on MacStories three years ago. The size is the same, down to the millimeter. The weight is the same. The display technology is the same. Three minor visual details give the “new” iPad mini away: it says “iPad mini” on the back, it’s called “iPad mini (A17 Pro)” on the box, and it’s even called “iPad mini (A17 Pro)” (and not “iPad mini (7th generation)”) in Settings ⇾ General ⇾ About.

I’m spending time on these minor, largely inconsequential details because I don’t know how else to put it: this iPad mini is pretty much the same iPad I already reviewed in 2021. The iPadOS experience is unchanged. You still cannot use Stage Manager on any iPad mini (not even when docked), and the classic Split View/Slide Over environment is passable, but more constrained than on an iPad Air or Pro. I covered all these aspects of the mini experience in 2021; everything still holds true today.

What matters today, however, is what’s inside. The iPad mini with A17 Pro is an iPad mini that supports Apple Intelligence, the Apple Pencil Pro, and faster Wi-Fi. And while the display technology is unchanged – it’s an IPS display that refreshes at 60 Hz – the so-called jelly scrolling issue has been fixed thanks to an optimized display controller.

As someone who lives in Italy and cannot access Apple Intelligence, that leaves me with an iPad mini that is only marginally different from the previous one, with software features coming soon that I won’t be able to use for a while. It leaves me with a device that comes in a blue color that isn’t nearly as fun as the one on my iPhone 16 Plus and feels chunkier than my iPad Pro while offering fewer options in terms of accessories (no Magic Keyboard) and software modularity (no Stage Manager on an external display).

And yet, despite the strange nature of this beast and its shortcomings, I’ve found myself in a similar spot to three years ago: I don’t need this iPad mini in my life, but I want to use it under very specific circumstances.

Only this time, I’ve realized why.

Special Episode of AppStories, Now with Video Too

We recorded a special episode of AppStories all about the new iPad mini and my review. As usual, you can listen to the episode in any podcast player or via the AppStories website.

Today, however, we’re also debuting a video version of AppStories on our YouTube channel. Going forward, all AppStories episodes will be released in audio and video formats via standard RSS feeds and YouTube, respectively.

AppStories debuted in 2017, and with over 400 episodes recorded, it’s long past due for a video version.

It’s safe to say that bringing AppStories to YouTube is a good sign that our YouTube channel has graduated from an experiment to a full-fledged component of MacStories. If you haven’t subscribed to the channel yet, you can check it out and subscribe here. The channel also includes:

  • the video versions of Comfort Zone and NPC: Next Portable Console;
  • podcast bonus material for NPC;
  • audio versions of Ruminate, Magic Rays of Light, and MacStories Unwind;
  • playlists of classic AppStories episodes; and
  • a growing collection of MacStories videos.

I hope you’ll enjoy the video version of AppStories. You can find our YouTube channel here.

Setting Up the iPad mini

Apple sent me a review unit of the blue iPad mini with 512 GB of storage (a new tier this year), alongside an Apple Pencil Pro and a denim Smart Folio. The denim color looks alright; the blue color of the iPad mini is, frankly, a travesty. I don’t know what it is, exactly, that pushes Apple every so often to release “colors” that are different variations of gray with a smidge of colored paint in the mix, but here we are. If you were expecting an ultramarine equivalent of the iPad mini, this is not it.

One is ultramarine; the other is “blue”.

The iPad mini (right) is visibly thicker than the M4 iPad Pro.

Something that surprised me when I started setting up the iPad mini was the absence of any developer or public beta program to install iPadOS 18.1 on it. My iPad Pro was running the iPadOS 18.1 developer beta, so while I was able to migrate my iCloud account and system settings, I couldn’t restore from a backup because iPadOS 18.1 wasn’t available for the new iPad mini at all last week. That was unusual. I’ve reviewed my fair share of iPads over the years; traditionally, Apple releases a specific version of their current betas for members of the press to install on their review units.

With this iPad mini, I had to start from scratch, which I decided to use to my advantage. It gave me an opportunity to investigate some questions. How would I set up the iPad mini as a companion device to my iPhone 16 Plus and 11” iPad Pro in 2024? In a world where the Vision Pro also exists as a personal, private device for entertainment, what role would an iPad mini specifically set up for “media consumption” fill in my life?

And the biggest question of all: would there even be a place for it at this point?

The Role of the iPad mini

Look at any marketing webpage or press release about the iPad mini, and you’ll see that Apple is more than eager to tell you that doctors and pilots love using it. My last experiences in those fields were, respectively, Pilotwings on the Super Nintendo and Trauma Center on the Nintendo DS. I’m fairly certain that curriculum wouldn’t qualify me as an expert in either profession, so those iPad mini use cases aren’t something I can review here.

I can write about two things: how this iPad mini compares to the previous one from a hardware perspective and, more broadly (and more interestingly for me), what role the iPad mini may fill in 2024 in a post-OLED iPad Pro, post-Vision Pro Apple ecosystem.

What’s Different: Better Wi-Fi, Apple Pencil Pro, and No More “Jelly Scrolling”

As I mentioned above, the new iPad mini comes with the A17 Pro chip, and since it’ll need to power Apple Intelligence, it also now offers 8 GB of RAM – the bare minimum needed to run AI models on Apple’s platforms these days. I haven’t been able to test Apple Intelligence on the iPad mini, so all I can say is that, yes, the new model is just as fast as the iPhone 15 Pro was last year. For the things I do with an iPad mini, I don’t need to run benchmarks; whether it’s watching YouTube videos, browsing Mastodon, or reading articles in GoodLinks, there’s never been a single time when I thought, “I wish this iPad mini was faster”. But then again, the old iPad mini was fine, too, for the basic tasks I threw at it.

Where I did notice an improvement was in the Wi-Fi department. Thanks to its adoption of Wi-Fi 6E (up from Wi-Fi 6), the new mini benchmarked higher than the old model in speed tests and, funnily enough, came in slightly higher than my M4 iPad Pro as well. From the same spot in the living room in close proximity to my Wi-Fi 6E router, the three iPads performed speed tests at the following rates across multiple tests:1

  • Old iPad mini (Wi-Fi 6): 600 Mbps down, 200 Mbps up
  • M4 iPad Pro (Wi-Fi 6E): 643 Mbps down, 212 Mbps up
  • New iPad mini (Wi-Fi 6E): 762 Mbps down, 274 Mbps up

As you can see, the 6 GHz band helps the Wi-Fi 6E-enabled devices, resulting in excellent performance for bandwidth-intensive streaming applications. For example, I used my iPad mini to stream Astro Bot from my PS5 using the MirrorPlay app, and it was rock solid, on par with the latest iPad Pro. That wasn’t the case last year, when I tried to stream games locally with the iPad mini and iPad Pro paired with a G8 game controller.

The new iPad mini running Astro Bot from my PS5. I'm using the [GameSir G8 Plus controller](https://amzn.to/4eTYhXi) here.

For this change alone, I would make the case that if you’re looking for a compact tablet for videogame streaming (whether locally or over the Internet), the new iPad mini is a very compelling package. In fact, I’d argue that – display technology considerations aside – the iPad mini is the ideal form factor for a streaming companion device; it’s bigger than a phone, but not as heavy as an 11” tablet.

The other change in this iPad mini is support for the Apple Pencil Pro. Apple has been (rightfully) criticized over the past year for some of its confusing updates to the Apple Pencil lineup, but with this iPad mini, it feels like the company is now telling a clear narrative with this accessory. The new iPad mini supports two Apple Pencil models: the entry-level Apple Pencil with USB-C and the Apple Pencil Pro. This means that, as of late 2024, the iPad mini, Air, and Pro all support the same Apple Pencil models with no perplexing exceptions.

The iPad mini paired with the Apple Pencil Pro.

Now, you know me; I’m not a heavy user of the Apple Pencil. But I do think that the Pencil Pro makes for a really interesting accessory to the iPad mini, even when used for non-artistic purposes. For instance, I’ve had fun catching up on my reading queue while holding the mini in my left hand and the Pencil Pro in my right hand to quickly highlight passages in GoodLinks. Thanks to the ability to run a custom shortcut by squeezing the Pencil Pro, I’ve also been able to quickly copy an article’s link to the clipboard just by holding the Pencil, without needing to use the share sheet. The iPad mini also supports Apple Pencil Hover now, which is, in my opinion, one of the most underrated features of the Apple Pencil. Being able to hover over a hyperlink in Safari to see where it points to is extra nice.
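
Federico’s setup uses the system-level option that runs a shortcut on a squeeze, but developers can also respond to squeezes directly via UIKit’s UIPencilInteraction API (iOS 17.5 and later). Here’s a hedged sketch – not GoodLinks’ actual code – of a reader view copying the current article’s URL on a squeeze; `currentArticleURL` is a hypothetical stand-in for whatever the app is displaying.

```swift
import UIKit

// Sketch: copy the current article's link when the Pencil Pro is squeezed.
final class ReaderViewController: UIViewController, UIPencilInteractionDelegate {
    var currentArticleURL: URL? // hypothetical: the article being read

    override func viewDidLoad() {
        super.viewDidLoad()
        // Register for Apple Pencil interactions on this view.
        let interaction = UIPencilInteraction()
        interaction.delegate = self
        view.addInteraction(interaction)
    }

    func pencilInteraction(_ interaction: UIPencilInteraction,
                           didReceiveSqueeze squeeze: UIPencilInteraction.Squeeze) {
        // Act once, when the squeeze gesture completes.
        guard squeeze.phase == .ended, let url = currentArticleURL else { return }
        UIPasteboard.general.url = url // no share sheet needed
    }
}
```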

With a squeeze of the Pencil Pro, I can instantly copy the URL of what I’m reading in GoodLinks (left).

None of these features are new (they’ve been supported since the new iPad Pros in May), but they feel different when the iPad you’re using is so portable and lightweight you can hold it with one hand. The Pencil Pro + iPad mini combo feels like the ultimate digital notepad, more flexible than the Apple Pencil 2 ever was thanks to the new options offered by the Pro model.

Silvia's much better handwriting with the Apple Pencil Pro, iPad mini, and [Steve Troughton-Smith's upcoming Notepad app](https://mastodon.social/@stroughtonsmith/113343155927102394).

We now come to the infamous phenomenon known as “jelly scrolling”. If you recall from my review of the iPad mini three years ago, this is not something I initially noticed, and I don’t think I was alone. However, once my eyes saw the issue one time months later, it ruined my experience with that display forever.

For those unaware, jelly scrolling refers to a display issue where, in portrait orientation, scrolling a page would result in one half of the screen “moving” more slowly than the other. It could go unnoticed for months if you weren’t paying attention or your eyes simply weren’t seeing it, but once they did, you’d see a jelly-like effect onscreen with the two halves of the display sort of “wobbling” as you scrolled. There are plenty of videos that demonstrate this effect in motion, and as I said, it was more of a, “Once you see it, there’s no way to unsee it,” sort of problem. When my eyes picked up on it months after the review, it bothered me forever that I didn’t mention it in my original story.

I’m happy to report that, in the new iPad mini, the jelly scrolling issue has been fixed without the need to change the underlying display technology of the device. The new iPad mini has an optimized display controller that ensures the entire panel will refresh at the same rate and speed. For this reason, even though it’s the same display across two generations with the same refresh rate, color gamut, pixel density, and brightness, the new iPad mini does not have one side of the screen that refreshes more quickly than the other.

There’s an argument to be made that a tablet that costs $500 in 2024 should have a refresh rate higher than 60 Hz. I’d argue that the same is true for the iPhone 16 lineup: ideally, Apple should raise the baseline to 90 Hz and keep ProMotion at 120 Hz exclusive to Pro devices. However, as someone who uses the iPhone 16 Plus as his iPhone of choice, it would be hypocritical of me to say that the 60 Hz display of the iPad mini is a dealbreaker. This device doesn’t have the same fancy display as my 11” iPad Pro, but for what I want to use it for, it’s fine.

Would I prefer an “iPad mini Pro” with OLED and ProMotion? Of course I would love that option. But with jelly scrolling out of the equation now, I’m fine with reading articles, watching videos, and streaming games on my iPad mini at 60Hz.


There are other hardware changes in the iPad mini I could mention, but they’re so minor, I don’t want to dwell on them for too long. It now has Bluetooth 5.3 onboard instead of Bluetooth 5.0. The iPad mini, like the Pro and Air models, has switched to eSIM only for cellular plans, which means I have one fewer physical component to worry about. And the USB-C port has graduated from USB 3.1 Gen 1 speeds (5 Gbps) to USB 3.1 Gen 2 (10 Gbps), which results in faster file transfers. However, I don’t plan on using this iPad for production work that involves transferring large audio or video files (we have a YouTube channel now), so while it’s welcome, this is a change I can largely ignore.
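
To put the port upgrade in perspective, here’s some rough, best-case arithmetic (my napkin math, not Apple’s numbers) for moving a large video export over each generation of the port:

```swift
// Best-case transfer times for a 50 GB file, ignoring protocol overhead.
let fileGigabits = 50.0 * 8  // 50 GB expressed in gigabits
print(fileGigabits / 5)      // USB 3.1 Gen 1 at 5 Gbps: 80 seconds
print(fileGigabits / 10)     // USB 3.1 Gen 2 at 10 Gbps: 40 seconds
```

Real-world speeds will be lower on both ends, but the 2× ratio is the point.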

The Most Compact Tablet for (Occasional) Split View

Over the past three years, I’ve gotten fixated on this idea: the iPad mini isn’t a device I’d recommend for multitasking, but it is the most compact Apple computer you can have for the occasional Split View with two almost iPhone-sized apps side by side. I don’t use this device for serious, multi-window productivity that involves work tasks. But I’ve been surprised by how many times I found myself enjoying the ability to quickly invoke two apps at once, do something, and then go back to full-screen.

Split View on the iPad mini. The Ivory + Spotify combo is something I do almost every day when I’m done working.

Or, let me put it another way: the iPad mini fills the multitasking gap left open by my iPhone and the absence of a foldable iPhone in Apple’s lineup. Even on my 16 Plus, there are times when I wish I could use, just for a few seconds, two iPhone apps in vertical Split View. The iPad mini is the only Apple device I can hold with one hand while also using Split View or Slide Over. And there’s something to be said about that option when you need it.

When I’m unwinding at the end of the day, sometimes I like to put a YouTube video on one side of the screen and keep Mastodon open on the other. The iPad mini lets me do it. Or maybe I want to keep both my Ivory and Threads timelines open at the same time because some live event is going on. Or perhaps I just want to keep GoodLinks or Safari open and invoke Quick Notes to jot down an idea I had while reading. These aren’t highly complex, convoluted tasks; they’re simple workflows that benefit from the ability to split the screen or summon a temporary window. The iPad mini is the best device Apple makes for this kind of “ephemeral multitasking”.

Slide Over is equally useful on the iPad mini, especially because I can invoke it with just my thumb.

And don’t forget: Slide Over comes with its own window picker, too!

In a post-Stage Manager world, there’s something about the reliability of Split View and Slide Over that I want to publicly acknowledge and appreciate. I briefly mentioned this in the story I wrote about the making of my iOS and iPadOS 18 review: for the past three months, I’ve only used Stage Manager when I connect my iPad Pro to an external display. When I’m working on the iPad by itself, I no longer use Stage Manager and exclusively work in the traditional Split View and Slide Over environment instead.

The iPad mini is not an ideal multitasking machine. It doesn’t support Stage Manager, three-column app layouts aren’t available by default2, and apps in Split View can become so small that they feel like slightly wider iPhone apps. And yet, there is something so nice and – as I argued three years ago – delightful about controlling Split View multitasking with your thumbs as you hold the device in landscape that it’s hard to convey unless you try it.

Most iPad mini reviews, including mine from 2021, typically focus on the media consumption aspect of the device. And I’ll get to that before I wrap up. What I’m trying to say, however, is that I no longer buy the argument that you’d “never” want to multitask on such a small display. I’ve found tangible, practical benefits in the ability to “consume content” while doing something else on the side. This doesn’t mean that I’m going to write my next longform essay on the iPad mini. It means that multitasking is a spectrum, and I love how the mini lets me dip in and out of multiple apps in a way that the iPhone still doesn’t allow for.

The Third Place

My Ayn Odin 2 Mini, Vision Pro, Steam Deck OLED, and iPad mini.

As I used the new iPad mini last week, I was reminded of a PlayStation 2 advertising campaign from 2000 to promote the launch of Sony’s new console. The campaign, called “The Third Place”, featured a commercial directed by David Lynch, along with others often wrongly attributed to him that play on a similar theme.

The concept behind these eerie, cryptic commercials is actually quite fascinating and rooted in history. In sociology, there’s this concept of a third place, which represents a social environment separate from a person’s home (their first place) and workplace (their second place). Examples of “third places” include coffee shops, parks, theaters, clubs – places where people go to socialize, hang out, and ground themselves in a different reality that is socially and physically separate from what they do at home and what they do at work. In Ancient Greece, the agora was a classic example of a third place. The lines get blurry in our modern society when you consider places that can be work and social environments at once, such as co-working spaces, but you get the idea.

With their ad campaign (created by TBWA, an agency familiar to Apple users), Sony wanted to position the PS2 as an escapist device to find your third place in the boundless possibilities provided by the digital worlds of videogames. Truth be told, when I saw those commercials as a kid (I was 12 in 2000), I just thought they were cool because they were so edgy and mysterious; it was only decades later that I was able to appreciate the concept of a third place in relation to gaming, VR, and everything in between.

I’ve been thinking about the idea of a third place lately as it relates to the tech products we use and the different roles they aim to serve.

The way I see it, so many different devices are vying for the third place in our lives. We have our phones, which are, in many ways, the primary computers we use at home, to communicate with others, to capture memories of our loved ones and personal experiences; they are an extension of ourselves, and, in a sense, our first place in a digital world. We have our computers – whether they’re traditional laptops, modular tablets, or desktops – that we use and rely on for work; they’re our second place. And then there’s a long tail of different devices seeking to fill the space in between: call it downtime, entertainment, relaxing, unwinding, or just doing something that brings you joy and amusement without having to use your phone or computer.

For some people, that can be a videogame console or a handheld. For others, it’s an eBook reader. Or perhaps it’s a VR headset, a Vision Pro, or smart glasses that you can wear to watch movies or stream games. Maybe it’s an Apple TV or dedicated streaming device. Just like humans gravitate toward a variety of physical third places to spend time and socialize, so can “third place devices” coexist with each other in a person’s time separate from their family or work obligations. This is why, for most people, it’s not uncommon to own more than one of these devices and use them for different purposes. We’re surrounded by dozens of potential digital third places.

The tech industry has been chasing this dream (and profitable landscape) of what comes after the phone and computer for decades. In Apple history, look no further than Steve Jobs’ introduction of the original iPad in 2010, presented as a third device in between a Mac and iPhone that could be “far better” at key things such as watching videos, browsing your photos, and playing games. When a person’s primary computer is always in their pocket (and unlikely to go away anytime soon), and when their work happens on a larger screen, what other space is there to fill?

When I started looking at these products through this lens, I realized something. The iPad mini is the ideal third place device for things I don’t want to do on my iPhone or iPad Pro. By virtue of being so small, but bigger than a phone, it occupies a unique space in my digital life: it’s the place I go to when I want to read a book, browse the web, or watch some videos without having to be distracted by everything else that’s on my phone, or be reminded of the tasks I have to do on my iPad Pro. The iPad mini is, for me at least, pure digital escapism disguised as an 8.3” tablet.

From this perspective, I don’t need the iPad mini to run Stage Manager. I don’t need it to have a ProMotion display or more RAM. I don’t need it to be bigger or come with a Magic Keyboard. I need it, in fact, to be nothing more than it currently is. I was wrong in trying to frame the iPad mini as an alternative to other models. The iPad mini would lose the fight in any comparison or measurement against the 11” iPad Pro.

But it’s because of its objective shortcomings that the iPad mini makes sense and still has reason to exist today. It is, after all, a third device in between my phone and laptop. It’s a third place, and I can’t wait to spend more time there.


  1. For context, I have a fiber connection that maxes out at 1 Gbit/s down and 300 Mbit/s up. ↩︎
  2. Unless a developer adds specific support for the iPad mini to always mark this layout as available. My favorite RSS reader, Lire, can be used with three columns in landscape on the iPad mini. ↩︎

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

Learn more here and from our Club FAQs.

Join Now
Postcards and a Mac: Niléane’s Desk Setup https://www.macstories.net/stories/postcards-and-a-mac-nileanes-desk-setup/ Mon, 21 Oct 2024 18:01:20 +0000 https://www.macstories.net/?p=76962

It’s been a while since I last showed off my desk. The last time I did so, as part of MacStories Weekly Issue 405 in February, I had just acquired an 11-inch iPad Pro, and my desk looked quite different than it does now. It had an imposing corner shelf holding a variety of plushies, accessories, and other knickknacks, in addition to providing support for my microphone arm. Overall, it felt a lot more cluttered than it does now.

As the months went on, I had to rethink my desk layout to accommodate the many changes I’ve made to my device usage. Now more than ever, my M2 MacBook Air is at the center of everything I do – so much so that the iPad Pro is now nothing more than an eBook reader for me and rarely lives on my desk as a result. This summer, we also launched Comfort Zone, a new weekly show in the MacStories family of podcasts. Since Comfort Zone is both an audio and video podcast, I started recording video at my desk for the first time ever, which also meant that I had to tweak my desk to optimize it for lighting and a new microphone setup.

In the end, these changes have added up to a completely new desk setup. So today, I’m going to take you on a quick desk tour. Let me walk you through the main highlights of what makes this desk my favorite little corner in our home.

First, I should once again address that this is a small desk tucked into a small space. For some context, my partner and I live in a three-room apartment in Grenoble, France. While this means that we get to enjoy great views of the French Alps from most of our windows, it also means that we’ve had to reach a compromise to keep our workspaces separate during the day. As a result, my partner got to install their sizable desk in the dedicated spare room, and I got to settle mine in our bedroom. This is the main reason why I have to keep my desk space cozy and compact.

My desk corner in our main bedroom.

The main thing you may notice when looking at my desk is the monitor hooked up to my M2 MacBook Air. It’s the 28.2-inch Huawei MateView, which I grabbed on sale a couple of years ago for about $450.1 Huawei is not usually a brand that inspires confidence in me. And yet, I absolutely love this monitor and its unusual 3:2 aspect ratio. It makes for a taller, narrower canvas that is perfect for browsing the web (allowing you to see more of a web page at once) or working on designs in Figma, and it’s still comfortably wide enough for watching videos and playing videogames. For a monitor of this price, I’ve also been pretty impressed with the decent color accuracy and apparent contrast ratio that it provides at a 4K resolution. While I would probably save for a more expensive display today, I’m going to hold on to this one for as long as I can – mostly because I fear going back to a standard 16:9 or 16:10 aspect ratio now.
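
As a quick sanity check on those specs – and assuming the MateView’s advertised 3840×2560 resolution – the pixel density works out like this:

```swift
import Foundation

// Pixel density of a 28.2-inch, 3:2 panel at 3840x2560 (assumed resolution).
let width = 3840.0, height = 2560.0, diagonal = 28.2
let ppi = (width * width + height * height).squareRoot() / diagonal
print(String(format: "%.0f ppi", ppi))  // prints "164 ppi"
```

That’s roughly the same density as a 27-inch 16:9 4K monitor; the 3:2 shape buys extra height, not extra coarseness.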

The M2 MacBook Air sits on a TwelveSouth HiRise Pro stand that’s height-adjustable and made of sturdy aluminum. I used to keep the Mac closed in a vertical stand tucked behind the monitor, but I’ve come to realize how handy it is to be able to occasionally open it and use the built-in display alongside my main screen. I do this when I’m on video calls to keep the call window visible while I’m working, or when I want to watch a video on the side. Otherwise, I like to keep the MacBook lid closed and focus on a single screen.

Between the Mac and the monitor, I’ve installed a Belkin Thunderbolt 3 Dock Pro. I’ve owned this dock for a long time, and it still works perfectly enough today that I’ve not felt the need to upgrade to a newer, more expensive Thunderbolt 4 dock. It offers two Thunderbolt 3 ports, one full-sized DisplayPort, an Ethernet port, five USB-A ports (including one on the front), a standard USB-C port, a 3.5mm audio jack, and one front-facing SD card slot. Almost everything on my desk, including a Belkin height-adjustable MagSafe stand for my iPhone, is now hooked to the Mac by this dock, and I’ve still got ports to spare.

The Belkin Thunderbolt 3 Dock Pro

I’ve settled on this Belkin height-adjustable MagSafe stand, which perfectly fits under the monitor at its lowest height.

In my last desk setup showcase, I was using a Blue Yeti microphone for calls and the occasional podcast recording. This microphone is still in great condition and is now being used by my partner as part of their own desk setup. The reason I let it go is that I was not quite satisfied with how it made me sound. My voice already sounds deeper than I would like, and I found that the Yeti exacerbated that impression. So now, I’m using the Audio-Technica ATR2100X microphone.

This one isn’t perfect by any means, but it sounds a lot closer to what I want, and most importantly, it is very affordable. Another positive aspect of the ATR2100X is that it can connect via both XLR and USB-C. Since I didn’t want to invest in an audio interface just yet, this straightforward connectivity was perfect for me. The microphone is connected directly to the Belkin dock via USB-C, and I always keep a 3.5mm to Lightning cable dangling from its built-in headphone jack so I can use my AirPods Max for monitoring while I record Comfort Zone.

The microphone sits on the Elgato Wave Mic Arm LP.2 This microphone arm caused me quite a few headaches over the past few months. As you may have noticed, my desk has two built-in drawers that I find incredibly useful for storing cables, medications, and other items that I need on a daily basis. However, the drawback of these drawers is that I simply can’t use the built-in clamp that comes with this microphone arm (or any other microphone arm) to attach it to the side of the desk. Elgato has a solution for this: the Heavy Base. You can use the Heavy Base’s offset mount to attach the Wave Mic Arm LP, then simply set the base down on your desk. This is perfect if your desk is too thick to accommodate a clamp. Unfortunately, for the past few months, the Heavy Base was out of stock pretty much everywhere I looked, and when retailers had one in stock, they always had the old version without the offset mount. In the end, it was only this month that I was finally able to order one from a German music store. It works exactly as intended, and the arm is firmly mounted to the base and stable.

My Audio-Technica ATR2100X attached to the Elgato Wave Mic Arm LP, itself mounted to the Elgato Heavy Base

When it comes to recording video for the podcast, all I use is an old iPhone SE from 2020. This phone used to belong to my partner, and today it barely holds a charge. But despite the iPhone SE’s age and its smaller camera sensor, the video quality is still miles ahead of most off-the-shelf webcams, and it has been serving me well for video calls over the past few years – and now for Comfort Zone on YouTube. To mount the iPhone on my desk, I’ve come up with a somewhat hacky solution. With the help of a MagSafe-compatible case, I’ve attached the iPhone to a Belkin MagSafe mount, which itself is mounted on top of the beautiful LEGO Orchid set. Yes, I know; I’m a genius.

Despite the hacky vibes of this camera mounting contraption, I just love how I’ve been able to revive this phone as a webcam thanks to Continuity Camera on macOS. If you have an older iPhone lying around in a closet, I highly recommend trying to use it as a webcam. It’s a pretty good bargain.

Here’s my genius video recording contraption made out of an old iPhone SE, a MagSafe mount, and a LEGO plant.
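
If you’re wondering why this trick requires no special app support: on macOS 14 and later, a Continuity Camera iPhone is discoverable through AVFoundation like any other camera. A minimal sketch (the device names printed are whatever the system reports):

```swift
import AVFoundation

// List video capture devices, including any iPhone acting as a
// Continuity Camera alongside the Mac's built-in camera.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.continuityCamera, .external, .builtInWideAngleCamera],
    mediaType: .video,
    position: .unspecified
)

for device in discovery.devices {
    print(device.localizedName)  // e.g. an old iPhone SE shows up here
}
```

Any app that enumerates cameras this way picks up the repurposed iPhone automatically, which is why it works with video call apps out of the box.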

To light up the desk, I use the BenQ ScreenBar Halo, which John reviewed on MacStories. This light bar is perched on top of my monitor and is able to shine a diffuse glow behind the monitor, as well as down in front of it, where my Magic Keyboard and Trackpad are. The result is a super cozy vibe both at night and during the day. Purchasing the ScreenBar Halo prevented me from going down the rabbit hole of setting up a couple of light strips around the back of the desk, so I’m glad I went this route instead. It has been a wonderful addition to the atmosphere of this workspace corner.

In addition to the ScreenBar Halo on top of my monitor, I’ve also recently added the Elgato Key Light Neo. The Key Light Neo is a more affordable alternative to the company’s larger Key Lights, and it’s a great way to light yourself up on camera without taking up too much space on an already crowded desk. I’ve been using it for the past two Comfort Zone recordings, and it’s done a lot to improve the look of my video feed.

My monitor holds both the BenQ ScreenBar Halo and the Elgato Key Light Neo.

I’m a firm believer that good lighting and color choices are not enough to ensure that a desk looks and feels like a cozy corner where you’d want to spend ten hours at a time. For this reason, even though the corner shelf filled with plushies and knickknacks is gone, I still wanted to keep my field of view filled with fun and cute items. So you’ll probably be able to spot a tiny Baby Yoda figurine, a third-generation iPod nano, and a yellow corgi-shaped fidget toy lying about the surface of the desk.

On the wall, I’ve hung a sample of my collection of favorite postcards. Those include a set of beautiful (and colorful) postcards from our last Berlin trip hugging the left wall, as well as postcards from my home island of La Réunion surrounding Stephen Hackett’s 2024 Apple History Calendar. I would like to say that I’ve arranged them in a specific and thoughtfully considered pattern, but the only real pattern here is vibes.


Now, obviously, this setup will keep evolving. On my list of probable additions in the near future are an audio interface and a second yellow HomePod mini so I can finally listen to music in stereo when I’m not wearing headphones. I’m also not quite satisfied with my keyboard situation. I’m still using Apple’s Magic Keyboard, and the only real reason is that I can’t imagine myself doing without its built-in Touch ID sensor.

Still, I think I’ve reached a good place with this desk corner. I’ve been able to keep all of it contained, and so far, my partner is glad that our bedroom has not been overrun with my tech devices and accessories. Most crucially, this small corner feels like a space where I enjoy spending time, from early in the morning all the way to sleepless nights.


  1. The 28.2-inch Huawei MateView seems to have been discontinued sometime in 2023. ↩︎
  2. “LP” stands for “low profile”. ↩︎

After Five Years of Pro iPhones, I’m Going iPhone 16 Plus This Year https://www.macstories.net/stories/iphone-16-plus-fun/ Wed, 02 Oct 2024 14:47:11 +0000 https://www.macstories.net/?p=76780

My iPhone 16 Plus.

If you had asked me two weeks ago which iPhone model I’d be getting this year, I would have answered without hesitation: my plan was to get an iPhone 16 Pro Max and continue the tradition of the past five years. I’ve been using the largest possible iPhone since the XS Max and have bought the ‘Pro Max’ flavor ever since it was introduced with the iPhone 11 Pro Max in 2019. For the past five years, I’ve upgraded to a Pro Max iPhone model every September.

And the thing is, I did buy an iPhone 16 Pro Max this year, too. But I’ve decided to return it and go with the iPhone 16 Plus instead. Not only do I think that is the most reasonable decision for my needs given this year’s iPhone lineup, but I also believe this “downgrade” is making me appreciate my new iPhone a lot more.

It all comes down to a simple idea: fun.

Realizing That, Indeed, Maybe I’m Not a Pro Anymore

This thought – that perhaps I could be just fine with a regular iPhone instead of a Pro variation – first popped into my head while I was watching Apple’s September keynote. With the addition of last year’s Pro-exclusive Action button and the cross-model introduction of the new Camera Control, I thought maybe I wouldn’t feel “left behind” in terms of major new iOS features. Historically, that’s always been the pull of the Pro line: there’s something exclusive to them – whether it’s the size, display technology, or design language – that pushes me to eschew the base model in favor of the more expensive Pro one, where “Pro” actually means “best”. But if the features I cared about most were trickling down to the non-Pro iPhones too, could my personal definition of “best” also change?

Besides feature availability, I also had a vibe-related realization during the keynote. More than in previous years, some parts of the photography segment were really technical and, for my personal taste, boring. Don’t get me wrong. I appreciate that Apple is unlocking incredible potential for photographers and filmmakers who want to shoot with an iPhone and have unlimited control over their workflow. It is necessary for the company to push the envelope and put that kind of power in the hands of people who need it. But that’s the issue: as I was watching the segment on audio mixes and nearly dozing off, for the first time in years I felt that Apple wasn’t targeting me – and that maybe that phone wasn’t meant for me.

I know, right? It sounds obvious. But if you’ve been writing about Apple or have been part of the “Apple community” for as long as I have, you know that there’s a kind of invisible social contract wherein true nerds are supposed to be getting and producing content about the most expensive iPhones every year. I know and say this because I’ve been guilty of this line of thinking before. There’s almost an expectation that whoever creates content about Apple needs to do so from the top down, purchasing the highest-end version of anything the company offers. But if you think about it for a second, this is a shortsighted approach: the vast majority of people can’t afford the most expensive Apple products and, in reality, most of our stories run the risk of sounding too aspirational (if not alienating) to them rather than practical.

This meta commentary about purchasing Apple products and the parasocial pressure of writing about them is necessary context because, regardless of my initial feelings during the keynote, I still went ahead and ordered an iPhone 16 Pro Max. Despite me not caring about any of the advanced camera stuff in the Pro models, despite the Action button and Camera Control arriving on the base models, and despite Brendon’s story on this very topic that resonated with me, I still thought, “Well, surely I’m supposed to be getting an iPhone 16 Pro Max. I can’t be the type of person who ‘downgrades’ to a regular iPhone 16, right?”

And so preorder I did, ever so convinced I had to stick with a more expensive (and more visually boring) iPhone because of the always-on display, ProMotion, telephoto lens, and increased battery life.


When the 16 Pro Max arrived, I could instantly tell that something felt off about it this year. I’m not saying that it wasn’t a good upgrade from my iPhone 15 Pro Max; the improved ultra-wide camera was great, battery life was magnificent, and the thinner bezels looked nice. What I’m saying is that, more so than in previous years, I felt like it was almost “too much iPhone” for me, and that its changes were only marginally improving upon my experience from the previous generation. Meanwhile, I was giving up the fun looks, creative constraints, and increased portability of an iPhone 16 Plus to keep up my end of an unspoken bargain with my audience – or maybe just myself.

The more I used the iPhone 16 Pro Max, the more I felt that it crossed a threshold of weight and screen size that I was not expecting. I’ve always been a strong proponent of large iPhones, but for the first time, the 16 Pro Max felt too big and heavy. This idea solidified when Apple eventually sent me a review unit of the iPhone 16 Plus: there I was, using an iPhone slightly smaller than the Pro Max (but still big enough), which was also 30 grams lighter, and, more importantly, had a stunning ultramarine color that put a smile on my face whenever I used it.

I used the iPhone 16 Plus for a few days alongside my iPhone 16 Pro Max. During that experiment, I realized that my initial feelings were right and I should have trusted my original instincts. The iPhone 16 Plus had all the things I wanted from a new iPhone (large screen, good battery, Action button, Camera Control, A18 performance) in a more accessible package that traded advanced photography features for increased portability and, yes, pure aesthetics. And just like I accepted a few months ago that I’m not necessarily an AirPods Pro person but actually prefer the base model AirPods, so I decided to return my iPhone 16 Pro Max and get an iPhone 16 Plus instead.

After a week, I haven’t missed the bigger, heavier iPhone 16 Pro Max at all. In fact, using the iPhone 16 Plus and forcing myself to be creative within its photographic constraints has reignited in me a passion for the iPhone lineup that I hadn’t felt in years.

Using (and Loving) the iPhone 16 Plus

Let’s address the elephant in the room: I’m not missing ProMotion and the always-on display as much as I feared I would.

I’ve never been a heavy user of Lock Screen widgets, so not seeing glanceable information on my Lock Screen without waking up the display is not a big deal. I thought I was reliant on the always-on display, but it turns out, I was just leaving it on because I could. If anything, I’d argue that not always seeing my iPhone’s display when I’m at my desk helps me stay more focused on what I’m doing, and it’s making me appreciate using my Apple Watch1 to, well, check the time even more. In a way, the absence of the always-on display is the best Focus mode I’ve ever tested.

Plus, raising my iPhone or tapping the screen to wake up the display is not the end of the world.

The lack of ProMotion took a longer adjustment period – where by “longer” I mean two days – but now it’s fine. I’ve been switching between my iPad Pro with a ProMotion display and the iPhone 16 Plus without one, and I lived to tell the tale. I wish I had a better way to convey this that doesn’t boil down to, “My eyes got used to it and it’s okay”, but here we are. I was firmly in the camp of, “I can never go back to a non-ProMotion display”, but when you use a device that doesn’t have it but makes you happy for other reasons, it’s doable. Plenty of folks who claim that non-ProMotion iPhones are a non-starter also enjoy using the iPad mini; it’s the same argument. If next year’s “Plus” equivalent model (or whatever replaces it) gets ProMotion, then great! I’ll happily take it. Otherwise, it’s fine.

The feature I’m missing most from the iPhone Pro Max is the telephoto lens. I took a lot of pictures of my dogs using that 5x zoom, and I wish my iPhone 16 Plus had it. But something that Brendon suggested in his story came true for me: the limitations of the iPhone 16 Plus are forcing me to be creative in other ways, and it’s a fun exercise. I need to frame subjects differently, or get closer to them, and accept that I can’t optically zoom from far away like I’ve been doing for the past year.

I took plenty of amazing pictures for years using iPhones without a 5x lens, and I still cherish those photos. When I look at some of the pictures I’ve taken over the past week with my iPhone 16 Plus, I can’t complain. So what if I don’t have access to the absolute best camera Apple makes for professional users? A base model iPhone can still capture remarkable shots. I can live without “best”.

Zelda and Ginger say hi.

The shelf above my desk.

Which brings me to the camera-related changes in this year’s iPhone lineup.

On one hand, I find the default behavior of the Camera Control button too fiddly. The “half-press” faux clicks are tricky to get right, impossible to explain to other people, and, worst of all, not as configurable as I hoped. If I have to carefully swipe on a thin capacitive button to access additional Camera controls, I might as well just touch the screen and get it done faster thanks to larger UI elements. I would have preferred the ability to assign half presses to specific features, such as toggling zoom levels, switching to the selfie camera, or choosing between 12 and 48 MP shooting modes.

My Camera settings in iOS 18.1 beta.

For now, thanks to new options available in the iOS 18.1 beta under Accessibility, I’ve outright disabled half presses, and I’m just using Camera Control as a shutter button. I may reconsider when Apple ships the two-stage shutter mode for auto-focus later this year. But with this setup, I love using Camera Control as a “simple” button that opens the Camera from anywhere and takes a picture. It’s become my default way of launching the Camera and has allowed me to get rid of all the other Camera shortcuts I had on the Lock Screen and in Control Center.
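
As an aside for developers: this “simple shutter button” behavior maps to AVKit’s capture event API, which, as I understand it, also covers a full press of Camera Control on the iPhone 16 line. A minimal sketch, where CameraViewController and capturePhoto() are hypothetical placeholders:

```swift
import AVKit
import UIKit

// A camera screen that fires the shutter on hardware capture presses
// (volume buttons and, on iPhone 16, a full press of Camera Control).
// Events are only delivered while the app runs an active capture session.
final class CameraViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let interaction = AVCaptureEventInteraction { [weak self] event in
            // Capture on release, matching the system Camera app's feel.
            if event.phase == .ended {
                self?.capturePhoto()
            }
        }
        view.addInteraction(interaction)
    }

    func capturePhoto() {
        // Hypothetical: kick off an AVCapturePhotoOutput capture here.
    }
}
```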

On the other hand, I immediately became a fan of the new photographic styles on the iPhone 16 and the ability to bring back shadows in my photos thanks to undertones.

For a few years now, I (and many others) have felt like the iPhone’s camera was producing rather uninspired results where everything looked too homogeneous and equalized. I didn’t know how to put this into words until I read and watched Nilay Patel’s review of the iPhone 16 models. Because the camera aggressively balances highlights and shadows so that everything is bright and visible in a picture, nothing truly stands out by default anymore. I understand why Apple does this (we spend so much money on these phones; surely we want to see every detail, right?), but I still disagree with the approach. I’d rather have fewer details in a photo with character that latches onto my visual memory than a “perfect” shot where everything is nicely lit, but ultimately forgettable.

This year, rather than fixing the root of the problem, Apple pretty much said, “You figure it out”. And that’s what I did: I simply tweaked the ‘Standard’ photographic style to have tones in the -0.5/-0.7 range and enabled the option to preserve this setting, and that was it. Now every picture I take pops a little bit more, has more shadows, and feels less like an ultra-processed digital artifact despite the reality that, well, it still is. The fact that I can even alter styles after a picture is taken (the amber and gold styles are my favorites so far) with individual controls for color intensity and tone is just icing on the cake.

And now for the hardest part of this story: expressing in a blog post the fleeting, intangible quality of “fun” and why using a colorful iPhone after years of stone-like slabs makes me smile. Most of this is going to sound silly, and that is the point.

The iPhone 16 Plus’ ultramarine color is amazing to see in person. Everything about it stands out: the Apple logo on the back, which has a texture I prefer to the iPhone 16 Pro Max’s glass; the flat, darker sides; the different, shiny hue of the Camera Control button; and especially the brilliant color of the glass camera bump. If you don’t like the idea of your iPhone being very vibrant and recognizable, I get it. Personally, I love that something so colorful is also a piece of cutting-edge technology; it creates a contrast between this device looking toy-like in nature and it actually being a powerful pocket computer. It reminds me of my beloved pink and blue PSPs. There’s just something about advanced tech meeting color. I don’t know, ask the iMac about it.

I love the fact that when I pull out this phone, people look at it and ask me questions. If you’d rather not have people start a conversation with you about your phone, I also get it! But as we’ve established, I love talking to people about tech, and I appreciate that this phone is catching people’s attention so that I have to explain what it is and what it does.

I told you this part was going to sound silly, so stay with me: I love how this phone looks alongside the tattoos on my right hand and left arm. I mean, I’m a walking definition of vibrant colors. If you think I’m crazy for believing in such ideas, thank you. When I’m working at my desk and look down at the blue phoenix on my left arm, then see the ultramarine iPhone 16 Plus next to me, I chuckle. This thing looks good.

For now, I’m using the iPhone 16 Plus without a case because I like how the device feels in my hands and want to show off its color. I’m a little concerned about the durability of the blue paint around the camera lenses, though, and I’m considering getting something like an Arc Pulse to protect the camera bump on the back.

I wasn’t expecting this, but the 30-gram difference between the iPhone 16 Pro Max and 16 Plus is noticeable in the hand. I’ve been using the 16 Plus a lot to catch up on my queues in GoodLinks and Unwatched, and I haven’t felt the same pressure on my left wrist (which has been acting up lately) as I did when I was testing the 16 Pro Max. I’m a little disappointed by the rumors that the iPhone 16 Plus will be the last of its kind; however, if Apple does indeed release a much slimmer “iPhone Air” in 2025, I guess this exercise of not using a Pro model for a year will pay off.

Lastly – and I can’t believe I’m typing this after my iOS 18 review – I have to give a shoutout to dark and tinted icons on the iPhone 16 Plus. I installed a deep blue/purple wallpaper to match my phone’s color; when combined with dark icons (which I’m using by default) or some variations of the tinted ones, the results aren’t bad at all.

Fun, Unique Tech for Everyday Life

What I’m trying to convey in this story is the following concept:

For the past few years, I never cared about my iPhone as a physical object. I was more concerned with the resolution, cameras, and other specs – what was inside the phone. The devices themselves weren’t eliciting any particular reaction; they were the same slabs year after year, replaced in a constant cycle of slab-ness for the pursuit of the “best” version of whatever Apple made during that year. This is why the last “new” iPhone I truly remember was the groundbreaking iPhone X. If you asked me to tell you what the 12, 13, 14, and 15 Pro Max felt like in everyday usage, I wouldn’t be able to answer precisely. One of them introduced the Dynamic Island and another was made of titanium, I guess? They were great phones, albeit forgettable in the grand scheme of things.

This year, I’ve realized that using devices that have something fun or unique about them compounds my enjoyment of them. This isn’t about those devices being “pro” or colorful; it’s about them adding something fun and different to my life.

I enjoy using the 11” iPad Pro because it’s thin and has the nano-texture display that lets me work outside. The Steam Deck OLED with matte display produces a similar feeling, plus it’s got a unique ergonomic shape that fits my hands well. I like the orange Action button and Digital Crown circle of my Apple Watch Ultra 1 and how they pair with my orange Sport band. The Meta Ray-Bans are good-looking glasses that also happen to have a camera and speakers. The Legion Go is bulky, but the controllers feel great, the display looks amazing, and the console is extremely moddable. Each of these devices has some flaws and isn’t the “best” option in its respective field; however, as products I use in everyday life, they’re greater than the sum of their parts.

The iPhone 16 Plus isn’t the most powerful model Apple makes. But for me, its combination of color, texture, reduced weight, and modern features makes it the most pleasant, fun experience I’ve had with an iPhone in a long time.


  1. Speaking of which, thanks to audio playback and Live Activities in watchOS 11, I don’t miss seeing these features on the iPhone’s always-on Lock Screen that much either. ↩︎

A Single Apple EarPod Has Become My Favorite Wired Earbud for Gaming https://www.macstories.net/stories/a-single-apple-earpod-has-become-my-favorite-wired-earbud-for-gaming/ Tue, 24 Sep 2024 18:21:54 +0000 https://www.macstories.net/?p=76718

Nintendo Switch with [Hori’s Split Pad Compact](https://amzn.to/3zlDpZD) controllers, Steam Deck OLED, and Ayn Odin 2. Also, [you should play UFO 50](https://wavelengths.online/posts/ufo-50-a-review).

Picture this problem:

Because of NPC, my podcast about portable gaming with John and Brendon, I test a lot of gaming handhelds. And when I say a lot, I mean I currently have a Steam Deck, modded Legion Go, PlayStation Portal, Switch, and Ayn Odin 2 in my nightstand’s drawer. I love checking out different form factors (especially since I’m currently trying to find the most ergonomic one while dealing with some pesky RSI issues), but you know what I don’t love? Having to deal with multi-point Bluetooth earbuds that can only connect to a couple of devices at the same time, which often leads to unpairing and re-pairing those earbuds over and over and over.

As you know, a while ago I came to a realization: it turns out that Apple’s old-school, wired EarPods are still pretty awesome if you want a foolproof, universal way of connecting a single pair of earbuds to a large collection of devices. Handheld manufacturers, in fact, weren’t as courageous as Apple and, despite modern advancements in Bluetooth, decided to leave a universal audio jack port in their portable consoles. So whether I’m doing side quests in Dragon’s Dogma 2 on Windows, playing Wind Waker on a portable Wii (not a typo), or streaming Astro Bot from my PlayStation 5, I can grab my trusted wired Apple EarPods and know that they will work with any type of device. There’s something oddly liberating and simple about that, and I’m not alone in feeling this way.

Now picture a second problem:

I mostly play video games at night, and I want to remain present and be able to hear my surroundings. Dog owners will understand: we have two sleeping in the bedroom with us, and I have to be able to hear that they’re sleeping well, snoring, or whatever. Let me tell you: you don’t want to accidentally miss one of your dogs throwing up in the bedroom because you were too “in the zone” with both your gaming earbuds in. I learned my lesson the hard way.

Now, I could have left my Apple EarPods alone and simply chosen not to put the right EarPod in, leaving the wire hanging there, unused. But I haven’t gotten to this point after 15 years of MacStories by not challenging the status quo and “leaving things be”. Instead, I grabbed my scissors and cut the wire for the right EarPod just above the connector where the main cable splits in two.

Behold: the single Apple EarPod I’ve been using as my go-to gaming “headphone” for the past two months.

The EarPod.

The EarPod.

I’ve been using The EarPod with all my gaming handhelds, and it’s, honestly, been perfect. After removing the right channel, audio is automatically routed to the left EarPod as mono; regardless, there are ways on both Linux and Windows to force mono audio in games instead of stereo. The result is a comfortable, good-sounding, inexpensive, easier-to-unfurl wired earbud that works with everything and allows me to keep an ear on my surroundings, but in particular my dog Ginger, who – for whatever reason – doesn’t want to get off the bed when she’s sick. Bless her.

Could I have purchased one of the many results that come up on Amazon for “mono earbud single ear”? Yes. But I genuinely love the shape and sound of Apple’s EarPods; I just wanted to be in a place where I only had to manage one of them.

Plus, this is MacStories. I’ve done far worse than cutting an EarPod wire. If both of those very specific problems I mentioned above also apply to you, well, I guess I can’t recommend modding Apple’s EarPods enough.


First Look: Logitech’s MX Creative Console Is Poised to Compete with Elgato’s Stream Deck Lineup https://www.macstories.net/stories/first-look-logitechs-mx-creative-console-is-poised-to-compete-with-elgatos-stream-deck-lineup/ Tue, 24 Sep 2024 13:28:12 +0000 https://www.macstories.net/?p=76689

Source: Logitech.

Today, Logitech revealed the MX Creative Console, the company’s first product that takes advantage of technology from Loupedeck, a company it acquired in July 2023.

I’ve been a user of Loupedeck products since 2019. When I heard about the acquisition last summer, I was intrigued. Loupedeck positioned itself as a premium accessory maker for creatives. The company’s early products were dedicated keyboard-like accessories for apps like Adobe Lightroom Classic. With the Loupedeck Live and, later, the Live S, Loupedeck’s focus expanded to encompass the needs of streamers and automation more generally.

Suddenly, Loupedeck was competing head-to-head with Elgato and its line of Stream Deck peripherals. I’ve always preferred Loupedeck’s more premium hardware to the Stream Deck, but that came at a higher cost, which I expect made it hard to compete.

The Logitech MX Creative Console slots nicely into my existing setup.

Fast forward to today, and the first Logitech product featuring Loupedeck’s know-how has been announced: the MX Creative Console. It’s a new direction for the hardware, coupled with familiar software. I’ve had Logitech’s new device for a couple of weeks, and I like it a lot.

The MX Creative Console is first and foremost built for Adobe users. That’s clear from the three-month free trial of Creative Cloud that comes with the $199.99 device. Logitech has not only partnered with Adobe for the free trial, but it has also worked with Adobe to create a series of plugins specifically for Adobe’s most popular apps, although plugins for other apps are available, too.

I use Adobe apps, but my interest in the MX Creative Console is its ability to run keyboard shortcuts, trigger various system events, and string together multiple actions as macros. For example, I’m using the MX Creative Console to navigate RSS, add files to Dropover, manage my windows, and take screenshots. Those are things you can do with a Stream Deck, too, but Logitech’s MX Creative Console has a few special things going for it that I love.

Up close with the MX Creative Console’s keypad.

First, there’s the fact that the MX Creative Console comes in two parts. The first is a wireless dialpad with a big knob, a scroll wheel, and four programmable buttons; the dialpad is wireless because it has no screens, allowing it to run on AAA batteries. The second part is a keypad with nine customizable buttons plus two buttons for paging among multiple sets of the nine buttons. The two devices can work together, allowing, for example, a press of something like a brightness button on the keypad to control brightness via the dialpad’s knob.

The keypad’s design is closer to that of a Stream Deck than a Loupedeck, which sacrifices some of the Loupedeck’s premium feel, but I still prefer it to a Stream Deck. The keys have a similar but perhaps slightly shallower throw and aren’t as concave as a Stream Deck’s. That means the icons assigned to each key’s little display aren’t as distorted by the shape of the keys as they are on a Stream Deck. There’s also a subtle lip on the edge of each key and a bump on the center key that makes it easy to orient your hand on the MX Creative Console’s keypad without looking at it.

Source: Logitech.

As for the dialpad, it connects to your computer wirelessly via Bluetooth or Logitech’s proprietary Bolt dongle. Either way, the dialpad can be paired with up to three devices, just like many of the company’s keyboards – something I wish Apple would do with its own input devices. In addition to a knob that’s excellent for adjusting sliders or scrolling horizontally, it includes a scroll wheel for navigating long vertical pages and four programmable buttons. The dialpad is even compatible with the iPad, connecting via Bluetooth and operating the same way a third-party mouse does for scrolling and clicking.

Overall, my first impressions of the MX Creative Console’s hardware have been positive. By separating the device into two parts, it’s far more portable than many similar devices. I wouldn’t hesitate to throw one or the other or both into my bag because of their compact size and minimal weight. When I’m at my desk, the keypad includes a stand that holds the device at a little more than a 45° angle, too.

Programming the MX Creative Console’s keypad with Logi Options+.

I’m less excited about the MX Creative Console’s software. I’ve been using a beta version, so I’ll reserve judgment until its final release in October, but so far, programming the device isn’t great. That’s true of the Stream Deck, too. Like it, Logitech uses a cross-platform app, Logi Options+, that appears to be built with web technologies and just isn’t very good. Loupedeck users will recognize elements of Loupedeck’s software when they dig into Options+ to program the dialpad or keypad. But that familiarity isn’t an advantage because Loupedeck’s software was one of its weakest points as well.

Logitech has done an admirable job of competing on hardware, but at least in its beta form, Options+ feels like it’s trying to steal Stream Deck’s crown for janky setup software. The only silver lining is that anyone who has used a Stream Deck or Loupedeck before won’t be surprised by Options+’s limitations.


Still, the Logitech MX Creative Console is excellent overall. I’d prefer better software support, but again, it’s worth noting that the version of Options+ I’ve been using is a beta, and it does get the job done. Although the hardware isn’t as nice as the Loupedeck Live S, I prefer it to a standard Stream Deck and appreciate that it’s been split into two components, which allows for a variety of desk setups and easier portability. I can’t wait to see where Logitech takes the MX Creative Console next and how Elgato responds.

The Logitech MX Creative Console is available in black and light gray and can be pre-ordered today on Amazon or Logitech’s website. According to Amazon’s listing, pre-orders will be delivered on October 16th.


visionOS 2: The MacStories Review https://www.macstories.net/stories/visionos-2-the-macstories-review/ Wed, 18 Sep 2024 14:30:01 +0000 https://www.macstories.net/?p=76574

In the lead-up to this year’s WWDC, it was hard to predict what an update to visionOS would look like. After all, the initial version had only shipped four months earlier when Apple Vision Pro became available for purchase in the United States. Given how late in the software cycle visionOS 1 shipped, it was reasonable to wonder if there would be a visionOS 2 announced at all, and if so, how much it could realistically add to an operating system that had just debuted the previous quarter.

Of course, Apple’s software cycle waits for no one, so like watchOS before it, visionOS is receiving a 2.0 version rapidly on the heels of its initial release. But the shortened development window doesn’t mean that this update isn’t a significant one. I believe that the 2.0 moniker is well deserved based on the features and enhancements included in this release, especially given the quieter updates across all of Apple’s platforms this year in the wake of Apple Intelligence.

visionOS 2 moves spatial computing forward with an array of welcome quality-of-life improvements, deeper integration with some of Apple’s other platforms, additional tools for developers to create spatial experiences, system app updates in line with Apple’s other platforms, and a new way to experience photos that you have to see to believe. The combination of user experience refinements and new features makes for a solid update that Vision Pro users are definitely going to notice and enjoy.

Some of the changes we’ll dive into feel so obvious that you might wonder why they weren’t included in visionOS to begin with. Having used Vision Pro almost daily since it was released, I fully understand the sentiment. But then I remember that the iPhone didn’t gain the ability to copy and paste text until iPhone OS 3, and I’m reminded that developing new platforms takes time – even for a company as big as Apple.

So while some of them might seem basic, many of the changes included in visionOS 2 improve users’ experiences in significant ways every time they interact with the platform. The end result is a smoother, more intuitive operating system that will delight Vision Pro believers and, if Apple has its way, convince more skeptics to take the plunge into spatial computing.

Let’s jump into what visionOS 2 has to offer.

Quality-of-Life Improvements

The biggest changes to visionOS this year are ones that many users won’t explicitly notice or that users of other Apple platforms take for granted. Individually, they might not even seem like that big of a deal. But taken together, these updates add up to a much improved user experience over that offered by visionOS 1. And it all starts with the way you navigate the operating system.

Apple is known for creating interaction models so intuitive that they seem completely obvious in hindsight. Multitouch on iPhones and iPads is the most obvious example, but part of what makes all of Apple’s products so appealing is that they’re easy to pick up and use from the get-go. Creating the same effect for spatial computing, a paradigm that most people have no experience with, is a big challenge.

The company’s first attempt was admirable, but flawed. For this update, system navigation has been rethought to rely less on hardware and interactions borrowed from other platforms. Instead, visionOS 2 leans into the Vision Pro’s strengths, hand- and eye-tracking, to introduce whole new ways of interacting with the system.

First up: the Home View. Pressing the Digital Crown isn’t a bad way to open the Home View (and it’s still supported in this update), but it can get exhausting and annoying to perform the motion repeatedly when opening multiple apps at a time.1 What if there were a way to access the Home View without having to lift your arm at all? That’s exactly the thought behind an all-new gesture in visionOS 2.

No matter where your hand is within the view of Vision Pro, if you hold it open and look at your palm, a Home button will appear just above your hand. Press your fingers together to activate the button, and you’ll find yourself in the Home View, no arm-raising required.

The new Home View gesture.

There’s a lot to like about this new gesture. It may not be the most important detail, but I think it’s great that Apple carried over the Home button design from the original iPhone to visionOS – a nice little nod to the past in the company’s most futuristic device.

Mostly, though, I appreciate this gesture because it’s something completely new that takes advantage of what Vision Pro is best at. When you have a young platform with groundbreaking input methods, you should use them to their fullest extent, including in frequently performed actions like opening the Home View. The home gesture breaks from the past and embraces the fact that visionOS is something new.

It also, I think, qualifies as one of Apple’s trademark “so intuitive it’s obvious” interactions. The whole premise of visionOS is based on eye and hand controls. It’s natural to use your hand, which already performs tapping and scrolling gestures, to navigate the system, too. The act of turning my hand over, looking at it, and pinching my fingers together has become second nature to me – to the point that I do it without thinking or losing my flow. I can open the app I need and get back to what I was doing with minimal interruption, which is exactly what good navigation enables.

At the same time, old habits die hard. I still find myself sometimes pressing the Digital Crown to get to the Home View, too. There might come a day when Apple can remap that button to do something else, but it was smart not to change it this time around. Giving users both options is a good call; I think the new gesture will be the one to stick long-term, though.

The system’s gesture for accessing Control Center has been overhauled as well. In this case, the old method is nowhere to be seen, completely replaced by the new one. I say good riddance.

It’s understandable how visionOS developers landed on the idea of placing an upside-down caret above the user’s view to activate Control Center. After all, Control Center spawns from the top of the interface in most of Apple’s other operating systems. But in this case, the traditional model didn’t translate well. Accessing Control Center required users to roll their eyes upwards to activate a small caret icon and then tap on it. Somehow, the gesture was both too difficult to activate purposefully and too easy to activate accidentally.2 It needed to go.

The new Control Center gesture is an extension of the new home gesture. To activate it, you start by looking at your palm, but once the Home button appears, rather than pinching it, you turn your hand over so that its back is facing you. This will activate a delightful new piece of UI called the status bar. It looks like a widget and serves several purposes, the first of which is opening Control Center. Tap on it, and Control Center will appear, giving you access to system controls, notifications, Mac Virtual Display, and more.

Activating Control Center with the new gesture.

I find myself using Control Center a lot in visionOS. The typical interface for spatial computing consists of only your environment (whether physical or digital) and the windows and objects you’ve opened. There is no concept of a persistent status bar or menu bar like on other platforms, and visionOS doesn’t support widgets. Key information like the time, date, current focus, and your device’s battery life are all tucked away in Control Center alongside media playback controls, Wi-Fi settings, and even Spotlight. So it’s important that users have easy access to it, which is exactly what this new gesture ensures.

Like the Home View gesture, this new way of accessing Control Center fits naturally with the way visionOS works. It relies on the user’s hands and eyes without requiring any exaggerated eye or head movements, which could become fatiguing over time. I’m a big fan of the way this gesture has been implemented. I use it all the time and have never run into the reliability issues I had with the previous caret approach.

Sometimes, though, I don’t even have to complete the gesture to access the information or control that I’m looking for, because the status bar appears just before you tap to open Control Center. It includes the information users most often turn to Control Center for: the current time, device battery percentage, Wi-Fi status, Focus mode, and system volume. It also brings up a system volume slider when the user pinches and holds it, giving near-instant access to volume controls.

Isn’t this status bar just the cutest?

This is one of my favorite new features in the operating system this year. It’s an adorable UI, it offers a massive amount of useful information while staying out of the way when it isn’t needed, and it just feels futuristic to activate and move around with my hand. It might seem like a small addition, but it’s one that users will benefit from every time they use Vision Pro, and that has a big impact over time. It’s also yet another sign of the platform maturing and embracing spatial computing as a paradigm rather than trying to shoehorn old methods into a new system.

Having volume control easily accessible also takes some burden off of the Digital Crown, which serves two functions when turned: adjusting the level of environmental immersion and turning the volume up and down. When making these adjustments, the system presents the user with two virtual dials: one for immersion and one for volume. By default, the Digital Crown controls immersion, but if the user directs their gaze to the volume dial, it will switch to controlling volume instead. I never liked this dual role; I felt like I could never point my eyes to the right place for controlling volume on the first try.

By default, a turn of the Digital Crown controls either immersion or volume (top), but the volume dial can be removed (bottom).

visionOS 2 offers a couple of ways to solve this. A new Digital Crown section in the Settings app includes the option to change the default focus from immersion to volume, which is great for users who want to change volume with the Digital Crown. My preferred approach, though, is to dedicate the Digital Crown wholly to controlling immersion while using the volume control built into the status bar. I accomplished this by toggling off ‘Use Digital Crown for Volume’ in the Sound settings. Now, adjusting immersion with the Digital Crown is more straightforward, and I still have quick access to volume controls in the software. It’s a win-win.

Home View Customization

It’s hard to believe, but Vision Pro shipped without the ability to rearrange app icons in the Home View. A selection of system apps were shown on the first page alongside the Compatible Apps folder, which contains all of the iPad apps installed on the system, and the rest of the apps in the Home View were simply arranged alphabetically. If a user had a favorite third-party app, iPad app, or non-prioritized system app that they wanted quick access to in the Home View, they were out of luck. There wasn’t any way to move them around.

That’s all changed in visionOS 2. Now, users can arrange icons however they like, and iPad apps can even escape the Compatible Apps folder. This makes the Vision Pro Home View much more customizable and adaptable to users’ needs, but it does come with some limitations and quirks that are worth taking into account.

Rearranging app icons feels very familiar if you’ve ever used an iPhone or an iPad. Simply pinch and hold an icon, and the entire Home View UI will enter “jiggle mode,” with each icon wobbling slightly to let you know that it can be moved. You can then grab an icon and drop it wherever you want within the current page or hold it against the left or right edge of the view to scroll to another page and drop it there. If you add a new icon to a page that was previously full, the icon in the bottom right corner will move to the next page, causing subsequent icons to reflow. You can also delete an app in jiggle mode by selecting the ‘-‘ button attached to it.

Rearranging app icons in the Home View.

App icons cannot be freely placed in the Home View. Rather, icons begin at the top left of the view and fill in left-to-right until the row is full, and then a new row begins. Again, this is very much the way iPhone Home Screens worked until this year, except rather than a rectangular grid, visionOS app icons are arranged in a honeycomb pattern with a row of four icons followed by a row of five, then another row of four. Like iPhone Home Screens, pages in the visionOS Home View don’t have to be completely filled before the user is able to start a new page.

Rearranging icons in the Compatible Apps folder works a bit differently. When you pinch and hold an icon within the folder, a menu appears with three options:

  • Add to Home View
  • Edit Folder
  • Delete App

The ‘Delete App’ option will remove the app from the device, naturally. ‘Edit Folder’ puts the Compatible Apps folder in jiggle mode, allowing you to rearrange icons within the folder. But to move an app out of the folder, you can’t simply drag it out while in jiggle mode. You have to pinch and hold it, then choose the ‘Add to Home View’ option. This will place it in the next available slot in the main Home View.

Adding an iPad app to the Home View.

iPad apps maintain their rounded rectangular shapes when added to the Home View, but the system adds a semi-transparent circle around the icons to make them blend in better with visionOS apps’ circular icons. While I find this sort of bubble effect cute, I do think it still looks out of place next to a full-size visionOS app icon, and I wonder if sticking to the plain rounded rectangle shape would have looked better.

Moving compatible apps out of the Home View and back into the Compatible Apps folder is even stranger than getting the icons out in the first place. To do so, you have to enter jiggle mode and then choose the ‘-‘ button attached to the app icon. That’s the button used to delete visionOS apps, but for iPad apps, it brings up a menu with two options: ‘Remove from Home View’ and ‘Delete.’ The ‘Remove from Home View’ option will move the icon back to the Compatible Apps folder, but it’s dangerously close to that ‘Delete’ button, so proceed with caution.

Speaking of folders, the Compatible Apps folder is the only one allowed in the visionOS Home View. Users can’t create custom folders of their own. If you try to hold an app icon over another one the way you would on an iPad to create a folder, the lower icon will simply move out of the way to make room for the one you’re dragging. visionOS apps can be arranged into pages but not folders, and only iPad apps can be tucked away in the single folder that exists. This is an odd omission and one I hope to see rectified soon.

With 13 apps per page and no way to organize apps into folders, the Home View can get out of hand pretty quickly. Pages start to pile up. On top of that, if you rearrange any icons, the system stops sorting apps alphabetically, so all future downloads are simply tacked on to the end of the last page. My Vision Pro currently has six pages of apps, and I’ve surrendered all but the first two pages to chaos. Anything beyond that just gets opened in Spotlight.

This system reminds me a lot of the iPhone Home Screen before the introduction of the App Library. Having access to my most-used apps at the beginning of the Home View is a game-changer, and it does remove a lot of friction when getting started on what I want to do. At the same time, I’m never going to take the time to organize most of the apps on my device. They’d be better off in a folder or just hidden away until I explicitly decide I want them in the Home View.

The changes to the Home View in visionOS 2 are a leap forward, but there’s another step that has to be taken. Users need folders – or better yet, a full-on App Library – to organize the majority of their apps so that the most important ones can take center stage within the Home View. Until that happens, I’d recommend you only bother with the first page or two and try not to think about the jumbled mess that lies beyond your tidy, custom Home View.

Beyond apps, the Home View also houses two other sections, People and Environments, which aren’t customizable. I personally find the People view unhelpful due to the fact that I can’t manually add contacts to it. The ability to favorite contacts that already appear in the view is nice, but its selection of people seems to be heavily influenced by my call history, which isn’t the best source of information on who I communicate with. If the People view were a customizable launcher for calling or messaging my favorite contacts, that might be something I’d take a second look at; as it stands, I just wish visionOS included a full-fledged Contacts app.

Fancy a trip to Bora Bora?

The Environments view displays an expanding mini-portal into the destination option your eyes are currently focused on – a nice touch. And, of course, it includes the new visionOS 2-exclusive Bora Bora environment, a sunny beach getaway that I find so relaxing. I would like the option to move Bora Bora up to the top of my Environments list, but alas, it can’t be reordered. This is only a matter of time, though. As Apple adds more environments (and hopefully opens up the option for third-party environments to be used systemwide), this list will have to be rethought with customization or, at the very least, some method of categorization.

Keyboard Awareness and Mouse Support

I’ve spilled quite a bit of ink3 extolling the power and fluidity of eye- and hand-tracking in visionOS, but let’s be honest: you need to connect peripherals if you’re going to do any productivity-related tasks on Vision Pro. Decades into its existence, the hardware keyboard still reigns supreme as the best input method for text longer than a sentence or two. A pointing device is less necessary given the accuracy of the device’s eye-tracking, but it still comes in handy sometimes. visionOS 2 brings new features for both input types.

On the keyboard side, this update introduces Keyboard Awareness. When you’re in an immersive environment, the system can now identify a Magic Keyboard or MacBook keyboard sitting in front of you and display it within your view. This is very similar to the People Awareness feature Apple touted when Vision Pro was first announced. The environment is still fully immersive, but it fades out around the keyboard to allow video from the outside world to pass through.

I can see my keyboard through all the white sand.

I’ve found this feature to be immensely useful when writing with Vision Pro. The built-in environments help me block out distractions around me and focus when I need to, but previously, I could only use them at about 50% immersion because I still needed to see the keyboard in front of me from time to time. Now, I can be fully immersed in whatever location I find most conducive to my work, and I can still look down at my keyboard if I lose my hand placement or need to find the location of a key.

It’s also one of the features that has come the longest way throughout the visionOS beta cycle. In the first version of the beta, the system sometimes struggled to identify my Magic Keyboard, especially under low light. Now, Keyboard Awareness works reliably, and in some conditions, it can even recognize my iPad’s Magic Keyboard, which isn’t an officially supported device. I hope this is a sign that the feature will be expanded to support third-party Bluetooth keyboards in the near future.

I’m not going to lie: it feels pretty cool to make the environment disappear like this.

As great as Keyboard Awareness is, it can also be a bit distracting to have the keyboard always present in my view even when I’m not typing. Thankfully, the new ‘Bring Hands Near to Reveal’ option found in the Awareness & Safety section of Settings can help with that. This setting disables keyboard passthrough until you reach your hands towards the keyboard to type. It’s quite a cool effect, and it gives you the best of both worlds: full immersion when you aren’t typing and Keyboard Awareness when you need it.

For those who want pointer control but don’t like using a trackpad, visionOS 2 introduces Bluetooth mouse support. Using a mouse in visionOS works very similarly to using a trackpad, and the same basic set of interactions are supported: clicking, dragging, scrolling, and secondary clicking. I’m surprised by the lack of special gesture support for the Magic Mouse (and even the Magic Trackpad), but I suppose it’s good that Apple isn’t giving its own peripherals special treatment in this area – at least not yet.

In my use cases, pointer control works well for text editing and working within individual apps, but I don’t find it all that intuitive for actually navigating the system. Hand gestures make more sense to me for opening apps, arranging windows, and adjusting system controls, but I’m glad that it’s possible to accomplish all of those things with a pointer for those who prefer it. Pointer control in visionOS is basic, but it gets the job done, and now it can even be done with a mouse.

Other Quality-of-Life Improvements

Hand-Tracking: The refresh rate for hand-tracking in visionOS 2 has been improved significantly. Previously, the user’s hand position was updated at a rate of 30 Hz, but it now matches the system’s current refresh rate, which is usually around 90 Hz. This three-fold increase results in smoother hand animations in video passthrough4 and tracking data that is much closer to real-time for developers. Developers can also optionally take advantage of new hand-tracking prediction in visionOS 2 to anticipate upcoming hand movements.
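
For developers curious about what consuming that stream looks like in practice, here’s a minimal sketch of the visionOS hand-tracking flow (authorization handling is omitted, and the newer prediction call isn’t shown):

```swift
import ARKit

// A minimal sketch: streaming hand anchors on visionOS.
// Requires hand-tracking authorization and an open immersive space.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func streamHandAnchors() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        // In visionOS 2, these updates arrive at the display refresh rate
        // (typically around 90 Hz) instead of the old 30 Hz cadence.
        print(anchor.chirality, anchor.originFromAnchorTransform)
    }
}
```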

Reopen Apps After Restart: This new option (available in the General section of the Settings app) is the visionOS equivalent of the Mac’s ‘Reopen windows when logging back in’ feature. When the Vision Pro reboots, it will automatically respawn any windows that were left open before it powered down. Given the Vision Pro’s battery life constraints, system reboots are somewhat common, whether they occur while the device is in use or while it’s being stored. Either way, this feature allows users to hit the ground running after a reboot rather than being forced to recreate their setups from scratch.

Guest User: The operating system can now store eye and hand data for a single guest user for up to 30 days, allowing them to use the device multiple times over that period without the need to go through the setup process each time. This is a welcome change for Vision Pro owners who share their devices with others on occasion, but it still isn’t full multi-user support. It’s limited to only one guest user at a time, and data is removed after 30 days. I see this as more of an extended demo option than an actual solution for device sharing, a practice that Apple seems fundamentally opposed to.

Low-Light Performance: Vision Pro works best in well-lit environments. However, there are times when a user might need to use it in a dark space. Trying to do so with visionOS 1 resulted in a slew of warnings and repeated window respawns that could become frustrating quickly and dissuade users from trying to wear Vision Pro in non-ideal conditions. visionOS 2 makes this experience much more tenable. It still displays a warning when the device detects that it’s being used in low light and disables features like hand passthrough and window anchoring, but the system holds up better in low light, allowing me to actually use my computer in the dark. Imagine that!

Compatible App Appearance: Speaking of using Vision Pro in low light, one sure-fire way to overwhelm your eyeballs in visionOS 1 was to open an iPad app in a dark environment. Compatible apps always defaulted to light mode, which in many cases did not mix well with visionOS’s semi-transparent native aesthetic and contrasted somewhat painfully with a dark room or a virtual environment in dark mode. Thankfully, this update adds a new option in the device’s Appearance settings to switch the appearance of compatible apps to dark mode if you prefer. I made this change immediately after installing the first visionOS 2 beta and haven’t thought about going back since. iPad apps in dark mode just fit better with visionOS.

My persona is on the moon! But also in White Sands?

Persona Improvements: Apple continues to improve the accuracy of skin tones and clothing colors in Personas, users’ 3D avatars for video calls. The biggest difference I’ve noticed in Personas this year is better recreation of hand movements, likely due to improved hand-tracking system-wide. Users can also now set backgrounds for their Personas while in video calls. There are system-provided options for color backgrounds and backgrounds based on visionOS environments, or users can choose custom backgrounds from their photo libraries.

Train Support: Travel mode now supports trains in addition to planes, a feature I have been unable to test due to the disappointing lack of public transit in my area. Regardless, the ability to use Vision Pro on a moving train is a nice addition I hope I’ll be able to take advantage of someday.


The fact that such a large portion of this review is dedicated to covering these everyday quality-of-life improvements speaks to two things: the nascent nature of spatial computing and the reality that even the foundations of visionOS are still not totally settled. Vision Pro is an entry point into a whole new world for Apple, and early adopters are tagging along. I wouldn’t be surprised to see even some of the details covered above change in future versions of visionOS as the company continues to understand best practices for building a spatial computing platform for the masses. For now, these are big enhancements over what we saw in visionOS 1, and I’m thrilled by the difference they make in day-to-day use.

AirPlay Receiver

It’s been clear from the moment Vision Pro was announced that Apple envisions it as being one of several devices from the company in a user’s life. No feature demonstrates that more than Mac Virtual Display. The ability to mirror and control a nearby Mac from visionOS helps fill gaps in third-party software support for the device and offers Mac lovers one more reason to give Vision Pro a try.

Mac Virtual Display is poised to receive a major upgrade later this year in the form of a panoramic view that’s equivalent to two 4K displays side-by-side, but that feature hasn’t been made available for testing yet. Instead, visionOS 2 adds integration with two other devices, iPhone and iPad, in the form of AirPlay Receiver.

If you’ve ever used AirPlay to mirror your device’s screen to a TV before, the process will be very familiar. The first step is to enable AirPlay Receiver in the Vision Pro’s Settings app under General → AirPlay & Continuity. Then, you can activate the Screen Mirroring feature in Control Center on your iPhone or iPad and choose Vision Pro as the output. A frameless replication of the screen will then appear at the center of your view in a window that can be moved and resized like any regular visionOS app window.

Activating AirPlay Receiver from my iPad Pro.

AirPlay Receiver seems to only work for full-on screen mirroring, not for streaming specific content from your device to Vision Pro. The only app I’ve found that does support sending content via AirPlay to Vision Pro is the Photos app, which will allow you to stream a photo memory to Vision Pro, though it oddly lists the device as an Apple TV. I’m honestly not sure whether AirPlay Receiver is meant to be limited to screen mirroring or not, but in its current version, it functionally is.

However, the AirPlay Receiver window in visionOS can function as an external display rather than a 1:1 screen mirror under certain conditions. For example, when you open a picture or video in the Photos app on iPhone, the mirrored screen on Vision Pro morphs into a 16:9 window that features only the selected content with no UI chrome around it. And when you start a Keynote presentation on your iPad, the window will adapt to show your presentation in full view while your iPad shows a presenter view with notes and presentation controls. The AirPlay implementation in visionOS 2 isn’t quite feature complete, but as long as you start with screen mirroring, you can eventually use AirPlay’s other features as well.

AirPlay Receiver morphing into a full video preview.

There are a few major differences between AirPlay Receiver and Mac Virtual Display. First of all, AirPlay mirroring can only be initiated on the source device, not Vision Pro. This means that, in order to set up AirPlay Receiver, you have to be wearing Vision Pro and then operate your iPhone or iPad via Vision Pro’s passthrough video, a task that can be tricky depending on how clearly you’re able to see your real-world device in passthrough mode. This certainly isn’t as straightforward as starting the process from visionOS, which is an option for Mac Virtual Display.

Secondly, audio from the source device does pass through to Vision Pro when using AirPlay Receiver. Anyone who’s used Mac Virtual Display and been shocked to hear audio coming from their MacBook speakers several feet away will attest that this is a useful feature. It only makes sense that if your device’s screen is being mirrored on Vision Pro, its audio should be mirrored there as well.

Finally, the iPhone or iPad that is mirroring its screen cannot be controlled by peripherals paired to Vision Pro. Whereas Mac Virtual Display allows you to share Vision Pro’s keyboard and pointing device with the Mac being mirrored Universal Control-style, AirPlay Receiver is a one-way street. It can only receive information, not send any back to the source device. This means that you’ll have to control your iPhone or iPad either via touch input or a separate set of peripherals when using AirPlay Receiver.

While this feature isn’t as robust as Mac Virtual Display, I still think it’s a great addition to visionOS. It helps solve the same problem as Mac Virtual Display, namely that a lot of apps simply aren’t available on the platform yet. If you have a favorite productivity tool, reading app, or streaming service that you’d like to use in a spatial computing environment but isn’t available on the visionOS App Store, you previously had to rely on the web as a fallback or use an app like Bezel. Now, you can simply run your preferred app on another device and beam it to Vision Pro natively.

And the quality of the mirroring is remarkable. It’s clear and high-quality, with crisp text and almost no discernible latency. Animations aren’t quite as smooth as they are on the device’s actual display, but they are being relayed over a wireless signal, after all. I’m quite impressed with how smooth the experience of using AirPlay Receiver is.

This almost feels like too much power to have all at once.

It’s worth noting that AirPlay Receiver can only be used with a single source device at a time. If you’re AirPlaying from your phone to Vision Pro, you can’t also bring your iPad into the mix; you’ll have to turn off mirroring on one device before you enable it on another. However, it is possible to use AirPlay Receiver and Mac Virtual Display simultaneously. You can lock your iPhone or iPad while mirroring it to Vision Pro, and mirroring will temporarily pause without terminating the connection. When you unlock the device, mirroring will resume automatically.

I’ve personally found this feature most useful when mirroring my iPad Pro in the Magic Keyboard case. Trying to navigate my iPhone or iPad mini via touch while looking up at a mirrored display on Vision Pro just isn’t natural enough for day-to-day use in my book, though I appreciate that I have the option if I ever need it. But with the Magic Keyboard’s trackpad and hardware keyboard in play, AirPlaying my iPad Pro to Vision Pro has proved quite useful.

In fact, this very review was written on my iPad Pro while it was being mirrored to Vision Pro via AirPlay Receiver. My favorite tool for long-form writing is iA Writer, an app that isn’t available on Vision Pro even in iPad compatibility mode. In order to write in my app of choice, I’ve relied on AirPlay Receiver throughout the writing process, and it’s been excellent. It offers me the best of both worlds: I can use the non-visionOS app I need (on a virtual display much larger than my iPad’s, no less) in combination with environments and a bunch of visionOS app windows displaying my notes, resources, and more.

My view when writing this review.

That said, I do wish this integration between Vision Pro, iPhone, and iPad could be taken a bit further and borrow a couple of features from the Mac integration. Being able to start Mac Virtual Display from Vision Pro and control the connected Mac with the peripherals I’m already using makes for a much simpler setup process than AirPlay Receiver. Don’t get me wrong; I’m a big fan of the feature. I’m just such a big fan that I’d love to see it evolve into iPad Virtual Display and iPhone Virtual Display down the road.

Will that evolution ever come? It’s impossible to tell. But I would have never predicted that AirPlay Receiver would be as high-quality and handy as it is, so there’s certainly hope. Either way, tighter integration between Vision Pro, iPhone, and iPad is a win for Vision Pro owners who use other Apple products, and I’m happy to see it.

Photos

While the Photos app for iPhone and iPad was completely rethought this year, Photos for visionOS incorporates the latest features in line with Apple’s other platforms while remaining mostly the same in its design. The Library tab continues to display all of your photos and videos5, while the new Collections tab now houses albums, memories, media types, utilities, and new features like Recent Days and Featured Spatial Photos. The continued use of a tabbed interface rather than a unified one brings Photos for visionOS more in line with the Mac than the iPhone or iPad.

The new Collections tab in the Photos app.

The Collections tab is a fascinating addition, especially in its customizability. Pinching the customize button in the top right corner of the app opens up a screen that allows you to enable, disable, and rearrange sections freely. You can literally make the Collections tab whatever you want it to be, which is an amazing feature for an app as personal and as central to people’s digital lives as Photos.

Customizing the Collections tab.

It’s a lot of fun moving around elements within the Collections tab to get them just the way you want. But what’s even more fun is diving into its various sections to rediscover pictures and relive memories. Federico has more details in his review of iOS 18, but the new Recent Days feature is a great way to look back at photos I’ve captured recently, and it does a good job of highlighting the ones I want to see most.

While the Collections tab is excellent, it’s also an experience you can get on other Apple devices. What you can’t experience anywhere else, though, is spatial content, and visionOS 2 introduces the ability to turn your regular photos into three-dimensional spatial photos with the press of a button.

Converting a spatial photo. Just look at that animation.

When I saw this demonstrated in Apple’s WWDC keynote in June, I immediately thought, “There’s no way this feature works that well on regular people’s pictures.” The company’s photography demos have always been idealized in order to show off features in the best possible light. I couldn’t imagine that spatializing photos would be something I’d actually want to do as more than just a gimmick.

Wow, was I wrong.

From the first moment I tried this feature for myself, I could tell that Apple was onto something very special. It literally is as easy as the demo makes it seem: open a photo, select the cube button in the upper left corner, and wait a moment for the system to process the image. After a short, eye-catching animation, you’re presented with a 3D version of your photo that, in a lot of cases, is downright breathtaking.

It’s difficult to describe and impossible to replicate on a two-dimensional screen, but something about seeing my personal photos – some of which I’ve looked at dozens or hundreds of times before – translated into 3D makes the experience of viewing them more impactful. Spatialized photos feel less like pictures and more like memories. I know that sounds ridiculous, but it’s true.

What sets spatial photos apart from regular ones is depth. With its dual displays, Vision Pro is capable of displaying 3D content in a way that makes your mind truly believe that elements are reaching out towards you. Whereas depth in 2D images is implied through composition and focus, with spatial photos, depth can literally be experienced.

It doesn’t translate to two-dimensional video, but shifting your perspective when looking at a spatial photo gives an impression of true depth.

The depth of a spatial photo makes it feel more real, like you’re looking through a window into the actual world rather than at a screen. Spatial photos are also dynamic, shifting the point of view in response to head movements and window placement. Shift your head to the left, and the entire scene changes slightly to give the impression that you’re looking at it from an angle. Move the photo closer to you, and the subject appears as if standing behind a window you’re looking through. Move it further away, and you’ll reach a point where the subject breaks through the threshold of the window and is now popping out at you. And for a full-on immersive effect, the panorama option blows up the image to a larger-than-life size that takes up your full view and fades seamlessly at the edges into your surroundings, giving it a dream-like quality.

A spatial photo in panorama view.

Taken together, these elements of spatial photos leave you feeling like you’re no longer looking at a photo on a headset but actually reliving the moment the picture was taken.

I’ve spent hours converting all kinds of pictures into spatial photos: wedding pictures, newborn photos of my son, vacation shots, memories of loved ones who are no longer with us. Viewing these spatial photos is a significantly different experience than looking at them on my phone or even my TV. It’s more personal, more immersive, and more emotional – all things that I’m looking for when I dive into my photo library for a trip down memory lane.

I can’t say for certain that this feature will sell Vision Pros, but I would not be surprised to hear of people who buy the device for this purpose alone. It really is that good. If you love looking at pictures in your photo library and reminiscing, I highly recommend booking a Vision Pro demo and converting some of your own photos to see what it’s like. It’s not easy to put into words, but it’s an experience that’s won me over, and I think it will many others, too.

Feelings aside, spatial photos are also impressive on a technical level. The conversion is performed by a machine learning model that does not depend on depth data being built into the photo. In fact, I’ve thrown pictures at it that I knew for a fact included no depth information at all, and they converted to spatial photos just as well as portrait images shot on my iPhone. The model analyzes the contents of the photo and figures out how different elements relate to one another spatially, and then it applies that depth data to the photo to create a 3D image.

I wish I had a way to share the experience of seeing this photo in all its spatial glory with you. Want to come over and try out Guest Mode?

The conversion is able to capture real-world depth incredibly well. One particular example of this is a picture from my son’s newborn photoshoot. The picture was taken from inside his crib, with him lying on the mattress and my wife and I standing over the crib looking in. The machine learning model was able to perfectly ascertain the angle of the railing and present its depth in a way that makes it feel like you’re inside the crib when you look at the photo. I’m amazed at how well it’s able to nail this sort of thing.

Of course, there are instances where the conversion trips up. As with Portrait mode on the iPhone, hair can be a tricky element for the model to separate from the background. In certain instances where elements overlap, the model can misinterpret the distance between them, creating an effect that’s entertaining but not realistic. Reflective and transparent surfaces don’t translate well. And there are certain one-off photos that, for one reason or another, just don’t mesh with the spatial conversion process. It isn’t perfect, but it has way more hits than misses in my experience.

Spatial photo conversion offers a truly wonderful way to re-experience pictures from the past, and I’m a huge fan of it. At the same time, I’d love the ability to take spatial photos myself so that I can capture real-world depth and then view it later rather than relying on an algorithm. That’s why I’m so excited that in a matter of days, I’ll have a spatial photo camera with me everywhere I go in the form of the iPhone 16 Pro Max. In fact, every device in the iPhone 16 lineup is able to capture spatial photos, and the latest iOS 18.1 beta even brings this capability to the iPhone 15 Pro and Pro Max. This is great because it allows current Vision Pro owners to experience even more of their memories as spatial photos and gives every iPhone user the chance to start building their spatial photo libraries now. Spatial photos: they’re here to stay.

The biggest downside of spatial photos, though, is the fact that they can’t be easily shared with other people – at least, not in their full three-dimensional glory. If I want to show someone else a picture that I love, I have to switch my Vision Pro into guest mode and walk them through the setup process first. That’s not something I’m going to do often, and it speaks to the solitary nature of Vision Pro as it stands today. This is a problem that won’t be solved unless spatial computing becomes more widespread, and it’s a shame.

If you want to share a spatial photo with another Vision Pro owner, you can always send it to them via iMessage, but visionOS 2 introduces a new way of looking at pictures together: SharePlay in the Photos app. When you’re on a FaceTime call with another Vision Pro user, a Share button will appear above the Photos app window. Pinch it, and the other user will be shown photos and videos that you select so that you can view them together in real time. This is a great way to look at vacation photos with a friend, reminisce with long-distance family members, or share your work with a client.

Of course, I do have a few hopes for improvements to spatial photos in the future. I’d like the option to convert my entire library to spatial photos automatically, even if that means the process has to run in the background while my Vision Pro is charging.6 I also wish I didn’t have to choose between viewing my pictures as Live Photos and spatial photos; why not both? And finally, I’d like to eventually be able to convert my videos to spatial versions, too. I know that’s a big ask, but after seeing what the Photos team has done with spatial photo conversion, I’m confident that they can do it. That is, after they take their celebratory lap for knocking this feature all the way out of the park, of course.

Apps

Several system apps across Apple’s platforms are receiving new features and enhancements this year, and visionOS 2 benefits from the majority of them.7 Just to name a few:

  • the all-new Passwords app spun out of Settings
  • the Maps app’s new Library for saving locations, guides, and routes
  • Math Notes, section folding, and text highlights in Notes
  • integration of scheduled tasks from Reminders into Calendar
  • tab bars borrowed from iPadOS for navigating system apps like Podcasts and News
  • the new action library in Shortcuts
  • compact view mode and persistent file downloads in Files

I’ll let Federico and John fill you in on all the details of system app updates that apply across platforms, but I do want to highlight a few visionOS-specific updates worth mentioning.

Safari

Watching Comfort Zone the way it deserves.

Profile users, rejoice! Safari for visionOS now supports profiles, allowing you to separate your personal login credentials, browsing history, tab groups, bookmarks, extension settings, and more from your browsing for work, school, and other uses. There’s no sign of the Highlights or Distraction Control features added to other platforms this year, but the app does play a bit of catchup with pinned tabs and the ability to have Siri read a webpage to you. Safari can now display panoramic photos in immersive mode, too.

The biggest update to Safari in visionOS 2, though, pertains to video. Now, videos played in Safari can integrate into environments the same way videos from other apps can. This is important because major players like Netflix and YouTube have yet to develop apps for Vision Pro, so users have to watch those services in Safari instead. This change puts those viewing experiences on par with streaming apps, allowing them to be blown up to a gargantuan size and automatically placed within an environment for an ideal cinematic experience. This environment integration is also available for developers to add to video players in third-party apps.

TV

Speaking of cinematic experiences, Vision Pro is already the ideal way to watch films with its support for 4K, high dynamic range, 3D, spatial audio, and variable frame rates.8 It’s also the only place to experience Apple Immersive Video, the company’s 8K, 3D, 180-degree video format behind titles like Boundless and Prehistoric Planet Immersive, which is receiving a couple of updates this year.

You can now minimize immersive videos from full-screen view into a window. This allows users to enjoy the content without being fully immersed in it, and it’s also an alternate way to observe the 3D effects within the content.

On the accessibility side, captions can now be repositioned while watching immersive videos. Unlike traditional content, immersive video does not have a fixed point of view, and there’s action happening on all sides. That’s why it’s important that viewers who rely on captions be able to place them in whatever position works for their personal viewing needs and preferences. I’m thankful that Apple is continuing its work to ensure that spatial computing and immersive content are accessible to everyone.

The ‘Dim Flashing Lights’ setting is now available on visionOS as well. This is an important accommodation that was first added to the Apple TV last year, and it’s good to see it come to Vision Pro.

The TV app for visionOS will be adding support for Multiview later this year, allowing viewers to watch multiple sporting events at the same time. I’m a big fan of Apple’s Multiview implementation on Apple TV and iPad with easy source and layout switching, so I’m interested to see how this looks when it becomes available. I’m also curious to see when InSight, the feature that displays real-time information about actors, characters, and music onscreen, will come to visionOS after its launch on tvOS, iPadOS, and iOS this year.

Messages

Dictating a message, no pinch required.

While Messages on visionOS 2 doesn’t include text effects or message scheduling like other platforms, it does add emoji Tapbacks and RCS support alongside a visionOS-exclusive feature this year: Look to Dictate. This option was previously available for Safari’s search bar and other search fields, but it’s now coming to the Messages composition field as well. Just look at the microphone icon to enter dictation mode and start speaking without the need to pinch your fingers together.

Mindfulness

The most relaxing thing you’ll see today – now synced to your breath.

The Mindfulness app is one of the coolest experiences available on Vision Pro. It’s a true reimagining of what a meditation tool can be in an immersive environment. I find the digital petals floating around me to be both soothing and centering when it’s time for me to meditate. In visionOS 2, the movement of the petals and associated sounds can now be synced to your breath thanks to the new ‘Follow Your Breathing’ setting.

I don’t recommend replicating this experiment, but I tried enabling the feature and holding my breath during a meditation, and the animation held steady, waiting for me to exhale before starting its next motion. While I personally find that I benefit from a bit of guidance on my breathing rate when meditating, this is a good option for those who breathe more slowly or quickly than the predetermined guide and want the app to match their rhythm.

Developer Tools

Typically, a review like this one focuses exclusively on user-facing features, not behind-the-scenes tools for developers that users won’t ever use or potentially even recognize. In the case of visionOS 2, though, I think it’s worthwhile to point out a few new frameworks and APIs being made available to developers this year because they represent the evolution of the platform and the direction Apple believes spatial computing is headed. These additions aren’t necessarily things you’ll find in operating system feature lists, but they set the foundation for new types of apps and experiences in visionOS that will be important for the platform’s future.

HealthKit has made its way to visionOS.

I didn’t expect HealthKit to come to visionOS so quickly based on how long it took Apple to bring the feature to iPad, but I’m pleasantly surprised that it’s coming in visionOS 2. Now, apps on Vision Pro can request users’ permission to access, display, and update their health data. This makes the platform more viable for developers of health- and fitness-focused apps. I hope to see more apps in these categories in the Vision Pro App Store soon.
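
For a sense of what adoption involves, here’s a minimal sketch of requesting HealthKit access on visionOS; the specific sample types are my own illustrative picks, not ones Apple has called out for this release:

```swift
import HealthKit

// A minimal sketch of requesting read-only access to health data.
let healthStore = HKHealthStore()

func requestHealthAccess() async throws {
    guard HKHealthStore.isHealthDataAvailable() else { return }
    // Illustrative sample types; a real app would request what it needs.
    let readTypes: Set<HKObjectType> = [
        HKQuantityType(.heartRate),
        HKQuantityType(.activeEnergyBurned)
    ]
    // Presents the standard permission sheet, just as on iPhone and iPad.
    try await healthStore.requestAuthorization(toShare: [], read: readTypes)
}
```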

Apple isn’t taking the lead in these categories on the platform, though. Neither the Health app nor the Fitness app is included in visionOS 2, which is a shame. This lack of first-party apps in the health and fitness categories could be an opportunity for third-party developers to fill the gaps and create compelling experiences that entice new users.

Developers can now set the position of new app windows when they appear, an improvement over the previous system, which always opened windows directly in front of the user’s gaze. Window placement can be set relative to existing windows in the space or relative to the user. A new option called ‘push window’ allows an app to open a new window in the place of an existing one, covering up the original window. New window controls may not be the most exciting addition to an operating system, but these options empower developers to create more tailored experiences for users of their apps, which is always a good thing.
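
In SwiftUI terms, the ‘push window’ behavior might look something like the sketch below, assuming an app that declares a second window scene with the id "detail" (the id and view names are mine, purely for illustration):

```swift
import SwiftUI

struct LibraryView: View {
    // The pushWindow action is new in visionOS 2.
    @Environment(\.pushWindow) private var pushWindow

    var body: some View {
        Button("Open Detail") {
            // Unlike openWindow, this opens the "detail" window in place of
            // the current one, covering it up rather than adding a sibling.
            pushWindow(id: "detail")
        }
    }
}
```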

The majority of the new developer tools in visionOS 2 pertain to augmented reality. Room tracking lets developers know when a user enters a new room, and apps can respond accordingly.9 The system is now capable of detecting not only horizontal and vertical surfaces, but slanted ones as well, giving developers a fuller understanding of the space around the user. Volumetric apps can be opened alongside other apps, and they can be resized, tilted, and customized with ornaments.
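
As a rough sketch of how an app might opt into these surroundings features, assuming the user has granted world-sensing permission (the handler logic here is illustrative):

```swift
import ARKit

// A minimal sketch: room tracking plus slanted-plane detection.
let arSession = ARKitSession()
let roomTracking = RoomTrackingProvider()
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical, .slanted])

func observeSurroundings() async throws {
    try await arSession.run([roomTracking, planeDetection])

    Task {
        // Room anchors arrive as the user moves between rooms.
        for await update in roomTracking.anchorUpdates {
            print("Room update:", update.anchor.id)
        }
    }

    for await update in planeDetection.anchorUpdates where update.anchor.alignment == .slanted {
        // New in visionOS 2: angled surfaces are detected alongside
        // horizontal and vertical ones.
        print("Slanted surface found:", update.anchor.id)
    }
}
```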

Apple’s virtual teacup looks great next to my Stanley.

Virtual objects can be attached to physical surfaces within a room, too. I was able to test this feature by opening a USDZ file for a 3D object in Quick Look and moving it around the room. As you move an object close to the level of a table or the floor, it will snap to that surface with a satisfying clicking sound. Then, you can slide that object around the surface and put it exactly where you like.

Once an object is in place, it does not move at all. You can walk around it, look away from it then back, and even leave the room and come back later, and the object will stay precisely in place no matter what. It truly is remarkable how stable these virtual objects are, and it makes them feel integrated with the physical world in a much deeper way than was possible before. Apps that place virtual objects within the user’s environment will be able to give much more realistic impressions thanks to these improvements.

Meanwhile, new object tracking APIs allow developers to use physical objects as anchors for virtual elements. Imagine buying a physical product that comes bundled with a visionOS app for augmenting and interacting with it, or an app that identifies objects within your space and lets you select them to learn more about them. These are the kinds of experiences made possible by object tracking, and I’m excited to see how developers put these tools into practice.
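
A minimal sketch of the idea, assuming a hypothetical "mug.referenceobject" file trained ahead of time (for example, with Create ML):

```swift
import ARKit

// A minimal sketch: anchoring content to a tracked physical object.
func trackPhysicalObject() async throws {
    // "mug.referenceobject" is a hypothetical bundled reference object.
    guard let url = Bundle.main.url(forResource: "mug", withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])

    let session = ARKitSession()
    try await session.run([objectTracking])
    for await update in objectTracking.anchorUpdates {
        // The anchor's transform follows the physical object, so virtual
        // content attached to it stays pinned to the real thing.
        print("Object at:", update.anchor.originFromAnchorTransform)
    }
}
```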

Playing Chess with Sigmund in Game Room.

TabletopKit makes it easier for developers to create spatial experiences based around a table. One of the coolest things I’ve ever done with my Vision Pro is playing board games with a friend in Game Room from Resolution Games via SharePlay. It does an incredible job of replicating the experience of sitting across from someone while playing a tabletop game together, and TabletopKit will enable more developers to create experiences like this. Apple featured indie game studio GRL Games’ upcoming title Haunted Chess in the WWDC keynote as an example of TabletopKit in action. I hope to see a lot of these types of games available on the platform before long.

These new developer tools follow the same pattern as the user-facing quality-of-life improvements in visionOS 2: they seem like table stakes10 in a vacuum, but when considered within the short history of visionOS, they represent a rapid pace of change in response to users’ experiences and developers’ needs. Together, they form a more complete foundation for building spatial computing applications that in turn offer more immersive, enjoyable, and beneficial experiences for users. A feature like room tracking might not seem like such a big addition now, but it will likely be considered an essential element of spatial computing in the future. And thanks to these new tools and the developers implementing them, that future starts now.

What’s Next?

That’s what’s new in visionOS 2 this year. But before we go, I’d like to consider what’s next for the platform. Really, I’d like to share my personal wishes for the future of visionOS. I have several, and many of them are big asks, but given how much has changed in visionOS over its short lifespan so far, I expect that at least some of these requests will be checked off the list when visionOS 3 ships, presumably next fall.

Here’s what I’d like to see come next:

Improved multitasking: Putting together multi-window setups in visionOS by adding windows one at a time from the Home View, Spotlight, or Siri is too cumbersome and time-consuming. Even once a setup is perfect, you have to tear it all down when moving on to the next task. I want a spatial computing version of Stage Manager that allows me to create a series of multi-window views and easily switch between them. A quick launcher or dock hidden behind a hand gesture would be a great way to launch often-used apps, too.

A more robust notification system: visionOS doesn’t handle notifications well. They appear as tiny icons at the top of the user’s view that indicate nothing more than the app sending the notification. You have to tap on a notification in order to get any useful information from it, and then there’s no way to act on the notification other than to open its associated app. I’d like more information available within the notification itself, possibly by expanding the tiny icon into a larger element when the user looks at it, as well as the ability to quickly act on notifications, such as by replying to messages inline. Notification Center needs some attention, too, as the horizontally scrolling list can become a jumbled mess in no time.

Native system apps: I was surprised that no system apps were migrated from their iPad versions to native visionOS apps with the release of visionOS 2. The ability to run iPad apps on Vision Pro is a necessary feature, but Apple shouldn’t rely on it to avoid creating native versions of each of its apps. No one thinks the iPad version of Podcasts offers as good of an experience on Vision Pro as the native Music app. The company should show its commitment to the platform by getting every first-party app out of the Compatible Apps folder as soon as possible.

Core experiences reimagined for spatial computing: Once all of Apple’s apps are running natively on visionOS, it’ll be time to start rethinking many of them in light of what Vision Pro can do. Spatial computing is a new way of interacting with software, so every system app shouldn’t be limited to the flat rectangles that came before. How about a 3D virtual desk calendar showing your events for the day? Or a virtual recreation of the book you’re reading that you can hold in your hands? An envelope to fold your email into before sending it?11 The possibilities are endless in visionOS, and I’d like to see Apple think outside the literal box.

Feature parity with other Apple platforms: As far as visionOS has come, it’s still missing some features that we take for granted on other platforms: things like widgets, Find My support, Focus filters, and personal automations in Shortcuts, to name a few. That’s not to mention the fact that Apple Intelligence hasn’t been announced for visionOS at all, even as it’s poised to be released on iOS, iPadOS, and macOS later this year. Based on its prominent placement in the WWDC keynote and the fact that it’s being brought into the regular software development cycle, visionOS seems to be a high-priority platform for Apple, and it deserves the full platform treatment.

Incentives for third-party developers: The visionOS App Store continues to grow, but it’s nowhere near the size of the App Store on iOS or even iPadOS. Too many developers are opting out of developing for the platform or even allowing their iPad apps to run on it, and it’s up to Apple to convince them. I can’t say for certain what that looks like, and I imagine it won’t be a one-size-fits-all approach, but I want to see more third-party software available for Vision Pro, including some of my favorite apps that I currently have to use via AirPlay Receiver or Mac Virtual Display.

More devices running visionOS: This is actually a hardware request, and it’s almost definitely one that won’t be addressed within the next year. But the reality is that Vision Pro – as incredible and groundbreaking as it is – is not a product that appeals to the masses in terms of form factor or pricing. For spatial computing to grow, visionOS will need to be offered on a range of products with a much broader market in mind. Vision Pro is a feat to be sure, but at the same time, it’s likely the biggest, heaviest, and least comfortable device that will ever run visionOS. That presents a huge but exciting challenge for the hardware engineers working on the next Vision product, and I hope they meet that challenge with an offering that will be appealing and attainable to a wider swath of people.

Conclusion

Fifteen months after the Vision Pro was first announced, the product has already been on a wild ride full of twists and turns, the latest of which is visionOS 2, a major upgrade to the device’s operating system that’s barely half a year old. In that short time, the visionOS team has put together a substantial update with a compelling batch of features: quality-of-life improvements that users will benefit from constantly, integration with iOS and iPadOS to make more apps accessible on the platform, developer tools that firm up the foundation of spatial computing, and a jaw-dropping photo-viewing experience that only Vision Pro can offer. This is an update worthy of the 2.0 label, and I continue to be surprised at how much the platform has developed in such a short time.

The thing that excites me most about visionOS 2 is the way it represents how early we are in the era of spatial computing. Foundational principles still aren’t completely set in stone, and Apple is working with users and third-party developers to understand how this new paradigm for computing is going to work. Free of the baggage of legacy form factors and with so many use cases unexplored, the possibilities are endless, and we get a front-row seat to the discovery process.

What we’re seeing is a new way of computing being shaped before our very eyes. It’s going to take some time for spatial computing to reach its full potential, and the first attempt at some aspects won’t always be the right one. Things might even get a little weird. But until the system matures into yet another fully baked platform that prioritizes stability over innovation, we get the privilege of seeing it try new things, figure out what works and what doesn’t, and grow into what it will become.

visionOS 2 is a step forward in the journey of spatial computing. There’s a long way to go, but where we’re at is pretty great, and I, for one, am happy to be along for the ride.

Special Thanks

I consider it an incredible honor to get to write and publish this MacStories review covering a platform I believe in and enjoy deeply. Thank you so much for taking the time to read it. This project wouldn’t have been possible without the support of some wonderful people, so I’d like to thank a few in particular:

  • My wife Katherine, who is the most amazing partner and pushes me to pursue my dreams, including this one.
  • My son Noah, my inspiration for trying to make the world a little better in my own nerdy way.
  • Federico and John, for giving a guy with zero professional writing experience a chance to be a part of the incredible team they’re assembling.
  • The rest of the MacStories team, whom I endlessly enjoy working with every day.
  • Sigmund Judge, my podcasting partner in crime and consistent go-to for bouncing around ideas.
  • Tim Chaten, for never saying no to nerding out about Vision Pro with me.
  • Indie app developers, who create tools that improve my life and inspire me to create.
  • Joseph Simpson, for generously sharing developer perspective on visionOS.
  • The Vision Pro team at Apple, for their incredible work on this release and for blazing the trail of spatial computing.
  • Magic Rays of Light listeners.
  • Club MacStories members.
  • Our Discord members, for having such interesting conversations I always learn from.

And once again, a huge thank-you to MacStories readers like you for supporting our work and being so welcoming to me as I joined the team this year. This community is amazing, and it’s a privilege to be a part of it.


  1. Multitasking is another ongoing issue in visionOS, and version 2 doesn’t change much on this front. We’ll talk more about this topic in a bit. ↩︎
  2. I had given up on playing Jetpack Joyride 2 on my Vision Pro because so many games were interrupted by the Control Center caret, until this tip from Federico helped me limit unintended activations. Moving the caret further up made it harder to use when I wanted to, though, so it still wasn’t an ideal solution. ↩︎
  3. Pixels? ↩︎
  4. This was the first improvement I noticed after installing visionOS 2, and it was immediate. I literally said out loud, “My hands look awesome!” ↩︎
  5. Videos, by the way, can be trimmed in the Photos app in visionOS 2. I didn’t realize this capability was missing until Apple announced it as a new feature. ↩︎
  6. The Photos app does offer a limited selection of automatically converted pictures in the Featured Spatial Photos collection. I find this collection delightful to browse, and it gives me hope that more automatic conversion options are possible. ↩︎
  7. Alas, visionOS is now the sole Apple platform without a built-in calculator app. ↩︎
  8. The platform could still win some extra points with film buffs, though, by adding support for watching iTunes Extras bundled with film purchases and giving users a path to upgrade 2D titles imported from other storefronts via Movies Anywhere to 3D. ↩︎
  9. The example use case Apple gives for this technology is a virtual pet that abides within a particular room and greets you when you enter it, and I want to buy this app immediately. I really hope there’s a developer out there working on it. ↩︎
  10. Pardon the pun. ↩︎
  11. Okay, maybe not that last one. Email’s enough of a chore as it is. ↩︎

macOS Sequoia: The MacStories Review https://www.macstories.net/stories/macos-sequoia-the-macstories-review/ Tue, 17 Sep 2024 18:41:36 +0000 https://www.macstories.net/?p=76555

Sequoia is unlike any major macOS update in recent memory. Annual OS releases usually tell two stories. The first is the tale of that release, which consists of a combination of design, system, and built-in app changes that add to the existing Mac experience. The second story plays out over time, taking multiple years to unfold and reveal itself. The best macOS releases are those that strike a balance between the two.

Often, a macOS update’s multi-year story revolves around new developer technologies that signal a change in direction for the entire platform. Swift and Catalyst were like that. Neither had an immediate impact on the day-to-day experience of using a Mac. However, even though the final destination wasn’t entirely clear at first, the corresponding macOS releases included concrete first steps that provided a sense of where the Mac was heading.

It’s possible to look at macOS Sequoia and see something similar, but the resemblance is only skin deep. This year’s release includes meaningful updates to system apps and even a brand new one, Passwords. Plus, Apple Intelligence promises long-term, fundamental changes to how people use their Macs and will likely take years to fully realize.

But Sequoia feels fundamentally different from Swift, Catalyst, and other past releases. It’s light on new features, the design changes are few and far between, and Apple Intelligence isn’t part of macOS 15.0 at all – although more features are on the way and are currently part of the macOS 15.1 developer beta. So what sets Sequoia apart isn’t so much what you can do with it out of the box; what’s unique about this release is that you could install it and not even notice the changes.

That’s not to say that Sequoia is a bad update. There’s more to like than not, with excellent additions like iPhone Mirroring, window tiling, the new Passwords app, and Safari’s video viewer. The trouble is that the list of changes, good or bad, falls off steeply after that. A half loaf may be better than none, but Apple has taught us to expect more, which makes Sequoia vaguely unsatisfying and out of balance compared to other releases.

It’s clear that Apple is placing a big bet that artificial intelligence will pay off for macOS the same way magic beans did for Jack and his mother. The question heading into macOS 15.1 and beyond is whether Apple’s beans are magical too. Perhaps they are, but based on what I’ve seen of macOS 15.1, I’m not feeling the magic yet. I’ll reserve judgment and revisit Apple Intelligence as it’s incrementally rolled out in the coming months. For now, though, let’s consider macOS Sequoia 15.0’s morsels that readers can actually dig into today.


iPhone Mirroring and Notifications

iPhone Mirroring

Apple has methodically expanded the interoperability of its devices under the umbrella of Continuity since macOS Yosemite, and this year is no different. With Sequoia, you can now mirror your iPhone to your Mac’s display and interact with your iPhone’s apps and notifications. It’s easily the best feature of macOS 15.0, but its usefulness takes a while to sink in.

When Apple added widgets to the Mac’s desktop, it included the ability to display widgets from a nearby iPhone. I have to imagine that was a testbed for iPhone Mirroring. And, as much as I’ve enjoyed having my iPhone’s widgets available on my Mac’s desktop, iPhone Mirroring and notifications are a much bigger leap forward in utility.

iPhone Mirroring is added to your Mac’s Dock by default.

After you update to Sequoia, you’ll see a new icon in your Dock for the iPhone Mirroring app. Open it and click through the onboarding explanation of the feature, and your iPhone will appear in a window on your Mac, much like the iPhone simulator does for developers running Xcode. For the feature to work, your iPhone must be within Bluetooth range and connected to the same Wi-Fi network and Apple Account as your Mac, which must either be an Apple silicon model or an Intel-based Mac with a T2 Security Chip. Mirroring also requires that your Mac not be using AirPlay or Sidecar when the connection is initiated.

The alert displayed on your iPhone while mirroring (left) and after you unlock your phone (right).

While it’s being mirrored, your iPhone will be locked, so whatever you’re doing with it on your Mac won’t be visible to someone looking at your phone. A notification appears on your iPhone’s Lock Screen indicating which device it is mirrored to. Then, when you unlock your iPhone, a notification will appear from the Dynamic Island announcing that it was used on a Mac. The notification also includes a ‘Settings’ button that takes you to the ‘AirPlay & Continuity’ section of your iPhone’s Settings app where mirroring can be turned off.

iPhone Mirroring’s window has no chrome, nor is the window framed by a hardware image. However, a title bar will appear when you hover over the top of the iPhone’s window, revealing two buttons that take you back to the Home Screen and activate the app switcher, respectively.

Hovering over iPhone Mirroring’s window reveals two buttons.

The iPhone’s window isn’t resizable either, which can make it feel a little small on some displays, depending on your resolution settings. However, having used it on my Mac Studio Display all summer using my display’s default settings, I have no complaints about the window’s size. In addition to your iPhone’s Home Screens, its Today View and App Library are available while mirroring, but Notification Center and Control Center are not.

Landscape games like [Zenless Zone Zero](https://zenless.hoyoverse.com/en-us/) automatically switch to landscape mode on the Mac.

Most of the time, iPhone Mirroring will be in portrait mode. If an app or game defaults to landscape orientation or switches to it automatically, the iPhone Mirroring window will switch to landscape. However, there is no way to switch to landscape mode manually to take advantage of optional landscape modes that apps like Mail offer. Also, some apps like video streaming services don’t recognize the iPhone’s mirrored UI as a proper display, so you’ll hear audio if you play copy-protected, streamed video, but there won’t be any image. You also can’t access your iPhone’s camera from your Mac while the device is mirrored. Similarly, mirroring does not work with Continuity Camera, meaning that if you’re using your iPhone for a video call, you won’t be able to access your apps using iPhone Mirroring while you’re on the call.

Navigating apps feels natural and supports Magic Trackpad and Magic Mouse gestures, such as two-finger scrolling. My Logitech MX Master 3S for Mac mouse works for clicking and right-clicking in my iPhone’s mirrored UI, but its scroll wheel and other buttons are not supported. The Magic Keyboard’s Touch ID sensor is also available for those times when you’d normally need to enter your iPhone’s passcode.

In addition to the controls at the top of the iPhone Mirroring window, the app has built-in keyboard shortcuts to return to the Home Screen, access the app switcher, and start a Spotlight search query. You can also return to your Home Screen from any app by clicking the bar at the bottom of the window. Clicking and holding with your mouse or trackpad causes your Home Screen to enter ‘jiggle mode,’ allowing you to rearrange your app icons, add widgets, edit Home Screen pages, and customize the look of your Home Screens. However, I’m sorry to report that rearranging your iPhone’s icons is no easier from your Mac than it is on your iPhone.

Also, if you’re a fan of the iPhone’s StandBy Mode, you’re in luck because iPhone Mirroring works even when your iPhone is charging nearby and displaying widgets. And speaking of widgets, if you click on an iOS widget that you’ve added to your Mac’s desktop, it will launch iPhone Mirroring.

iPhone Notifications

Sequoia also introduces iPhone notifications on the Mac, which appear whether you’re using iPhone Mirroring or not and are not range-restricted. The notifications appear in your Mac’s Notification Center just like any other notification, but with a little iPhone icon in the bottom-right corner of the app icon to indicate their origin. I’m glad to see the differentiation between the two sources, but the icons are small and light gray, making them hard to notice at a glance, so I’d like Apple to try something larger and higher-contrast in the future. As with Mac notifications, you can interact with mirrored iPhone notifications, reading them and doing things like responding inline when that option is available. Clicking on an iPhone notification will open the associated app using iPhone Mirroring, too.

Examples of iPhone notification actions available from the Mac.

Sequoia has settings for which iPhone notifications show up on your Mac.

Sequoia gives you the ability to manage the notifications your Mac receives from your iPhone with settings to turn iPhone notifications off entirely, enable or disable notification sounds, and turn off notifications for individual apps. In addition, there’s a new ‘X’ button at the bottom of the Mac’s Notification Center that turns into a ‘Clear All’ button when you hover over it and, as the label says, will wipe out any notifications – whether they be from your Mac or iPhone – with a single click. Finally.

At long last, a ‘Clear All’ button for notifications.


When I first started using iPhone Mirroring, I wondered whether it would be of much use to me. As it turns out, I have more iPhone-only apps than I realized. Yes, some – like my bank’s app – can be replicated by visiting a website on my Mac, but others – like the apps for my thermostat, Roomba, and garage door – can’t. Instead of pulling out my iPhone to adjust the temperature, check if my Roomba is stuck under the couch, or see whether my garage door is open, I can quickly check those apps via iPhone Mirroring. It might seem like a small difference compared to grabbing my iPhone, but it’s not as disruptive, which means it’s also less distracting. It took a while, but over the course of the summer, I’ve discovered more and more apps that are useful to access from my Mac. I’ve also found it very useful when writing about iPhone-only apps, which I can now test and glance at as I write.

AirDropping an image from my iPhone to my Mac, using my Mac.

iPhone Mirroring also makes grabbing files and images from my iPhone and sending them to my Mac (or the other direction) via AirDrop easier than ever. That’s something I do a lot, and now it’s far easier because I don’t have to switch between devices. Later this year, Apple says iPhone Mirroring will gain the ability to drag and drop files between your iPhone and Mac, too, but for now, AirDrop is nearly as good a solution.

Dealing with iPhone notifications on my Mac has been a better experience than I expected, too. I manage the apps that can notify me on my iPhone carefully, so adding them to my Mac notifications hasn’t felt overwhelming. On the contrary, it’s allowed me to deal with those notifications in a quicker, more timely way without having to switch to my iPhone, which I love. It also means I don’t wind up with a big stack of unread notifications at the end of the day when I stop working on my Mac and grab my iPhone.

The suite of Continuity features that Apple has introduced over the years is one of the brightest spots of its OSes. Sidecar, Continuity Camera, Handoff, Universal Clipboard, and Universal Control are all Continuity features that I use daily, and they have had a measurable impact on my productivity. Few are features that I would have imagined I needed before they were introduced, but all of them are deeply embedded in my day-to-day workflow now. In the near term, I’d love to see mirroring work with the iPad, but I hope the Continuity team at Apple is also busy dreaming up new ways to make the company’s devices work together even more seamlessly.

Window Tiling

I almost wonder whether Sequoia’s window tiling feature has been sitting in a drawer somewhere inside Apple Park, waiting to be pulled out in a year like this one when the macOS update was light on features. I mean, why now? Windows has had tiling forever, and there are many third-party window managers for the Mac, ranging from simple collections of keyboard shortcuts to complex, full-blown utilities that do all sorts of fancy tricks with windows. That said, I’m glad Apple finally added window tiling, which works whether you use one screen or multiple screens.1 Sequoia’s implementation is simple but likely to satisfy a lot of users’ needs, mine included.

Accessing window tiling from a window’s green button.

Each tiling option includes a keyboard shortcut.

Sequoia’s window tiling options can be accessed in multiple ways, including from the Window menu, the green button in the corner of every window, drag and drop, and keyboard shortcuts. The variety is welcome because it accommodates a range of workflows and makes it easier to incorporate the feature into automations, even though window tiling actions have not been added to the Shortcuts app.
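
In the meantime, the closest thing to scriptable tiling I can think of is the old-fashioned workaround of clicking menu items via System Events. Here’s a rough sketch of that approach in Swift; the ‘Move & Resize’ menu names are assumptions that may vary by locale and OS version, and the host process needs Accessibility permission:

```swift
import Foundation

// A hedged sketch: tile the frontmost window by clicking Sequoia's
// Window > Move & Resize > Left menu item through System Events.
// The menu names are assumptions; Accessibility permission is required.
let source = """
tell application "System Events"
    tell (first application process whose frontmost is true)
        click menu item "Left" of menu "Move & Resize" of menu "Window" of menu bar 1
    end tell
end tell
"""

var errorInfo: NSDictionary?
if let script = NSAppleScript(source: source) {
    script.executeAndReturnError(&errorInfo)
    if let errorInfo {
        print("Tiling failed: \(errorInfo)")
    }
}
```

It’s a fragile workaround, which is exactly why proper Shortcuts actions would be welcome.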

Dragging a window to fill the right half of the screen.

Regardless of how you prefer to access the feature, Sequoia’s window tiling is divided into two categories with four options each. The first category is ‘Move and Resize,’ which allows you to place a window on the left, right, top, or bottom halves of your screen. The second set of options is called ‘Fill and Arrange,’ which includes the following options:

  • Full Screen for filling all available screen real estate with the active window
  • 2-Up for placing the active and second most recently active windows side by side with each taking up half of the screen
  • 3-Up for placing the active window on the left half of the screen and the two other most recently accessed windows in equally-sized quarters of your screen’s right half
  • 4-Up for distributing the four most recently active windows in equally-sized quadrants

If you have fewer windows than the number necessary for the option you pick, Sequoia will make do with what it has, leaving blank spaces as needed.

Managing window tiling in System Settings.

There are new options in System Settings related to window management, too. In the app’s ‘Desktop & Dock’ tab, you can specify whether your tiled windows touch at their edges, taking up your full screen. Alternatively, you can leave a very narrow gap between windows, which I like the look of a lot. However, judging from reactions I’ve seen online, that handful of pixels around windows is viewed by many as a gross waste of screen space, so it’s a good thing users have an option.

System Settings includes toggles for tiling windows by dragging them to the edges of your screen or Option-dragging them. There’s also a new setting that allows you to double-click the title bar of a window to make it fill the screen. This is something that Windows has done forever by default and one of the few Windows interactions that I’ve always missed despite having left that world years ago. It’s a small touch, but one that I immediately turned on and don’t plan to ever turn off.


Raycast also offers simple window tiling options.

I’ve used Raycast’s window tiling feature for a few years now, and although it’s not as robust as some third-party alternatives, it works well for me because I typically don’t do anything more than maximize a single window or put two side by side. As a result, I’ve found myself continuing to use Raycast’s keyboard-driven tiling implementation, but turning to Sequoia’s new options when my hands are already on my mouse or trackpad.

For people with similarly simple window management needs, I suspect that Sequoia’s new window tiling will be a fantastic addition because it’s built-in and handles the basics very well. For anyone who wants more, apps like the recently-updated Moom (which Niléane reviewed), Lasso, Rectangle, Raycast, and others will continue to be excellent options. I don’t expect Sequoia’s new window tiling option to have a significant impact on third-party utilities; instead, it will simply make it easier for anyone who is new to window tiling to get started and perhaps graduate to a more robust third-party alternative later.

Video Conferencing Enhancements

A FaceTime call using one of my own photos as a background.

Sequoia adds new video conferencing features that will make it easier to participate in calls and share your screen without oversharing.

Background replacement is already a popular feature of a lot of video conferencing apps. For a while, macOS has offered the ability to blur your background to add a modicum of privacy and hide clutter, but with Sequoia, Apple is leveraging its advanced camera processing pipeline to detect the subject of a video feed in real-time and replace the background behind the subject with a built-in scene or one of your own photos.

Another background option is a series of images from Apple Park.

There are two sets of pre-built backgrounds, plus a third set for saving your own photos as backgrounds, all of which you can access from the video menu bar app. The first set includes eight light gradients and a black gradient. The second set includes nine scenes from Apple Park in Cupertino. The photos are nice, but if you don’t work for Apple, would you really use these? I doubt most people would. Finally, you can add your own photos to the mix, which in my experience look good as long as you don’t pick something that’s too distracting.

Regardless of the background you pick, the feature itself works well, fading in when activated with a nice animation and replacing the room you’re sitting in with the photo while managing not to look strange. The background replacement feature even handles my glasses better than the iPhone’s Camera app sometimes does in Portrait mode.

One of the simple gradient backgrounds.

Beyond hiding the dirty laundry in the background of your home office, Sequoia also improves screen-sharing options to make it easier to limit what you share on video calls. The feature, which Apple calls presenter preview, is triggered when you share your screen, allowing you to choose between sharing a particular app or your whole screen before your audience sees what you’re sharing. If you pick an app, no other open app’s windows or notifications are visible to the people you’re sharing with.

Sharing my entire screen with Federico.

Once sharing is started, you can click on the screen sharing icon in the menu bar to check what you’re sharing with someone, too. Users get the same sort of sharing choices when starting playback of a Keynote presentation (or in another app that supports the feature) or, starting later this year, when connecting an external display.

Presenter preview in action. Source: Apple.

I don’t participate in many video conferences and rarely share my screen, so I don’t have much to say about presenter preview except that offering ways to hide what you don’t want to share can only be a good thing. In my testing, the feature works well and is easy to use. Plus, it works in any video conferencing app that supports the feature’s APIs, so you can use it in apps like Zoom as well as FaceTime.
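
My understanding is that presenter preview rides along with the system-level content-sharing picker that apps adopt through Apple’s ScreenCaptureKit framework, which would explain why third-party apps get the feature without much extra work. A sketch of what adopting that picker looks like on the developer side (the actual capture setup is omitted):

```swift
import ScreenCaptureKit

// A sketch of adopting the system content-sharing picker. Apps that
// present this picker let the user choose a window or display through
// system UI; Sequoia layers its presenter preview on top of that flow,
// as best I can tell.
final class SharingObserver: NSObject, SCContentSharingPickerObserver {
    func contentSharingPicker(_ picker: SCContentSharingPicker, didCancelFor stream: SCStream?) {
        // The user dismissed the picker without sharing anything.
    }

    func contentSharingPicker(_ picker: SCContentSharingPicker, didUpdateWith filter: SCContentFilter, for stream: SCStream?) {
        // The user approved content to share; start or retarget the
        // capture stream with this filter.
    }

    func contentSharingPickerStartDidFailWithError(_ error: Error) {
        print("Picker failed: \(error)")
    }
}

let observer = SharingObserver()
let picker = SCContentSharingPicker.shared
picker.add(observer)
picker.isActive = true
picker.present()
```

Because the picker and its preview UI are provided by the system, FaceTime, Zoom, and other adopters should all present the same sharing choices.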

System Apps

One of the bright spots of this year’s OS updates across every device is that Apple’s system apps are more in sync with each other than ever before. In the past, it wasn’t unusual for changes made to an app on the iPhone to take a year or more to make their way to the Mac. This year is different.

With very few exceptions, the features added to Mac system apps are the same across iOS and iPadOS 18, too. So instead of covering the same changes twice, Federico and I have divided up system apps between our reviews. You’ll find our coverage of the following apps, which are also available on the Mac, in Federico’s review of iOS and iPadOS 18:

  • Calculator
  • Calendar
  • Home
  • Messages
  • Music
  • Notes
  • Photos
  • Reminders
  • Weather

Likewise, unless otherwise noted, the changes to Safari, Maps, and other system apps discussed below apply to the iOS and iPadOS 18 versions, too.

It’s great to see system apps moving in lockstep in this year’s OS updates, although I would have liked to see Shortcuts add personal automations and more system actions. As I argued two years ago:

…if Shortcuts is to become the default way to automate tasks on the Mac, there needs to be steady, yearly progress to make macOS and its default system apps as Shortcuts-friendly as possible.

That hasn’t happened, but perhaps some progress will be made as part of the Apple Intelligence features coming in 2025. For now, though, what I said in my macOS Sonoma review remains true:

At this point, Shortcuts for Mac is beginning to feel a lot like a modern version of Automator – a neglected, dead-end tool that never fulfilled its potential. Perhaps next year will be different, but I’m no longer optimistic it will be.

Still, there are significant updates to Safari and Maps, plus the brand-new Passwords app to cover, so let’s take a look at what you can expect from each.

Safari

There’s probably no system app more important to macOS than Safari. Sure, Finder is necessary to navigate the file system, but I’d wager that few users think of Finder as an app. Safari permeates everyday life for most people in a way that few other apps do, whether it’s planning a trip, researching a topic, reading the news, or working in a web app. As a result, users are sensitive to small changes, making it difficult to update Safari without upsetting somebody. This year, that somebody is me – with one exception.

Video Viewer

Watching [Brendon Bigley’s Wavelengths](https://www.youtube.com/@itswavelengths) using video viewer puts the video front and center, dimming out the rest of the YouTube page.

For my money, the best new feature is Safari’s new video viewer. The more I use it, the more I miss it when I’m on my laptop, which is still running Sonoma. When you come across a video on a webpage, you can activate the video viewer from the unnamed multi-purpose button in Safari’s address bar, which I am hereby dubbing the ‘Webpage Settings’ button. Once you do so, the video expands to take over most of the window, dimming the rest of the webpage behind a nearly opaque black overlay. Standard playback controls appear at the bottom of the video and fade away as the video plays, reappearing when you hover your pointer over the video. To exit the viewer, just press the Escape key or return to the address bar and toggle it off with a click. It’s an excellent way to watch videos on the web, but it’s the little touches that make it even better.

My favorite detail about the video viewer is that it automatically enters Picture-in-Picture mode when you change your Mac’s focus to another app or switch Safari tabs. Return to Safari and the original tab, and the video pops back into the webpage. While a video is in PiP mode, you have options to close it, manually pop it back into Safari, or stop playback. However, you lose the ability to skip forward and back five seconds at a time using the arrow keys, which is disappointing.

There’s also a strange detail about video viewer that I don’t understand. If a video is muted, the option to activate the video viewer doesn’t appear when you click the Webpage Settings button. I can’t think of a reason why that is, but I figured I’d mention it because the first time I tried to activate video viewer, not realizing the YouTube video’s sound was muted, I was confused. Another limitation is that only one video can be in viewer mode at a time.

Switching to another tab or app automatically puts what you’re watching in Safari’s video viewer into Picture-in-Picture mode.

Thanks to NPC: Next Portable Console, I’ve been watching more YouTube reviews of videogame handhelds lately. Safari’s video viewer allows me to start a video and then open a new tab to research whatever is being covered or take notes in another app without having to manage the video at all. Instead, as soon as I open a new Safari tab or move to a text editor, the video I’m watching enters PiP mode. If the video gets to a part I’d like to see in a bigger window, flipping back to Safari reverses the process, returning me to its larger viewer.

The biggest downside I’ve found to watching videos this way is that it obscures YouTube’s descriptions, which often include valuable links. I’d love to have a way of accessing those links in viewer mode so I could open them in a new tab, activating the viewer’s PiP mode. Instead, if I want to access a video’s description and links, I need to exit viewer mode, which adds the sort of friction that the feature is meant to eliminate. That said, I love video viewer and have used it nearly every day this summer because it’s made watching videos meaningfully better.

Web Apps Get Extensions

Apple hasn’t publicized it anywhere that I’ve come across, but the web apps that you can create via Safari for Mac now support browser extensions. Web apps, introduced last year in macOS Sonoma, are another Safari update that I’ve found myself using daily. I have web apps set up for social media accounts and a variety of services we use to run MacStories, and they work great.

However, as first implemented, web apps left behind the growing universe of Safari extensions from third-party developers. That wasn’t an issue with most of the web apps I created, but over the past year, I’ve heard from a lot of readers who have one or two extensions that are closely tied to the work they do in their web apps. With Sequoia, most of your Safari extensions should show up in a line along the right side of a web app’s title bar next to the share icon. I say “most” because one of mine is missing, which I believe is because it’s only available outside the Mac App Store. In any event, the extensions work just like they do in Safari itself, and while it may seem like a very small change, it will make a big difference for a lot of users who rely on web apps.

Highlights

Highlights works best with large hotel chains.

The feature Apple spent the most time talking about at WWDC is Highlights, a new US-only feature that spotlights important content from a webpage. Apple seems to be tackling two different problems with Highlights: the first is finding important information on a busy webpage, and the second is summarizing articles, which I’ll cover below in the context of Safari’s Reader mode. The example of the first problem used at WWDC was pulling the address, a mini-map, and a phone number from a hotel’s website. Apple also says the feature works with restaurants, landmarks, people, music, movies, and TV shows.

In my testing, Highlights works well for large national hotel chains but is hit-or-miss for smaller hotels. For example, if I visit the website for a Marriott, Hilton, or Sheraton hotel, a purple sparkles icon will appear over Safari’s Webpage Settings button indicating that there’s a Highlight available. Clicking the button pops up a map of the location with truncated details about it such as its rating and hours. Clicking on the map opens the location in the Maps app. Highlights results also include a button to get driving directions to the location in the Maps app.

The feature won’t work if you visit one of the sponsored links at the top of Google search results, which are usually from hotel booking aggregators, nor does Highlights consistently work with smaller one-off hotels. I’ve found a couple of smaller boutique hotels that Highlights worked with, but not any smaller local hotels outside of urban areas. I’d love to see this feature expand to include more hotels and other countries, but as it stands, Highlights is still a useful way to track down the basics of many U.S. hotels quickly.

Highlights works with some landmarks.

Highlights’ landmark coverage is hit-or-miss, too. For example, Safari will offer Highlights for the Statue of Liberty but not the Empire State Building. You’ll also get Highlights for Yankee Stadium, but not Bank of America Stadium where the Carolina Panthers play, or the Rose Bowl. As far as restaurants are concerned, Highlights can reliably navigate you to large chains, but other restaurant coverage is spotty.

Movie information is limited, but links to the TV app if it’s available there.

Musician information is also limited, but links to a bio on Wikipedia or to Apple Music.

Early in the betas, I couldn’t find any Highlights for people, music, movies, or TV shows. Those categories are still a little hard to find, but I’ve begun seeing the familiar purple sparkles more often over the past couple of weeks.

Although visiting music artists’ webpages has never triggered a Highlight for me, they’re starting to appear on review sites like Pitchfork. Sometimes Highlights link to sites like Wikipedia for more information about the artist, but other times, they link to albums in Apple Music.

Movie and TV show Highlights have been hard to find, too, but I have seen examples on Variety, Vulture, and Wikipedia. However, I have yet to run into movie or TV show Highlights when watching trailers on YouTube or visiting IMDb. When they do appear, movie and TV show Highlights include basic information like genre, release year, and rating, and clicking on the Highlight opens Apple’s TV app if the content is available there.

A bio of Steve Jobs included in Highlights for a post about ‘The Crazy Ones.’

I was also able to find a Highlight with a short bio of Steve Jobs when I visited a website with an article about ‘The Crazy Ones.’

The bottom line is that if you have a road trip coming up and are staying at a major hotel chain in the U.S., Highlights is a handy way to get driving directions. The other available categories have potential, but the coverage is still too spotty to be something you can expect will show up regularly as you browse the web, limiting its utility.

Reader Mode

There’s actually one other feature of Highlights that I saved for this section because the two are so closely tied. Safari will often display its purple sparkles icon over the Webpage Settings button when you open an article. Click the button, and a summary of the article will appear along with a button offering to open it in Reader mode.

When you open Reader mode, the familiar stripped-down UI appears, but with a hideable sidebar on the right that includes a summary of the article and, in some cases, a table of contents. As with Highlights, I don’t find these summaries very useful.

First of all, they appear in far fewer articles than I expected. I’ve spent a lot of time comparing longform stories from sites like The Verge from similar time periods and on similar topics, and there doesn’t seem to be any discernible pattern for whether or not summaries appear. However, I have noticed articles that previously had no summaries gaining them later, so I suspect Apple’s web crawler is still indexing and summarizing in the background. As a result, article summaries may become more prevalent over time.

Summaries don’t always do a good job of parsing webpages. For example, David Pierce’s article for The Verge about robots.txt files pulls too much information into the article’s title. In contrast, Safari does a surprisingly good job with Mia Sato’s story (also on The Verge) about how the Internet molded itself to fit Google’s algorithms, despite the story’s atypical layout.

Moreover, the summaries are too short to be useful. In the examples above, Pierce’s robots.txt story is roughly 3,000 words long, while Sato’s on Google comes in at about 6,000. Both summaries are far too short to meaningfully capture the articles, and surprisingly, the summary of Pierce’s story – an article half the length of Sato’s – is 80 words, compared to 60 for Sato’s.

Article summarization feels like a feature that was added because summaries are one of the things AI is good at, not because anyone thinks you can effectively boil a 6,000-word article down to just four sentences. Perhaps Apple is being conservative with the length of these summaries to avoid copyright claims by authors, which would be understandable, but I’d prefer that it had abandoned the feature instead of shipping something with such limited utility.

This and the next article on MacStories were formatted the same way, but one includes a table of contents, and the other doesn’t.

The second addition to Reader is that it sometimes includes a table of contents for articles. I say “sometimes” because it’s even harder to find an article with a table of contents than one with a summary. That seems to be because tables of contents usually appear in articles that have summaries – already a somewhat rare thing – and Safari only builds a table of contents if the article also uses H2 or H3 headers. Still, that doesn’t entirely explain when you can expect to see a table of contents because, of two identically formatted MacStories articles about our favorite HomeKit devices – both of which were summarized – only one includes a table of contents.

When a table of contents does appear next to the Reader version of an article, it’s far more useful than a summary. As you’d expect, clicking on an entry in the table of contents jumps you to that part of the article, making it easy to skip around in a longer story. I’d like to see tables of contents added to a wider variety of articles because when they show up, they’re an excellent way to navigate the content.

It’s also worth noting that entering Reader mode used to take a single click on Safari’s Webpage Settings button. Now that the Webpage Settings button has been loaded with additional features, it takes two clicks, which is a lot if you spend as much time as I do in Safari. Fortunately, you can also enter Reader mode using ⌘ + ⇧ + R.

Distraction Control

Highlighting a webpage element for hiding.

The remnants of an ad blown away like dust.

Distraction Control is yet another option chucked onto the increasingly crowded set of features accessible from Safari’s Webpage Settings button. Click on it, and you enter a mode that highlights elements of a webpage as you hover over them. Click on a highlighted element, and it disintegrates, Thanos finger-snap style. It’s a cool effect for a feature that I’m not sure anyone needs or wants.

Don’t get me wrong: the web is littered with pages full of nonsense that gets in the way of reading. But Reader mode already handles that well, whereas Distraction Control is a chore to use. After activating the feature, you have to click on every element you want to hide, which can be a lot and isn’t very effective on sites that use infinite scrolling. Plus, elements that refresh periodically, like some ads, will return after a while anyway. Static elements, on the other hand, will remain hidden between visits to a site.

Apple has been careful to say that Distraction Control is not an ad blocker, pointing to the fact that ads that refresh automatically come back after a time. I’d spin the feature a little differently. Distraction Control is a very effective ad blocker for sites that display static ads, but not a replacement for a third-party ad blocker that can handle dynamic ads.

My concerns about Distraction Control are twofold. First, static ads – like the kind of ads we display on MacStories – are often chosen because they are less disruptive than dynamic ads. Effectively, Apple is penalizing sites that use less distracting ads by making them easier to block long-term than the dynamic variety. I’ve checked, and as things stand today, if someone hides a banner ad on MacStories, it will never reappear. That’s a problem if you’re running a website.

Hide the MacStories banner ad, and you lose the entire masthead as well as navigation controls, which isn’t great.

Second, Distraction Control is not very precise. If you try to hide the banner ad at the top of MacStories, it will wipe out the masthead and site navigation along with the banner. That’s a potential support nightmare we (and I suspect other sites) will have to deal with. I know from experience that ad blockers are the number one cause of complaints we receive that MacStories isn’t working correctly. Not that many people use third-party ad blockers, but with this feature built into Safari, I wouldn’t be surprised to see more people start hiding elements, leading to an increase in support requests.

But those are issues for people who run websites. As a user, Distraction Control’s utility is limited. I don’t think it gives me anything that isn’t solved better by Reader mode, at least for articles I want to read. Sure, it preserves more of a site’s layout, but it’s so much more laborious than entering Reader mode that I haven’t run into a situation where I prefer using it, except when I visit a site with annoying overlays and similar elements to do something other than read. Like article summaries, Distraction Control’s limited utility makes it feel like a feature that might not have made the cut in a year with a more robust set of feature updates.


If it were me, I would have added the video viewer to Safari and extensions to web apps, and called it a day. Everything else feels underdeveloped, incomplete, or ill-conceived. Highlights has potential, but currently, the places where you’ll find it are limited. In contrast, Reader mode summaries and Distraction Control seem to be features in search of a problem. Perhaps they can be refined over time to become more useful, but as they stand, I’d like to have a setting to simply turn them both off to declutter Safari’s increasingly crowded Webpage Settings menu.

Passwords

For years, Federico and I have wished for a standalone Passwords app, and it’s finally here. We’ve split most of the system apps between us for the purposes of our reviews, but not Passwords. It’s new, so you’ll find it in both of our reviews – not only is it here at long last, but it’s also very good.

I had a feeling this day would come eventually, so I’ve been weaning myself off of 1Password for about three years, moving passwords to Safari’s built-in system as I accessed sites and services. I could have imported them into Passwords all at once, but with so many years of passwords saved (many of which I no longer use), I figured a slow transition would give me a cleaner start, and it has. I still keep 1Password around; I just find myself using it less and less. But I’m getting ahead of myself. Let’s cover what Passwords does before exploring why you might want to switch to it too.

Passwords is a thoroughly modern Mac app that anyone who has used system apps like Reminders will feel comfortable using. That’s a big deal because people will only use an app to manage their passwords if it’s easy to use, which Apple’s Passwords is. It’s worth noting, too, that there is still a Passwords tab in Safari’s Settings, but all it includes is a button that opens the Passwords app – a smart addition for those who were in the habit of looking for their passwords there. The Passwords section of System Settings features a similar button and has been relocated to the General tab under ‘Autofill & Passwords,’ which includes a handful of other password-related settings too.

Passwords greets users with a login screen reminiscent of Safari’s passwords Settings tab.

The app, which must be unlocked with your user password upon launch, has a classic three-pane layout. The left pane gathers top-level password categories, plus any groups you’ve shared passwords with. The second pane is a list view of whatever category or group is selected in the left pane, and the third and final pane shows the details of the selected item. At the top of the middle pane, you’ll also find buttons to add new credentials and sort lists by Title, Website, Date Created, or Date Edited in either ascending or descending order. There’s a prominent search field in the toolbar, too.

I love Passwords’ approachable, simple design.

The six top-level password categories are automatically created by Sequoia:

  • All
  • Passkeys
  • Codes
  • Wi-Fi
  • Security
  • Deleted

What I like most about Passwords is its simplicity; the app has very few settings. I like tweaking app preferences as much as the next person, but not with my password manager. I’m invariably coming from somewhere else when I open Passwords. I want to get in, find what I need, and finish a purchase or log onto a site, not linger.

The beauty of Passwords is that it takes care of organizing things for me. With a single click, I have access to my two-factor codes, Wi-Fi passwords, and passkeys. Then, in the Security section, I can chip away at alerts, looking for vulnerabilities and fixing them. And, if I delete something that I shouldn’t have, I can scan through the Deleted category too. But most of the time, I just run a quick search to pull up what I need and move on.

Shared passwords.

Below Passwords’ categories are Shared Groups. Any Shared Groups you created when passwords were accessible from Safari’s preferences will show up in Passwords automatically when you upgrade to Sequoia. You can add new ones, too. Hover over the Shared Groups section title in the left pane, and a plus button appears for adding a new group. From here, you can also click on the caret icon to hide your Shared Groups. As before, Shared Groups are not limited by things like Family Sharing; for example, I have a Shared Group with my family, but I have another one with Federico for MacStories passwords.

In Passwords’ Settings, you can display accounts as Titles or Websites. There are also checkboxes for showing the Passwords app in the menu bar, detecting compromised passwords, suggesting strong passwords, and automatically creating passkeys.

Passwords' menu bar app suggests relevant passwords for whatever app or website you're using.

Passwords’ menu bar app suggests relevant passwords for whatever app or website you’re using.

If you choose to display Passwords in the menu bar, it will enable you to add new passwords and search for existing ones. Clicking on an entry opens its details view, where you can copy a username or password simply by clicking on it or choose any website to be taken there. My favorite feature of the menu bar app is that it detects the app or website you’re using and suggests any passwords for it at the top of the list. For instance, as I was writing this section in Obsidian, I clicked on Passwords in the menu bar, and it suggested my Obsidian passwords.

As I mentioned at the top of this section, I’ve been migrating passwords away from 1Password for a long time in anticipation of this day. 1Password is a good app, but it’s more than I really need, and I’ve always felt that version 8 makes it harder to find things than it should be.

That said, I do miss a few features from 1Password that I would like to see added to Passwords:

  • file attachments, which can be partially handled by setting up a password-protected note in the Notes app, but that solution isn’t as robust as the one 1Password offers; and
  • a new category for things like government IDs, credit cards, and insurance details.

The lack of a notes field and document attachments has made me hesitant to fully migrate MacStories to Passwords, and not having credit cards in the app is a pain for me personally, but for a 1.0 system app, Passwords is excellent. If there were more to say about Passwords, I would. The team behind it deserves a lot of credit for migrating years of work to a standalone app in a way that’s so simple that there’s very little that I can say about it, except that I highly encourage readers to give Passwords a try.

Maps

Maps is one of those system apps that sometimes flies a little under the radar, but it shouldn’t. The app has evolved over the years to encompass far more than driving directions. That’s an essential core feature that I use more than anything else, but Maps has become as much about exploring the world around you as it is about getting from one point to another.

Maps already allows you to pin locations, save places, and create guides. This year, the app adds new Routes features, and along with them, a new Library section to organize everything.

Maps' Library collects pinned locations, places, guides, and routes all in one place.

Maps’ Library collects pinned locations, places, guides, and routes all in one place.

To access the Library, click on your profile button and select Library. That opens a view with Pinned, Places, Guides, and Routes sections, followed by recently added items. Ever since I moved a couple of years ago, I’ve been saving places I’d like to visit in Maps, so the new Library, which makes it easier to find everything you’ve saved, is a very welcome addition.

The Routes section of the Library is entirely new, and it’s another feature I know I’ll use a lot. If you’ve ever used an app like Footpath, you’ll be familiar with how this new addition works. Like a lot of system features, Routes is simpler than similar third-party apps, but I expect it will get the job done for a lot of people.

Maps’ new Routes creation feature.

At the top of the Routes section of your Library is a ‘Create’ button that activates Maps’ route creation mode. Click to set a starting point, then add additional points along your route. You can click on any point to remove it, reverse your start and end points, create an ‘out and back’ route, or close your loop. As you lay down points, Maps automatically snaps the route to the nearest roads and paths.

Once you’ve finished a route, Maps will show you how long it is, the time it will take to walk it, elevation changes, and a graph of those changes over the course of the route. You can add Notes to a route, too. Maps will also offer directions from your current location to the start of the route, as well as the option to save the route to your Library.
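
As far as I know, Maps’ route builder isn’t exposed to third-party developers, but MapKit’s public MKDirections API computes the same basic data: a route snapped to roads and paths, its distance, and its expected travel time. A quick sketch with hypothetical coordinates (elevation data isn’t part of this API):

```swift
import MapKit

// A sketch of the basics Maps surfaces for a route, computed with
// MapKit's public MKDirections API. The coordinates are hypothetical.
let start = MKMapItem(placemark: MKPlacemark(
    coordinate: CLLocationCoordinate2D(latitude: 35.2271, longitude: -80.8431)))
let finish = MKMapItem(placemark: MKPlacemark(
    coordinate: CLLocationCoordinate2D(latitude: 35.2466, longitude: -80.8170)))

let request = MKDirections.Request()
request.source = start
request.destination = finish
request.transportType = .walking

MKDirections(request: request).calculate { response, _ in
    guard let route = response?.routes.first else { return }
    // Distance in meters and expected walking time in seconds.
    print(route.distance, route.expectedTravelTime)
}
```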

Creating a route.

I wish I’d had this feature two years ago when I first moved to North Carolina. I spent a lot of time picking out walking routes in my area back then, and Routes would have been incredibly useful for that. Still, I plan to use the feature as a handy way to pick a route that matches the time or energy I have when I head out for some exercise.

This leads me to something I hope Apple adds to Routes in the future: integration with the Apple Watch. Planning a route is something walkers, runners, and cyclists do all the time, and it would be great to be able to plan a route in advance and then navigate it with a combination of the Apple Watch and voice prompts via AirPods. I’ll be surprised if Routes doesn’t come to the Apple Watch next year, but for now, the initial implementation is excellent and a nice addition that continues the app’s evolution beyond driving directions.

Maps also includes a set of built-in hiking routes for U.S. national parks that can be filtered by length, elevation, and type. They’re similar to the custom routes you can create anywhere in the app and include the same sort of data. However, although hiking routes are a terrific addition to Maps, I hope they’re expanded beyond national parks and to other countries because, unless you live near a national park with hiking trails, you probably won’t get much out of them.

Maps includes hiking trails in U.S. national parks and topographic data.

If you look at one of the hiking trails added to Maps, you’ll notice something else, too. Topographic data has been layered into Maps in the U.S. and Japan. The topographic contours of the landscape seem to show up a little differently depending on where you are. By default, you won’t see topographic features, but if you search for a park with trailheads, like Yosemite National Park, and click on the Trailheads button in Maps’ search results, the contours of the landscape will appear. They also appear if you click on the marker for the park itself or a trail, but they disappear when the marker or trail is deselected. In other parks, I’ve noticed that topographic features only appear when you select a trail. Either way, it’s worth noting that topographic maps aren’t limited to U.S. national parks. I’ve checked multiple North Carolina state parks, and they work there, too.

Another addition for hikers this year is the option to save hiking trails for offline use. If you’re hiking far from civilization, this will be a great feature to have, but I haven’t had the opportunity to try it yet.

The detail cards for locations have been updated in Maps too.

Apple has also reorganized the detail cards for locations in Maps. There’s a new ‘+’ button next to the ‘More’ button on a location’s card for quickly adding a destination to your Library. Click on the ‘More’ button, and there are new options to add a note to a location or pin it, too.

Maps’ new ‘Search Here’ button.

Finally, if you search for something in Maps and then pan around, search results will automatically update to show the new area. That’s not new. However, now, if you pan Maps’ view just a little bit, instead of updating the search results, the app will display a ‘Search Here’ button at the bottom of the map to update the results for your slightly changed view.


All in all, this year’s update to Maps is great. It continues to expand the app beyond the table stakes of navigation to include more ways to explore the world around you. Maps is an app that by definition will never be finished, but it’s great to see the team behind it not just updating existing data but expanding what’s available too.

All the Rest

The new Sequoia Sunrise wallpaper.

As with any OS update, there are a lot of smaller changes to Sequoia, too. I’ve rounded up my favorites below.

Wallpapers

Examples of two of Sequoia’s new wallpapers.

There are three new wallpapers available in System Settings this year:

  • Macintosh is a Dynamic Wallpaper that cycles through classic Susan Kare Mac iconography.
  • Sequoia is an abstract Dynamic Wallpaper that reminds me of sunlight filtering through water.
  • Sequoia Sunrise is a live wallpaper of a forest of Sequoia trees, and it’s my favorite.

System Settings and More

Every macOS review comes with a bunch of updates scattered throughout the system. It’s impossible to catalog them all, but I’ve picked out the biggest changes, along with smaller system app updates that are notable.

One of my favorite changes is that, in many apps, if you select text and type ⌃ + ⮐, the right-click contextual menu will appear. As far as I can tell, this new keyboard shortcut works in all native text fields but won’t work in a webview. That means you can use it in apps like TextEdit, Things, and Notes, but it won’t work in Safari, Obsidian, Slack, or Discord. Still, it’s a handy shortcut to keep in mind.

HDMI passthrough is an option in multiple system media apps.

Sequoia has also added HDMI passthrough for the home entertainment nerds in the audience; it can be found in the TV app, QuickTime, and Music. That means Dolby Atmos and other formats will be passed directly to soundbars, receivers, and speakers from your Mac. What’s most interesting about this change is that it seems like something that’s been built in anticipation of a more home entertainment-friendly and potentially smaller Mac mini.

AirDrop’s new progress bar.

Other system and System Settings updates include

  • a progress bar that appears when a file has been AirDropped to your Mac;
  • a reorganization of the System Settings sections, which are still a bit of a mess, but better;2
  • a redesigned and dedicated iCloud tab in System Settings that remains accessible from your profile picture too; and
  • a switch from the Appearance tab to the General tab as the default section that is displayed when you open System Settings.

With Sequoia, Apple says it has also refined Game Mode, which shifts system resources to prioritize a game when one is launched. The mode kicks in automatically, and according to Apple, Sequoia improves on the feature with smoother frame rates and better power management.

Apple has ramped up privacy notifications for apps that require screen recording privileges, too. Although the initial approach was to require user confirmation at least once a week, notifications now appear monthly, which is better, but it’s still annoying that alerts can’t be turned off entirely.

Sequoia’s new rotating Wi-Fi network option.

Sequoia extends ‘Private Wi-Fi,’ the feature that sets a unique MAC address for each wireless network to which a device connects. A new ‘Rotating’ option allows your Mac to cycle between multiple MAC addresses on a network over time. To enable it, open the Wi-Fi section of System Settings, click the ‘Details’ button next to the Wi-Fi network your Mac is connected to, and change the ‘Private Wi-Fi address’ option to ‘Rotating.’
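
As an aside, there’s nothing magic about a randomized MAC address: private addresses simply set the “locally administered” bit and clear the multicast bit in the first byte. A toy Swift sketch – my own illustration, not Apple’s code – of what one looks like:

```swift
// Illustration only (not Apple's implementation): a randomized,
// "locally administered" MAC address sets bit 0x02 of the first byte
// and clears the multicast bit 0x01.
func randomPrivateMAC() -> String {
    var bytes = (0..<6).map { _ in UInt8.random(in: 0...255) }
    bytes[0] = (bytes[0] | 0b0000_0010) & 0b1111_1110
    return bytes.map { String(format: "%02x", $0) }.joined(separator: ":")
}

print(randomPrivateMAC())  // e.g. "5a:3f:91:0c:7d:22"
```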

Accessibility

You can read more on the accessibility features coming to macOS Sequoia and Apple’s other platforms in this story that we published in May.

Smaller System App Changes

Freeform

Freeform adds Scenes for presenting canvases that can be printed or converted to PDFs.

The headlining addition to Freeform on the Mac, iPad, and iPhone this year is the Scene navigator. If you have a big, sprawling board, the feature allows you to create incremental views that you can then navigate as you present your creation to others. Scenes can be created and navigated from the floating toolbar, which includes the app’s zoom tool and can be reordered using drag and drop. Scenes can also be printed or saved as PDFs. In addition, Freeform now includes

  • an option to snap board elements to a grid for easy alignment;
  • the ability to pan a canvas by holding down the space bar and dragging your board; and
  • a sharing feature that lets you send a copy of a board via an iCloud link.

Voice Memos

Voice Memos’ transcription occurs in real-time and is very accurate.

If you’re the sort of person who likes to think aloud, the new Live Transcription in Voice Memos, available on the Mac, iPhone, and iPad, will be appealing. The feature, which is similar to voice transcription in Notes, allows you to simply speak into your Mac’s microphone or an external mic connected to your Mac, and as you speak, your words will be transcribed in the app. Voice Memos’ default view remains a visualization of the audio waveform as you speak, but by clicking the transcribe button in the toolbar, you can switch to a view that shows your words as they’re spoken.

In my experience, the transcription is accurate and fast. It’s simple to copy the resulting text into another app, too. When you’re finished, you can listen back in transcription mode, and your words will be highlighted as the recording plays, much like the way lyrics work in Music. The app can even go back and transcribe older recordings you’ve saved, adding a little transcription icon to the recording’s listing.
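
Apple doesn’t say what powers Live Transcription, but its public Speech framework gives developers the same kind of on-device, real-time transcription. Here’s a minimal sketch of that approach – my own illustration, not Voice Memos’ actual implementation:

```swift
import AVFoundation
import Speech

// A minimal on-device live transcription loop using the public Speech
// framework. (A real app must call SFSpeechRecognizer.requestAuthorization
// and hold the microphone entitlement first.)
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!
let request = SFSpeechAudioBufferRecognitionRequest()
request.requiresOnDeviceRecognition = true  // keep audio off the network
request.shouldReportPartialResults = true   // update the text as words arrive

let engine = AVAudioEngine()
let input = engine.inputNode
// Feed microphone buffers into the recognition request.
input.installTap(onBus: 0, bufferSize: 1024, format: input.outputFormat(forBus: 0)) { buffer, _ in
    request.append(buffer)
}
engine.prepare()
try engine.start()

recognizer.recognitionTask(with: request) { result, _ in
    if let result {
        print(result.bestTranscription.formattedString)  // live transcript so far
    }
}
```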

The new transcription feature has a lot of potential for students listening to lectures and anyone conducting an interview. It could also be a good way to create a hands-free rough draft or quickly preserve your thoughts about something for later.

Chess is prettier in Sequoia.

In addition to the foregoing:

  • The Calculator app has added calculation history and unit conversions, along with Math Notes, which I covered in the context of my Smart Script story last month.
  • The Home app has gained guest access, allowing you to grant trusted people access to your home at defined times.
  • Mac App Store downloads no longer require twice the storage of the app’s size at installation. Instead, they only require the space they’ll take up when installed, which seems like the way it should have worked from the beginning.
  • The Podcasts app in Sequoia doesn’t allow you to share timestamped links to episodes like you can on iOS or iPadOS, but links shared to a Mac from those platforms will load to the timestamp as expected.
  • The Chess app has better graphics than before but fewer style options.

Conclusion

So, that’s Sequoia. There’s no getting around the fact that Sequoia is an odd update. It has its highs and lows as any OS release does, but it feels incomplete. In some ways, that’s easily explained by the absence of Apple Intelligence, but not entirely.

Instead, Sequoia feels like visiting a bookstore and finding a book you want to read, only to discover that half the chapters have been torn out. The OS feels unfinished in a way past releases that started down a path of implementing new technologies like Swift and Catalyst didn’t.

Is that because Apple changed course midway on the path to Sequoia and scrapped features to focus on Apple Intelligence? Perhaps. It’s a theory that tracks with the fact that macOS 15.1 was released in beta to developers long before 15.0 was even released to the public, and even when 15.1 is released, its Apple Intelligence features will receive a ‘beta’ label. Those are unusual moves by Apple that arguably acknowledge that the Sequoia you can download today is incomplete and unfinished.

It’s tempting to ignore how Sequoia wound up in an odd spot and focus solely on the merits of what’s in 15.0, but I think that would be a mistake. Apple is known for waiting to announce hardware and software until it’s ready and the company can articulate a clear message about where it fits among its products.

At WWDC, Apple laid out a vision of Apple Intelligence as a personal, private, integrated approach to AI. It’s too early to judge whether Apple Intelligence will succeed at meeting those goals, but what’s different from other announcements that took years to implement is that there’s no trace of Apple Intelligence in macOS 15.0 today. In the past, even when the payoff of a new technology was years away, macOS has always included some sort of first step into the future for users alongside a full menu of standard year-over-year updates.

A good example is Mac Catalyst, which was announced at WWDC with macOS Mojave. Catalyst started as a Sneak Peek that made it clear the technology was more about the future than the present. Also, in contrast to Sequoia, Mojave offered a significantly deeper set of standard OS updates along with the first Catalyst system apps. Apple may have made it clear that Apple Intelligence is being released in stages over time, but it’s also a central part of the Sequoia story.

It’s hard to have it both ways, which is at the core of what makes Sequoia different and has left me with a nagging sense of unease all summer. The update’s shipping features aren’t at fault; some I like, and others I don’t. Instead, it’s my “Is that it?” reaction that concerns me and leads me to wonder what features may have been shelved to focus on Apple Intelligence.

Maybe this will all work out in the end, and we’ll look back at 15.0 as part one of an odd, quick two-part release when 15.1 comes along. I hope so, but those are stories for another day as Apple incrementally releases its AI features.

Regardless of Sequoia’s backstory, though, the result is an update that’s easy to miss unless you go looking for the new features. That may be music to some Mac users’ ears, but to me, it makes Sequoia an underwhelming release.

That’s not to say there aren’t bright spots. iPhone Mirroring is one of those Continuity features that I never knew I wanted, but I’m glad I now have. I don’t use it every day, but it makes my iPhone feel more like a part of my Mac workflow. The addition of iPhone notifications on the Mac has had a similar effect. Being able to manage and clear notifications in one place is simply more efficient.

Likewise, I’m a fan of the new window tiling system in Sequoia. It should have been added a long time ago, but it’s still a great addition. A dedicated Passwords app and Safari’s new video viewer make working with passwords and watching videos substantially better too. Plus, I appreciate the ability to create custom routes in Maps from my desk before heading out for a walk.

Other features, like Safari’s article summaries and the curiously rare tables of contents, aren’t ready for prime time. The late beta period addition of Hide Distractions to Safari strikes me as an ill-advised solution to a problem that’s better solved by Reader mode. It’s hard to imagine many of these Safari features making the cut in a year when macOS had more to offer users.

On balance, though, macOS Sequoia is fine, which is also the source of its biggest problem. Apple has taught us to expect more than fine. Maybe Apple Intelligence will fill Sequoia’s gaps and elevate it beyond fine, but I’m skeptical based on what I’ve seen of macOS 15.1 so far. Of course, there’s a long road ahead before I or anyone else can fairly judge Apple Intelligence. But by putting off those features until later and shipping a 15.0 update that’s light on other features and long on promises of a better tomorrow, Apple has simultaneously failed to tell a compelling story about this release and Apple Intelligence as the future of the Mac. Hopefully, macOS 16.0’s storylines will be more compelling on both fronts.


  1. Whether it's related to window tiling or not, Apple seems to have broken the ability to move the Mac's Dock from one display to another. ↩︎
  2. It's still popular to dump on System Settings, which is understandable because it's not great. But System Preferences isn't coming back, so I think it's time to move on to more important macOS limitations and make friends with System Settings' search field. ↩︎

Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

Learn more here and from our Club FAQs.

Join Now
watchOS 11: The MacStories Review https://www.macstories.net/stories/watchos-11-the-macstories-review/ Tue, 17 Sep 2024 15:01:21 +0000 https://www.macstories.net/?p=76567

After years of steady, iterative updates to watchOS, Apple dropped one of their most significant releases yet last year with watchOS 10. The design language was updated for all of their first-party apps, watch faces were upgraded to take full advantage of the larger screens on current models, and the Smart Stack was introduced to make glanceable information much easier to access. To make way for the Smart Stack, Apple also reassigned the Digital Crown and side button to new functions. These changes, along with the usual updates for health and fitness, made for a release that every Apple Watch user took note of.

The awkward recalibrating of muscle memory aside (I still very occasionally swipe up on my watch face to try and reveal the Control Center), it was an excellent update. My only worry coming out of it was that Apple would dust off their hands, reassign lots of their talent to something else, and go back to the usual, iterative, health- and fitness-focused updates with watchOS 11.

Thankfully, that was far from the case. Not only has Apple made some solid updates to the Apple Watch hardware line this year, but they’ve also enhanced and added to the software in ways that signal they are far from done.

The question is, are these changes going to enhance your daily use of Apple’s most personal device, or are they just, well, changes?

I’m excited to dive into this question in my first watchOS review for MacStories, but before I do, I want to thank Alex for his years of excellent watchOS coverage. I hope I can live up to the standards he set.

Right, let’s do this.

Smart Stack

Last year, when the Smart Stack was introduced alongside hardware button reassignments in watchOS 10, the new interface’s many potential benefits were often lost amongst the various discussions over whether Apple should have changed the button functionality in the first place. And despite its name, daily usage of the Smart Stack didn’t really feel all that “smart.”

Part of me wonders if that was all deliberate. Now, one year out, the latest additions to the Smart Stack can be recognized for what they are: genuinely solid enhancements that actually make the Smart Stack live up to its “smart” name. Let’s see how these new features stack up. (Sorry.)

Live Activities

Live Activities are a great feature on iOS, but their glanceability depends on you having your iPhone on your person. So it feels only natural to bring them to the Apple Watch, a device that is literally attached to your body. Thankfully, this is one of those times Apple has channeled Steve Jobs’ famous phrase and Live Activities “just work.”

Rather than taking an app’s Lock Screen Live Activity from iOS and cramming it into a widget in the Smart Stack, watchOS uses the detail contained in a Live Activity’s Dynamic Island. It then displays this information in a slightly tweaked version of the Dynamic Island that also includes the app’s name.

Live Activities ‘just work’ by taking existing information from the Dynamic Island on the iPhone.

The great thing about this approach is that developers don’t need to write a single line of additional code. They don’t even need to offer an Apple Watch app; watchOS mirrors what’s in the iPhone’s Dynamic Island and shows it in the Smart Stack. During a session, Live Activities will show temporary updates (like a change in your food delivery status) before switching back to the Dynamic Island data. If you are currently using an app on your Apple Watch, the update will appear as a small banner at the bottom of the screen, which you can either tap on to launch the Live Activity or dismiss.

This is a fantastic way of bringing the Smart Stack to the Apple Watch, and watchOS offers more customization, both for users and app developers.

Developers aren’t limited to mirroring the Dynamic Island in their watchOS Live Activities; they can create richer versions more in line with their Lock Screen Live Activities using only a few lines of code. They can also add buttons for actions like pausing audio content. Apple encourages developers to only use one button in Live Activities on the Apple Watch, but that recommendation doesn’t seem to be an enforced rule, so it will be interesting to see what developers do with this new functionality.
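
Based on Apple’s developer materials, the opt-in for the richer Apple Watch presentation is a single modifier on the widget configuration. Everything else in this sketch – the delivery-tracking attributes and layout – is hypothetical:

```swift
import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical Live Activity attributes for a food delivery app.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var minutesRemaining: Int
    }
    var restaurantName: String
}

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            // Lock Screen banner presentation.
            Text("\(context.attributes.restaurantName) · \(context.state.minutesRemaining) min")
        } dynamicIsland: { context in
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text("\(context.state.minutesRemaining) min remaining")
                }
            } compactLeading: {
                Image(systemName: "bag")
            } compactTrailing: {
                Text("\(context.state.minutesRemaining)m")
            } minimal: {
                Text("\(context.state.minutesRemaining)")
            }
        }
        // Opt in to the Apple Watch Smart Stack presentation.
        .supplementalActivityFamilies([.small])
    }
}
```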

From a user perspective, the way Live Activities activate has also been well thought out. Launching a Live Activity on your iPhone will automatically launch the Smart Stack on your Apple Watch with the Live Activity at the top. This includes the Now Playing Live Activity, which appears rather than the full-screen player from previous years.

At first, this might seem annoying. The Smart Stack launches every time I have a Live Activity on my iPhone? Will I ever see my watch face again? Mercifully, in the Watch app on iOS, you can fully customize each app’s Live Activity behavior. You can have it auto-launch the Smart Stack, appear quietly in the Smart Stack for when you choose to bring it up, or not appear at all.

One of the first things I’d recommend every user do after updating to watchOS 11 is go down your list of apps and specify your preferences. Unfortunately, this is where you’ll butt up against a frustrating limitation of this feature: you need to go through every app one by one. If you’re someone who has a lot of apps on your iPhone – so, everyone – this will take some time.

Setting Live Activity preferences for the Apple Watch. It takes a while, so get comfy.

Ideally, there would be a toggle at the top of the app list to turn all Live Activities off; then, you would be able to turn on the subset you want to use and specify how you want them to work.

If you tap on a Live Activity in watchOS, one of two things will happen. If the app is installed on your Apple Watch, it will launch. If the app is not installed, a full-screen alert will appear that includes a button to open the app on your iPhone. Tapping it will automatically open the app on your phone – after you authenticate, of course.

One final thing to note is that these Live Activities don’t work when your Apple Watch isn’t connected to your iPhone. This hasn’t been an issue for me yet; I suspect it rarely will be.

Except for the time-consuming list of settings in the Watch app, the introduction of Live Activities on the Apple Watch is exceptional. The way it all “just works” while offering a high level of customization for each app is very well thought out. I’ve found a lot of use in having Live Activities like my food orders, sports scores, and multiple simultaneous cooking timers easily glanceable on my wrist. I know I won’t be the only one.

Suggested Widgets

For the past year, the Smart Stack was supposed to be suggesting widgets for things like an upcoming meeting or reminders that hadn’t been completed. This was originally touted as a machine learning feature, and while it did periodically work for me, it felt half-baked. In watchOS 11, Apple has doubled down on this capability with suggested widgets.

This feature uses data like time and date, location, sleep schedule, music detection, Activity ring status, rain alerts, and more to suggest widgets in your Smart Stack. “Suggest” may be too gentle a term for what it does, though, as the widgets are literally added to your Smart Stack, sometimes right at the top.

Widget suggestions are, for the most part, useful.

After spending a lot of time last year carefully picking the widgets I liked and in which order I wanted them, I initially found this feature a bit chaotic. I had several pinned widgets in my Smart Stack, and suggested widgets are never allowed to appear above pinned ones. So I was getting several widget suggestions, but they were three or more widgets deep, which meant I never took advantage of them.

To properly test this feature, I unpinned all of my widgets and deleted a couple I didn’t need. That’s when I started to see the real benefit of suggested widgets. There are countless examples of times I looked at my Smart Stack at a particular moment and what appeared at the top was very relevant to me. To list just a few of them:

  • When I leave home, a Home app widget appears, showing me the status of devices in my house and, crucially, if any contact sensors on my windows and doors are open.
  • When I wake up, the Smart Stack always shows me a sleep widget displaying my sleep data from the night before and a weather widget with the current conditions.
  • If I have a meeting coming up soon, a Calendar widget with that event will be visible.
  • After a workout, the Smart Stack shows my Activity rings at the top of the stack.

Another convenient widget suggestion is meant to occur when the Apple Watch detects music playing prominently in your environment. In response, it will supposedly add the Music Recognition (also known as Shazam) widget to the Smart Stack. (You will need to give your permission to enable automatic music detection.)

The Music detection widget should automatically appear when music is playing.

I say “supposedly” because I can’t for the life of me get this to work. I’ve tried playing music on my MacBook, putting on vinyl records, and even checking it while at a gig – all to no avail. Here’s hoping it works better for others.

When the Smart Stack makes widget suggestions, they’re not always perfect. For instance, it occasionally adds a Home widget with my ‘Goodnight’ scene at around 7:00 pm, way before I should be in bed. Those occasional errors notwithstanding, widget suggestions work very well.

As with Live Activities, suggested widgets are enhanced by the ability to customize how each widget might be suggested. You can turn off suggestions for an app completely or specify the types of widgets the Smart Stack can suggest for a particular app. To give you an example, in the Home app’s settings, you can individually define whether to allow it to suggest accessories, scenes, controls, electricity usage, and electricity rates.

Refining your preferences for widget suggestions is even more time-consuming and fiddly than it is for Live Activities.

Unfortunately, as with Live Activities, there is no way to toggle all apps on or off (though you can turn the entire feature off with a toggle). To make matters worse, the controls are only available in the Settings app on the Apple Watch itself, not in the Watch app on iOS. This makes for a very fiddly and time-consuming experience when refining your setup.

Nevertheless, suggested widgets are another impressive addition to the Smart Stack. The level of granular control is particularly welcome. Apple could have just added this feature and let it run wild, but the company is allowing users to drill down into each app’s settings to precisely control suggestions, which is to be commended. After getting over my initial confusion about the feature, I’ve found it to add welcome contextualized information to my daily life.

Interactive Widgets

The enhancement of the Smart Stack continues with the new ability for developers to add interactivity to their widgets. We got a small glimpse of how this works last year with the Timer and Now Playing widgets. They each featured a simple, circular button on one side to start and stop a timer or playback, with a countdown or the currently playing song on the other. If you’ve seen those, you know the basics of how interactive widgets work in watchOS 11.

Interactive widgets are maybe too simple.

The most prominent example of interactive widgets in watchOS 11 is a Home widget that toggles an accessory on or off with a tap of its circular button. This implementation is okay, but it’s too constrained. If you want to control a light, you can turn it on or off, but you can’t specify the tint or color. Even tapping on the widget’s text performs the same action as pressing the button, which almost defeats the point of the layout. To adjust the tint or color of a light, you have to open up the Home app as before and access the setting that way.

A Home app widget to control a scene.

This is a needless way of doing things. Tapping on the text should launch a details panel for the light containing these settings, just like in the Home app on iOS. Additionally, the widget featuring three recent workouts, which launches the selected workout when you tap on it, would make an excellent template for a Home widget layout. It would allow you to add your three favorite scenes or accessories and quickly toggle them on or off.

The new workout widget shows your three most recent workout types.

The term “interactive” is doing a lot of work here: you can technically interact with the widgets, but they really just feature a single button that starts or stops something. As with Live Activities, Apple suggests using only one button per widget in their developer videos. However, I hope we’ll see developers ignore that and create three-button group widgets that take advantage of this new control system. And since widgets can also be dynamically updated, developers could take this even further and make their widgets’ contents change based on user interactions.
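
For context, widget interactivity is built on App Intents: tapping a widget button runs an intent in the background without launching the app. A minimal sketch of the one-button pattern – the lamp intent here is hypothetical:

```swift
import AppIntents
import SwiftUI

// Hypothetical App Intent that toggles a smart lamp.
struct ToggleLampIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Lamp"

    func perform() async throws -> some IntentResult {
        // Toggle the accessory via your app's model layer here.
        return .result()
    }
}

// A watchOS widget view with a single interactive button,
// following Apple's one-button guidance.
struct LampWidgetView: View {
    var isOn: Bool

    var body: some View {
        HStack {
            Text(isOn ? "Lamp On" : "Lamp Off")
            Spacer()
            Button(intent: ToggleLampIntent()) {
                Image(systemName: "power")
            }
        }
    }
}
```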

Smart Stack Miscellany

The clock at the top of the Smart Stack now appears in the same style and format as your current watch face. It’s a nice touch, but I still believe it wastes space. It would be more efficient to immediately show two widgets when activating the Smart Stack instead of one or allowing room for complications. After all, when you scroll a bit further to show more widgets, the date and time are still displayed at the very top of the screen, so you’re never missing out on that information.

The clock in your Smart Stack changes style to match the watch face.

There are several new system widgets in watchOS 11. The severe weather alert is excellent, as is the one for upcoming rain. I’m not so keen on the Photos widget because it’s so tiny, but it’s there should you want it. There are also widgets for the new Vitals and Translate apps as well as Training Load – but more on them later.

Health and Activity

We all know the story of how Apple changed course with the Apple Watch in the initial years after its launch, focusing more on health and fitness rather than fashion and “personal connections.” Sorry, Digital Touch, you were ahead of your time. Or behind. I can never be sure.

This was once a banner feature of watchOS. Really.

Apple’s commitment to health and fitness with this product has meant yearly enhancements in that area. You needn’t look any further than the number of workout types the Apple Watch now supports for proof of that.

Alongside notifications, my primary use for the Apple Watch is tracking my health; anything new in that area will always catch my eye. While last year’s focus was more on OS design and interactions, watchOS 11 turns its attention back to significant new health and fitness features.

To be fair, there isn’t anything here that hasn’t been seen before in third-party apps or other fitness devices, but Apple has approached them all in a unique way, leaning on the company’s years of experience and even some of the medical studies it has helped facilitate.

First up is a new app that, like many watchOS apps before it, brings together and expands on several capabilities the Apple Watch already has.

Vitals

Apple’s slow and steady march forward with sleep tracking continues in watchOS 11. Sleep tracking was added four years ago in watchOS 7 with only basic sleep/wake tracking and the ability to schedule a wind down as part of your bedtime routine. Back then, with the Series 6, tracking your sleep was a little challenging due to battery limitations – namely, the time it took to charge the device. Getting enough charge before going to sleep meant you had to carefully plan when to take your watch off for a significant period of time. I remember hearing tales of people buying two watches just so they could track their sleep with one while the other charged.

A year later, the Series 7 was released with fast charging, allowing the battery to charge up to 80% in 45 minutes, and watchOS 8 added respiratory rate tracking during sleep. Now, with fast charging on the Series 10 down to 30 minutes for 80% battery and the addition of sleep apnea detection, there are more compelling reasons than ever to wear your watch to bed. watchOS 11 adds another big one: the new Vitals app.

However, before we get to that, we should note that Apple is making it easier to start tracking your sleep in the first place. In watchOS 10, if you wanted to track your sleep, you had to enable the Sleep Focus on your Apple Watch. Forget to do so and, well, your watch wouldn’t record any metrics (despite some third-party apps being perfectly capable of doing so). Starting with watchOS 11, you don’t need to trigger sleep tracking with a Focus change. Your Apple Watch will leverage its built-in motion sensors and sleep detection algorithms to identify when you fall asleep, allowing you to track your sleep – including naps throughout the day – automatically.

Sleep data in its various forms.

I had the terrible task of testing this out by taking several daytime naps this summer (“It’s for my review!” I told my justifiably skeptical wife) and am happy to report that it worked perfectly. I slept with an Apple Watch on each wrist (which, yes, did look ridiculous); one was set to the Sleep Focus, and the other was not. Both watches tracked my sleep within a minute or two of each other every time.

I suspect I’ll continue sleep tracking via the Sleep Focus technique, as I like to have my watch’s screen locked and turned off on a schedule. Still, it’s good to know that, should I pass out on the sofa on a Sunday afternoon, my Apple Watch will be able to tell me how much of the afternoon I lost to the land of nod.

There is a downside to automatic sleep detection, and it’s a strange one: when it detects sleep automatically, my Apple Watch never records my wrist temperature or blood oxygen. I have no idea why, but it’s not great, considering the new Vitals app relies heavily on those pieces of data.

The new Vitals app is a nice overview of key metrics recorded while you slept.

The Vitals app itself is pretty simple to understand: wear your watch to bed, and it will track your average values for heart rate, respiratory rate, wrist temperature, blood oxygen, and sleep duration. After the first week of use, the Vitals app will establish your typical ranges for each of these metrics. If two or more of those figures ever fall outside your typical range, your Apple Watch will alert you and suggest possible reasons for this discrepancy.
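
Apple hasn’t published how it defines a “typical range,” but the behavior it describes maps onto a simple outlier check. A toy sketch, assuming a range of mean ± 2 standard deviations:

```swift
// Toy illustration (not Apple's actual algorithm): establish a typical
// range per metric from past samples, then alert when two or more
// metrics fall outside their ranges on the same night.
struct MetricHistory {
    var samples: [Double]  // assumed non-empty

    // "Typical range" here is mean ± 2 standard deviations.
    var typicalRange: ClosedRange<Double> {
        let mean = samples.reduce(0, +) / Double(samples.count)
        let variance = samples.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / Double(samples.count)
        let sd = variance.squareRoot()
        return (mean - 2 * sd)...(mean + 2 * sd)
    }
}

func shouldAlert(tonight: [String: Double], history: [String: MetricHistory]) -> Bool {
    let outliers = tonight.filter { name, value in
        guard let range = history[name]?.typicalRange else { return false }
        return !range.contains(value)
    }
    return outliers.count >= 2  // Vitals alerts on two or more outliers
}
```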

If you use auto-detection for sleep tracking, you lose two metrics. If you have a recent Apple Watch in the U.S. that doesn’t support blood oxygen measurement, that column is removed completely.

The interesting thing about this new app is that it doesn’t actually measure anything new. All the metrics mentioned above are already available in the Health app on your iPhone. What Apple is doing here is taking the data and putting it together in a way that’s easier to review. In the Health app on iOS, the figures are just dots on a graph; unless you’re an expert, they don’t mean a lot. Showing if a metric is outside your typical range is helpful, and along with an alternative seven-day view of each metric, you can get an idea of what might be affecting the quality of your sleep.

Each metric has a small ‘i’ button that, when tapped, explains its importance and factors that can affect it. Apple claims that the notification algorithm that alerts you of outliers in your vitals is informed by data from the Apple Heart & Movement Study and input from clinical experts.

The info about each metric is very useful and informative.

There’s more to this app than initially meets the eye, with some nice touches but also a couple of glaring absences. Let’s focus on the good stuff first. The information you’re given about each metric and its importance is very useful, and if you’ve never paid attention to these vitals as part of your sleep, it will educate you on what makes a good night’s sleep. Plus, the couple of times my wrist temperature has been shown to be out of range when I wake up, I’ve felt a bit rough later that day, which was interesting to see. One time, I even canceled an evening gym visit based on an alert from the Vitals app, which proved to be the right choice when I started to feel run down later in the afternoon.

Vitals in the Health app.

Still, there are some missteps here that Apple would do well to address. First, healthy sleep is about consistency, and an excellent way to measure that is through the concept of sleep debt. As an example, if you normally sleep 7.5 hours every night except one when you only sleep 5, you’ll build up sleep debt that has to be repaid. Sleep debt would be a much better vital to track in this new app than the somewhat simplified sleep duration.
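
Under that common definition, sleep debt is just the accumulated shortfall against your nightly sleep need. A quick sketch of the example above:

```swift
// Sleep debt under its common definition: the accumulated shortfall
// against your nightly sleep need (shortfalls only; surpluses don't bank).
let need = 7.5
let week = [7.5, 7.5, 5.0, 7.5, 7.5, 7.5, 7.5]  // hours slept each night
let debt = week.map { max(need - $0, 0) }.reduce(0, +)
print(debt)  // 2.5 hours to make up
```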

Second, one common measurement provided by health devices is Heart Rate Variability (HRV). HRV is based on the amount of time between a person’s heartbeats and how much it varies. A high HRV suggests your body is well rested, you had a good night’s sleep, and you’re primed for a big day or a challenging workout.

This is an incredibly useful metric, and I find it strange that Apple hasn’t included it in the Vitals app, especially since HRV data is already available in the Health app. The only reason I can imagine for its absence is that it sounds like a complicated metric, but if you look anywhere in the Health app on iOS, there are tons of complicated-sounding metrics.
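
For what it’s worth, HRV (as SDNN) is already exposed to developers through HealthKit – presumably what apps like Athlytic read. A minimal sketch of fetching recent readings:

```swift
import HealthKit

// A minimal sketch of reading recent HRV (SDNN) samples from HealthKit,
// the same data the Health app surfaces.
let store = HKHealthStore()
let hrvType = HKQuantityType(.heartRateVariabilitySDNN)

func fetchRecentHRV() async throws -> [Double] {
    try await store.requestAuthorization(toShare: [], read: [hrvType])
    let descriptor = HKSampleQueryDescriptor(
        predicates: [.quantitySample(type: hrvType)],
        sortDescriptors: [SortDescriptor(\.endDate, order: .reverse)],
        limit: 7
    )
    let samples = try await descriptor.result(for: store)
    // SDNN is reported in milliseconds.
    return samples.map { $0.quantity.doubleValue(for: .secondUnit(with: .milli)) }
}
```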

Overall, the Vitals app is a well-intentioned and, to some extent, useful app. There’s certainly room for improvement to make it a truly powerful tool, but if it follows the trajectory of sleep features on the Apple Watch, I suspect watchOS 12 will see an expanded version. If I were a betting man, I’d say that sleep debt tracking will be next on the list.

Training Load

Training load is a measure, common amongst athletes, of the impact of exercise on a person’s body over a set period (typically seven days). It’s calculated from the intensity and length of a workout along with the frequency of other workouts. Comparing this figure to the last 28 days allows you to be better informed about how to train. If your training load score is well below your average for the past 28 days, you know you need to pick up the pace and push your body a bit more to see results. If it is well above, you might want to dial it back a bit so your body has time to recover and you don’t get injured.
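
Apple doesn’t publish its formula, but the classic “session RPE” version of training load is simple arithmetic: effort × duration summed over the week, compared to your 28-day baseline. A toy sketch:

```swift
// Toy session-RPE training load (not Apple's published method):
// load = effort rating × minutes, summed over a window.
struct Workout {
    var effort: Double   // 1–10 rating
    var minutes: Double
    var daysAgo: Int
}

// Assumes at least one workout in the last 28 days.
func trainingLoadRatio(_ workouts: [Workout]) -> Double {
    let load = { (w: Workout) in w.effort * w.minutes }
    let acute = workouts.filter { $0.daysAgo < 7 }.map(load).reduce(0, +)
    let chronic = workouts.filter { $0.daysAgo < 28 }.map(load).reduce(0, +) / 4
    return acute / chronic  // > 1 means you're training above your typical week
}
```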

Considering the fact that the Apple Watch has always been firmly aimed at the general consumer, especially when it comes to health and fitness, it’s a big move for Apple to step into this field. Proper training load can be a complex measurement, so in addition to using it in watchOS 11, I had to do a fair amount of research into the subject to fully understand it and how Apple’s approach is unique.

Unsurprisingly, this being a feature from Apple, it’s straightforward to use your workouts to inform training load and log the necessary data. Quite simply, after a workout, all you have to do is rate the effort you feel you put in on a scale of one to ten. Those options are also split into four categories: Easy, Medium, Hard, and All Out. You can enter your rating any time after a workout, and you can also change it later, should you wish. After a cardio workout, the Workout app will even suggest a rating for you based on age, height, weight, GPS, heart rate, and elevation.

Giving your workout an Effort rating straight afterwards or (right) later on.

This effort rating goes toward your training load score for that day, but that’s where things start to get a little murky.

Your effort rating is entirely based on your input (unless you leave the cardio ratings as suggested), so the question has to be asked: is that reliable? There is evidence to suggest people generally get these ratings right. Still, it strikes me as a potentially unreliable approach when other devices from brands like Garmin use metrics to give you a decisive, locked figure.

I’ve found the automatic ratings generated from cardio workouts to be okay, if not a little low compared to how I felt the workout went. However, non-cardio workouts are entirely reliant on my rating. Plus, I need to remember to enter the rating each time, which – I’ll be perfectly honest – doesn’t always happen.

The Training Load feature is a little hidden (top right in the Activity app). You can also refine it by workout type (right).

I have other questions about Apple’s handling of this feature. For instance, there is no way to see a correlation between your workouts, effort ratings, and training load. The system also offers no advice concerning what to do about your training load, whereas third-party apps (such as Athlytic) will tell you whether you’re primed for a big workout or in need of a rest day.

As another example, if a user keeps pushing their workouts to get a higher percentage above their average 28-day training load, taking their training load higher and higher, would they know they need to ease up a bit? If you don’t know what these numbers truly mean, they can be confusing and, in the worst-case scenario, dangerous.

You can also view your training load in the Fitness app on iOS (with your figures from the Vitals app shown just below for comparison), where there is advice on what your percentage means. But is everyone going to look there? When I opened the Fitness app for the first time in iOS 18, the training load card wasn’t even present; I had to add it. I’d like to see Apple better present this potentially great feature and what it means to users.

Training Load in the Fitness app shows more detail and actual advice on what to do with your rating. The Apple Watch app does not.

That said, I have been using training load a lot, and after reading up about what it actually is, I’ve found it to be a good motivator for pushing myself rather than just closing my Activity rings every day. It’s also good that I can go back and adjust my effort rating for a workout later, something other devices don’t allow. This was particularly useful when I went on a long hike for a couple of hours that my Apple Watch rated as a 2. You know what? I felt pretty tired after all that walking, so I was glad to be able to up it to a 6.

This is only the first iteration of training load on the Apple Watch, so I’d like to see a couple of things change next year in watchOS 12. First, your training load figure relies entirely on your effort rating being input after every workout, which you are prompted to do when you end the workout from your Apple Watch. However, if it ends automatically – like with a Fitness+ workout, or if you use Siri and don’t look at the screen – it’s easy to forget. If you don’t enter a rating, your training load isn’t accurate. While you can go back and scan through the past few workouts to enter a rating, it would be good to have a more reliable reminder to rate a workout after completion, like a notification that stays in Notification Center on your iPhone or Apple Watch.

HRV in the Health app (left), in Athlytic (middle), and the recovery rating in Athlytic (right).

Secondly, Heart Rate Variability (HRV) is typically used to show readiness for training in the form of a ‘recovery’ percentage. As mentioned in the Vitals section earlier, your Apple Watch already takes HRV readings, and it would be good to combine this information with your training load to give solid advice to users about what sort of workouts would be suitable for their bodies that day.

Training Load is a fascinating addition to the Apple Watch. For the most part, I think it’s a good feature – if you understand it. However, it feels like Apple is holding back a bit with what they could really do here.

That said, what is really fascinating about training load is that it is the first truly “pro” training feature to be added to the Apple Watch. Sure, it’s watered down a bit, but this is a change from Apple’s “close your rings” approach of the past.

So why now? My theory is that the company seems to be having a hard time adding new sensors for things like blood glucose to the Apple Watch, which means it has to focus on adding more value in other places. Extending the capabilities of the Apple Watch into pro-athlete territory does just that, and I hope we see more pushes into this area.

Activity Rings

Let me posit a scenario: someone gets an Apple Watch as part of an effort to get fitter and maybe lose some weight. They’ve heard all about how the Apple Watch can help you get more active by “closing your rings”, so they’ve taken the plunge and gone all in. On the days that they are exercising and closing their rings, they feel great, but not having regularly exercised before, it’s a huge physical and mental challenge to do this every day. Meanwhile, on days they struggle to motivate themselves to work out – or even when they just want to have a day off – their Apple Watch keeps saying things like, “There’s still time to close your rings!” and “Find some time to be active today.”

I’ve created this hypothetical person to help explain how significant the new ability to pause your rings in watchOS 11 truly is. I’ll confess that if I’m having a rest day and don’t intend to close my rings, it doesn’t bother me. But for many people just starting out on their journey with getting fit, being constantly nudged to work out can have a negative effect mentally. Being able to essentially “turn off” your rings for the day is huge for those people and many others.

Let’s be clear: Apple doesn’t deserve massive praise for this. It’s something that we should have been able to do years ago. Nevertheless, I’m delighted it’s here, and of course, it’s very well implemented.

Pausing your Activity rings is very easy.

To pause your rings, tap on your weekly summary in the top-left corner of the Activity app. Then, scroll right to the bottom, and you’ll see two buttons: ‘Change Goals’ (more on that later) and ‘Pause Rings.’ Select ‘Pause Rings,’ and you’ll be presented with four options for pause durations: ‘For Today,’ ‘Until Tuesday’ (other people seem to get ‘Until Monday,’ so it’s a bit unclear what determines the day), until the next month, and ‘Custom.’ The ‘Custom’ option allows you to set the length of your pause in days. Once you’ve made your selection, your Activity rings will turn grey, and all alerts will be disabled. The night before your rings are due to resume, you will get an alert asking if you’re “ready to get back to it,” with the option to extend your pause should you wish to.

The prompt you will receive the night before your Activity rings restart.

It’s a simple option that should have been here before, and now it is. Thankfully, that’s not the full extent of your power over your Activity rings. watchOS 11 also allows you to set a daily schedule for all three rings.

Scrolling down to the bottom of your weekly summary screen and tapping on the ‘Change Goals’ button will bring up your usual daily goal, with the Move ring showing first. If you tap the calendar icon in the top right, you can switch to a weekly schedule with different daily targets. This is a much more elegant way to refine your rings’ targets than pausing them altogether. I don’t really get a chance to work out on weekends, so it was great to adjust my ring targets to more reasonable levels for those days. You can also do this in the Fitness app on the iPhone.

Setting a schedule for your Activity rings.

There is one final, relatively hidden change to your Activity rings. If you scroll to one of the rings in the Activity app, you can adjust its target just for that day, and it will snap back to its usual target the next day. Our lives and schedules aren’t always consistent, so this is a nice touch.

The only thing I’d like to see added to this feature is the ability to keep my stand reminders for the day even if I pause my move and exercise rings. Remembering to stand regularly is an impactful way to help reduce cardiovascular problems, improve circulation, and more. I love the stand reminders, as my days can sometimes pass by so quickly, and I don’t want to disable them.

Regardless, the other changes here add up to some good enhancements to one of the Apple Watch’s central elements.

Fitness App

The Fitness app on the iPhone last underwent a significant change with the introduction of Apple’s Fitness+ service in December 2020, adding a tab that contains an ever-increasing library of workouts. (The app wasn’t available at all on the iPad until the launch of Fitness+.) I’m a very regular user of the service, using it almost every weekday for cycle, core, and strength workouts. Suffice to say, I’m a big fan.

Apple seems to be, too, since the company adds new workout types, trainers, themed workouts, and more on a very consistent schedule. These service additions aren’t tied to software updates, but iOS 18 and iPadOS 18 do include some improvements to navigating the plethora of workouts. The main change is the addition of a navigation bar along the top of the Fitness+ tab that features four sections: For You, Explore, Library, and search.

For You in Fitness+.

The For You section contains recommendations for workouts you might like based on what you usually do. If you’re new to Fitness+, the app will prompt you to pick activities you might be interested in to help populate this section. Since I stick to cycling, core, and strength workouts, that’s what the app suggests for me. What I like about these suggestions is that they are all workouts I haven’t done before. Too often, I have to search through the Fitness+ library to find a new workout; this speeds the process up. Your favorite trainers are also shown here, as well as your workout history.

The Explore tab in Fitness+.

In the Explore section, you can find new workouts, suggested sections and collections like ‘Get Back to a Routine’ or ‘Yoga for Every Runner,’ and beginner programs to get you started. I don’t really use this because I stick to a similar routine every week, but if you’re new to Fitness+ or have trouble picking workouts, this will be very helpful.

The Library tab is a great overview of what you have already done, including custom plans and stacks.

Next is the Library, where you can browse all the workouts, meditations, programs, custom plans, and stacks you have saved. When all Fitness+ content was clumped together in a single tab, your Library was tough to find and navigate because it was much further down the screen and not very well categorized. This redesign makes the Library much easier to access and use.

It’s far easier to search. Kim’s a Hackney girl, so it’s always good to hear a local voice.

Finally, there’s search. Under the search bar, this tab includes cards representing different activity types offered in the app as well as a Discover section of collections like ‘Run Your First 5K.’ Tapping on the search bar allows you to search for workouts, trainers, music, and more. Yes, that’s right: workout music is available as searchable metadata in-app. Now, I can look for a workout that includes some AC/DC to get me fired up.

This new four-section organization scheme is most welcome and has helped me find new workouts much more easily than before.

One thing I’d like to see added to Fitness+ is support for custom interval workouts displayed on your watch. Sometimes, getting your iPhone out and doing a full video workout is impossible; maybe you’re at a busy gym, or you don’t have your iPhone nearby. It would be great if you could download a workout with different intervals to your Apple Watch so you could do it anywhere. The Up Next workout view, new in watchOS 11, would work very well with this feature, showing which exercise is next while you perform your current one.

Still, these are all very helpful quality-of-life changes that make Apple’s fitness service easier to use, which I’m really happy to see. While Fitness+ is not perfect, it’s constantly evolving and contains a lot of great workouts and trainers who make keeping active approachable.

As if the layout changes for Fitness+ weren’t enough, the rest of the Fitness app is also getting a significant shakeup this year in the form of customization. Scroll down to the bottom of the Summary tab, and you will see two new buttons: ‘Edit Summary’ and ‘See All Categories.’ Tapping the former will send all the cards in the Summary into “jiggle mode.” From there, you can rearrange their order – except your Activity rings, which stay locked to the top – and add in additional cards like Training Load, trends, friends, Fitness+ recommendations, and more.

Editing the summary in the Fitness app. Jiggle mode included.

As I mentioned before, I primarily do cycling, core, and strength workouts, so I appreciate the ability to add a card for each of those workout types for a quick check-in on my last session. Other cards, like Training Load, are nice to have in the mix, too, but I wish I could shrink or even remove the Activity Rings card. I always have my rings visible on my Apple Watch, and the card takes up almost a third of the screen that could be used for another category summary.

After organizing your perfect layout, you can tap the ‘See All Categories’ button to go to any category, including those you haven’t added to your main summary, and check in on its data.

Like the Fitness+ changes, this new layer of customization makes a meaningful difference to how I use the Fitness app. We will likely never know how successful Fitness+ is in raw numbers, but someone high up at Apple (maybe even early morning workout boy Tim Cook) really likes both the service and the app, and I enjoy seeing them grow. Thanks to the hard work of everyone involved, from the trainers in the videos to the software developers behind the scenes, the Fitness app remains one of Apple’s low-key success stories.

Pregnancy Tracking

Well, I have a confession to make: I haven’t been able to try any of the new pregnancy features in watchOS 11. I really wanted to, but let’s just say I wasn’t up to the task. Nevertheless, having been through a slightly problematic pregnancy with our daughter (she’s fine now), I welcome any tangible help with things like high heart rates and mental health during and after pregnancy.

If you already use the cycle tracking feature built into the Health app, you can now add a pregnancy by heading to the pregnancy section and selecting ‘Add a Pregnancy.’ You will then be taken through a setup process, and once you’re finished, the cycle tracking section will be changed to show your gestational age down to the day.

From there, you can add various data points, such as unexpected bleeding and even appetite changes. As usual, Apple never tries to offer medical advice, so when bleeding occurs, the app will prompt you to reach out to a healthcare provider.

Heart rates often increase during pregnancy, so if your high heart rate threshold is lower than 120 bpm, the Health app will notify you to review it. Furthermore, you will be alerted if your walking steadiness decreases during pregnancy.

Pregnancy tracking on iOS and watchOS. Source: Apple.

You also have the option to be reminded more frequently to take a mental health assessment both during and after pregnancy. I think this is an excellent idea. Especially after birth, it’s so difficult to find time to step back and consider your changes in mental health, so being prompted to take stock of how you feel is essential in managing any changes that might come up.

Lastly, you will be able to see the time you were pregnant across all charts within the Health app, and your Medical ID will display your pregnancy along with your due date.

There are many great apps available to help you manage or log your pregnancy, but having essential tools like walking steadiness tracking and heart rate notifications built into your Apple Watch is Apple at its pragmatic best.

More Health and Activity Additions

I’m not sure how many different types of sporting activities there are in the world, but surely Apple must be close to adding all of them as workout types on the Apple Watch. I say this because this is the second year in a row the company hasn’t added any new items to its infinitely scrollable list in the Workout app. What it has done is make notable expansions to its existing range of workout types, particularly the outdoor kind.

Route maps have been added to the following workout types: cross-country skiing, downhill skiing, snowboarding, outdoor rowing, and even golf. You can now display distance as an on-screen metric in outdoor hockey, lacrosse, disc sports, outdoor skating, American football, Australian football, rugby, soccer, paddle sports, cross-country skiing, downhill skiing, snowboarding, outdoor rowing, and – again – golf.

Custom workouts – a feature of the Workout app that I feel people should be talking about more – now support interval-based swimming workouts, using haptics to signal when to switch segments. As I mentioned in the Fitness app section, each time you start a custom workout, you will see a new view on the workout screen showing your next interval.

These are all small additions when considered individually, but stepping back and looking at watchOS updates over the last few years, it’s clear that the enhancements to many different sports and activities have been quite astounding. I suspect Apple is far from finished.

OS and Apps

While the Apple Watch dominates its field, it often lags behind its very attached sibling – the iPhone – in gaining the latest apps and features. This year, several additions are coming to watchOS not long after debuting on iOS. Heck, some even arrive this year in tandem with their release on the iPhone.

It’s good to see, and it means there’s plenty to try out when you upgrade to watchOS 11. Let’s take a look.

Watch Faces

There are two new watch faces coming this year. As part of the Apple Watch Series 10 unveiling, Apple announced the Flux and Reflections watch faces. Flux features what the company calls a “bold graphic design” with numbers that take up the whole screen. As the seconds tick by, color fills the screen from the bottom up. It’s nice enough, but maybe on the lower end of the inventive scale.

The new Flux face. Source: Apple.

Then there’s Reflections, which features a simple hand design in front of a metallic dial. The gimmick here is that the dial changes and shimmers to imitate real metal as you turn your wrist. This is a clever effect and the first of its kind for the Apple Watch. I’d love to see these sorts of subtle reactions come to more watch faces in the future.

The new Reflections face. Source: Apple.

A few years ago, Apple proudly stated that the Photos face is their most popular watch face. In watchOS 8, the company introduced the Portraits face to take advantage of depth data within your photos, making the subject interact with the digital time. Well, the Portraits watch face is no more, as those features have now been merged into the Photos face, which is also seeing some upgrades.

When you first edit an existing Portraits or Photos watch face, you will be prompted to upgrade to the new Photos face in the Watch app on your iPhone. This is where the creation and editing tools for this watch face now reside. You can keep an existing Portraits watch face intact, which you may wish to do since there are only three very condensed fonts available in the new Photos watch face.

Try to edit your Photos face and this is what you’ll get.

You can add up to two text complications to the Photos face and choose whether you want to use just one photo or have your Apple Watch shuffle through several that you specify, just like before.

Creating a new Photos face in the Watch app on iOS.

Where this watch face gets its supposed upgrade is in how it displays your selection of photos. You can choose your preferred font and a style (basically, a tint) to be applied, but if you set the time size to ‘Dynamic,’ the position and size of the digital time will change to complement the subject’s position in the current photo. Apple uses machine learning to do this and claims that if you choose a category of photos to shuffle through, such as ‘People,’ it will select the best pictures based on facial expressions, aesthetics, and composition.

The new Photos watch face gets very mixed results.

When I tried this watch face, it occasionally chose a picture that didn’t fit in – like a picture of me instead of my wife and daughter – or the subject appeared behind the time. It also seems to struggle with landscapes, especially when a tint is applied. But with faces, it works fine, I guess.

Look, I don’t wish to sound down on these watch faces because they are nice. It’s just hard not to feel a bit disappointed with what feels like such a minor update to one popular watch face, along with two subtle new ones. Really, what I want is third-party watch faces, or at least some daring new designs to choose from. Remember last year when we got that incredible Snoopy watch face?

Say goodbye to the Numerals, Siri, and Chronograph watch faces.

There are actually fewer watch faces in watchOS 11 than before. In addition to Portraits and Photos combining into one watch face, the Siri watch face is gone, as are Explorer, Numerals, and Chronograph (though Chronograph Pro still exists).

For once, it’s all a bit quiet on the watch face front, which is frankly a bit disheartening.

Check In

Check In was introduced last year in iOS 17. I’ve used it quite a few times when cycling home from work as peace of mind for my wife. If I’m not home by the expected time, she will receive a notification to check in with me, along with my location, battery level, and more.

Check In in the Messages app.

watchOS 11 brings the feature to the Messages app for watchOS, the same place it appears on iOS. More importantly, though, Check In has also been added to workouts.

When I start an outdoor cycling workout on my Apple Watch as I head home, I can swipe right on the workout screen to show the controls, then scroll down to tap the Check In button. From there, I can Check In with my wife after a certain amount of time, when I arrive at a specific location, or when my workout ends.

Check In when doing a workout.

The Apple Watch even takes Check In to the next level by adding the ability to let my wife know if my speed dramatically increases or my heart rate drops to an unusually low level. Thankfully, I have not needed these features yet, but I can foresee many people using Check In on a cellular Apple Watch, leaving their iPhone at home, and heading out for a late-night run.

The ability to start a Check In when you’re not connected to your iPhone but have a Wi-Fi or cellular connection is also welcome. Lastly, you can have your Apple Watch remind you to send a Check In to someone every time you start an outdoor workout. This is perfect for when I forget to do so manually, which is about 50% of the time.

Check In is a great feature, but I would still like to see the option to send your live location along with your Check In. My family can, of course, use the Find My app for this, but not everyone wants to share their location with their friends and family all the time, so an option to do so temporarily within Check In would be a good addition.

Hiking Routes

Apple loves talking about – and even raising money for – the U.S. national parks, so much so that they run an annual national park-themed activity challenge with special awards to earn. This year, they brought their love of national parks to a new Hiking Routes feature in the Maps app on iOS. You can create your own custom routes, too, but the thousands of built-in hikes you can browse through are all located in U.S. national parks.

Browsing, searching, and viewing a hiking route on iOS.

To be clear, hiking trails and trailheads aren’t new this year, despite what Apple implies. They’ve just been given more prominence and curation.

When viewing a U.S. national park in the Maps app on iOS, you can find hikes by bringing up the search bar and typing “Hikes” or by tapping on ‘Hikes’ under the ‘Find Nearby’ section. (This option disappears when looking at locations outside the U.S.) You can also find hikes by tapping on a trailhead on the map and seeing which hikes it is a part of. When you find a hiking route you like, you can then add it to your library and also get directions to its start.

Once you add a route to your library, that’s where the Apple Watch comes in. From the Maps app on your Apple Watch, you can see a route’s duration, length, and elevation. When you are near enough, you can tap ‘Go’ to start turn-by-turn directions with optional voice guidance.

The library is new to the Maps app. This is where you can browse your saved routes.

Details of your route. When you’re nearby, just hit ‘Go’!

It’s also possible to save these routes to your Apple Watch for offline use. This process is not clearly explained, but you need to download the route to your iPhone first. Then, in my experience, the route will sync over whenever my Series 9 darn well feels like it.

This sort of vagueness would be frustrating if I were just about to head out on a hike and wanted to leave my iPhone behind. Why introduce such a great feature and make its setup process so opaque? There should really be a button next to each route in the iOS Maps app that syncs the route to your Apple Watch.

Creating a custom route in the Maps app on iOS. You can do this anywhere in the world, like this random route in New York.

You can also create your own routes on your iPhone by using a trailhead or any landmark as a starting point and then simply tapping around the map. The route will snap to the nearest pedestrian route, such as paths or roads. This is perfect for creating a route for a lovely country walk wherever you are in the world and then leaving your iPhone at home to truly take a break from your online life.

When I was able to download a route to my watch, I enjoyed the experience, even using it once for a cycling route. However, the unreliability of the offline syncing between my iPhone and Apple Watch is disappointing. I’d also like the option to turn a map from a walking workout I’ve previously completed into a saveable route in the Maps app.

Translation

I’m perfectly primed to take advantage of the Translate app’s transition to the Apple Watch: my wife is Spanish, and only a couple of her family members speak English. I also have a three-year-old who is rapidly overtaking my ability to speak the language.

I frequently use the app on iOS to quickly translate a word, tell me how to say a phrase in Spanish, or sometimes have an entire back-and-forth conversation using the Conversation tab. The iOS app is clean, easy to use, and very fast at translating. I’m excited about the watchOS version because it puts the app on my wrist, making it even easier to access.

The Translate app on iOS is excellent.

To be perfectly honest, if you’ve already been using the Translate app on iOS, there isn’t a whole lot more to talk about here. The watchOS version is missing camera translation (unsurprisingly) and the conversation feature. My guess is that Apple thought having two people huddle together over a watch screen might be too much. Otherwise, all the other Translate features are present, including your favorites, which sync between your watch and iPhone.

The Translate app on watchOS is identical to iOS. Just compare with the previous image.

To translate a word or phrase, you simply select the language you speak and the one you want to translate to, then tap the microphone button. (You can change the default input to the on-screen keyboard, should you wish.) When you’re finished speaking, you can tap the stop button or wait a couple of seconds for the app to automatically stop listening. Your translation will appear, and you can tap either language’s version to hear it spoken aloud or swipe right to add it to your favorites.

If you don’t have headphones connected to your Apple Watch, the translation’s audio will play through the device’s speaker, which I appreciate. I certainly don’t want to have to reach for my AirPods anytime I want to hear a Spanish phrase.

The Translate app comes with a final touch I appreciate, and it involves the Smart Stack. When you travel to another country, the Smart Stack will suggest the Translate widget automatically, giving you handy access to the appropriate translation you might need.

The Translate widget (left), and the suggested widget when you travel to another country. It’s a great touch.

Writing this section was hilariously straightforward because the Translate app is just the iOS version running on the Apple Watch. But let me be clear: this isn’t a dig at Apple in any way. The Translate app on the iPhone works brilliantly; now, it also works brilliantly on the Apple Watch. Job well done.

Additional OS Features

Ultra Wideband Home Keys

Up to this point, a home key on the Apple Watch has allowed you to use the device’s NFC chip to unlock supported smart door locks by tapping them. In watchOS 11, you can now leverage the Apple Watch’s Ultra Wideband chip to unlock doors hands-free as you approach.

This sounds great, but there’s a twist: no door locks currently support this feature, and none will until the release of a standard called Aliro in 2025. So great job, Apple, for getting ahead of the game, but this isn’t anything to get too excited about just yet – especially when we can’t see how reliably it will work. (For an explanation of Aliro and this possibly tactical move by Apple, check out this video from Eric Welander.)

Enhanced Ticketing

This is another feature that I haven’t been able to test and won’t be able to until companies start rolling it out. Enhanced ticketing allows issuers to add much more information to tickets in your watchOS wallet.

This feature, also launching on iOS, can include extra information like an event guide, directions to parking, venue open and start times, and more.

As with many new features this year, the Smart Stack also has a role to play: a Live Activity for an event will start when you arrive at its location so you can quickly get information like your seat number. This all sounds great, but no one has rolled this out yet, so its usefulness remains to be seen.

Indian Script Support

There is now support for showing the time in a variety of Indian scripts on certain watch faces. The nine Indian scripts are Bengali, Gujarati, Gurmukhi, Kannada, Malayalam, Meitei, Odia, Ol Chiki, and Telugu. The time can also be shown in three additional languages (Burmese, Khmer, and Urdu), and all of these are supported on the Astronomy, California, Memoji, Modular, Modular Duo, Modular Compact, Utility, World Time, and X-Large watch faces.

Maybe it’s because I’m used to staring at boring Western numerals every day, but these look beautiful.

I’m not fluent in any of these languages whatsoever, but having used scripts like these in design materials before, I must say that they are very nicely crafted.

Double Tap

The Double Tap feature introduced in watchOS 10 is now more functional. You can use it to scroll through almost any app, such as the Activity app with its various screens or a third-party app like CARROT. This feature is automatically available in all apps, but developers can also take advantage of a new Double Tap API to control specific parts of their apps. I’ve been testing a few apps that have done so, which I’ve featured in the Tidbits and Extras chapter.
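
For the curious, adoption looks lightweight on the developer side. Here’s a minimal sketch – the view and button are hypothetical, not taken from any shipping app – using the handGestureShortcut modifier that arrived alongside watchOS 11:

    import SwiftUI

    struct TimerControls: View {
        @State private var isRunning = false

        var body: some View {
            Button(isRunning ? "Pause" : "Start") {
                isRunning.toggle()
            }
            // Marks this button as the control that the Double Tap
            // gesture triggers while this view is frontmost.
            .handGestureShortcut(.primaryAction)
        }
    }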

This is one of those accessibility features that will hugely benefit users with limited mobility as well as regular users like myself. I hope to see it expand even further in the coming years with more functionality – maybe a tap-and-hold gesture, or even a triple tap.

Tidbits and Extras

Small Tweaks and Tidbits

One of the fun parts of any OS release is spotting the smaller tweaks and extras. They can be fun, helpful, or potentially even annoying, no matter their seemingly trivial nature.

This year is no different, with many interesting tidbits scattered across watchOS 11.

  • Swimming workouts will now display the water temperature if you’re wearing an Apple Watch Series 10 or Apple Watch Ultra 2.
  • The new AirPods Pro 2 hands-free gestures on iOS – which let you say yes with a nod or no with a head shake – also work with automatic workout prompts on the Apple Watch. Nice.
  • When you are in the Sleep Focus and want to unlock your Apple Watch, you only need to press the Digital Crown instead of holding it down for a few seconds like in watchOS 10.
  • Similarly, if you have the water lock on, you only need to hold down the Digital Crown for about one second to turn it off, as opposed to around three seconds in watchOS 10.
  • The small Now Playing icon that appears at the top of the watch face now gently pulsates when media is playing and stops when it’s paused.
  • Full-screen notifications now have a subtle fade-in and scale-up effect.
  • Scrolling down with the Digital Crown from the watch face will now pull your notifications down from the top of the screen.
  • watchOS 11 includes the ability to change the system ringtone and set separate ringtones for different apps like Calendar and Reminders.
The new Quick Switch menu on the Apple Watch Ultra will be a favourite for Ultra fans. The new stopwatch (left) and timer (right) apps.

  • When you start multiple timers, they appear as widget-shaped tiles in the Timers app. Previously, you had to swipe through full-screen timers; having them in a scrollable list is much better for access and glanceability. It also enables you to quickly swipe to delete each one. Each timer also shows up as a widget in the Smart Stack.
  • The Stopwatch app’s white background has been replaced with a black one that’s easier on the eyes.
  • The Action button on the Apple Watch Ultra has a useful new feature: holding it down brings up the Quick Switch menu, allowing you to quickly reassign it without going into the Settings app. You can adjust the options shown in the Quick Switch menu in Settings.
  • Tap to Cash, the feature announced for the iPhone that allows you to transfer money by tapping two devices together, is also available on the Apple Watch.
The new Remote app (left and right) and the only part of Apple Intelligence coming to watchOS: notification summaries.

  • The Remote app for your Apple TV gains the ability to control the power and volume, as well as use Siri hold-to-talk.
  • While the Apple Watch will not get any Apple Intelligence features for the foreseeable future, it will be able to mirror summarised notifications from the iPhone when Apple Intelligence launches with iOS 18.1.

Third-Party Apps

The last piece of the puzzle in all of Apple’s platforms is third-party apps. Now more than ever, their importance in shaping the platforms is beyond doubt, so I’d like to feature updates to a few indie apps that are using some of the new watchOS APIs.

Oxpecker

Oxpecker is a Mastodon client built exclusively for the Apple Watch. You can do quite a lot with it, from scrolling your timeline to replying to posts to viewing your notifications.

Its latest update supports Double Tap, but developer Vincent Neo has gone further by making it very customizable. Within the app’s settings, you can choose what Double Tap does when the app is open. You can choose from scrolling (the default), creating a post, viewing the nearest visible post, and refreshing your timeline.

As good as this new functionality is, it does cry out for Apple to add more variations to the tapping gesture so you can both scroll and create a new post without touching your screen.

Zenitizer

From Manuel Kehl, Zenitizer is a clutter-free meditation app for iPhone, iPad, Apple Watch, and even Vision Pro. With its latest release, the app now supports interactive widgets, Live Activities, and Double Tap. There are two widget options: one showing your daily goal and a group widget to launch up to three different meditations.

When you tap any of the meditations in the group widget, the app will launch and start the selected meditation. It will then launch a Live Activity in the Smart Stack that includes the current stage of the meditation and a stop button.
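
For the curious, here’s roughly what that plumbing looks like. This is only a sketch under assumptions – the MeditationAttributes type and its fields are invented, not Zenitizer’s actual code – but the key piece is the supplementalActivityFamilies modifier that lets a Live Activity appear in the watchOS 11 Smart Stack:

    import ActivityKit
    import SwiftUI
    import WidgetKit

    // Hypothetical attributes describing a meditation session.
    struct MeditationAttributes: ActivityAttributes {
        struct ContentState: Codable, Hashable {
            var stageName: String
            var secondsRemaining: Int
        }
    }

    struct MeditationLiveActivity: Widget {
        var body: some WidgetConfiguration {
            ActivityConfiguration(for: MeditationAttributes.self) { context in
                // Lock Screen/banner view; the Smart Stack reuses this content.
                VStack {
                    Text(context.state.stageName)
                    Text("\(context.state.secondsRemaining)s remaining")
                }
            } dynamicIsland: { context in
                DynamicIsland {
                    DynamicIslandExpandedRegion(.center) {
                        Text(context.state.stageName)
                    }
                } compactLeading: {
                    Text("Zen")
                } compactTrailing: {
                    Text("\(context.state.secondsRemaining)s")
                } minimal: {
                    Text("Zen")
                }
            }
            // Opting into the small activity family is what allows
            // watchOS 11 to surface this Live Activity in the Smart Stack.
            .supplementalActivityFamilies([.small])
        }
    }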

As if that wasn’t enough, Double Tap now works within the app to quickly start or stop a meditation. It’s a great upgrade all around.

Actionary

A simple app for glanceable health stats like heart rate, steps, and cardio fitness, Actionary has embraced the new group widget displaying the three most recent metrics.

µBrowser

As far as I know, this is the only browser available for the Apple Watch. You can enter a URL and scroll through almost any website, all on your wrist.

In the latest update, developer Arno Appenzeller adds a new group widget containing either three bookmarked sites or your three most recently visited sites. And because watchOS supports scrolling in any app with Double Tap, you can scroll down a webpage hands-free. I’m not suggesting this means you can secretly scroll through websites in boring work meetings, but I’m not not suggesting it.

Wrist Pipe

This app is built to be a stand-in for a pitch pipe. It’s a utility that helps you tune your instruments and includes the ability to store many different set lists.

In the latest update, developer David Freeman has added the ability to play the selected pitch with Double Tap. I imagine this would be perfect for times when your opposite hand is busy holding an instrument.

Chronicling

Chronicling, by Rebecca Owen, allows you to track anything in your life. The latest update for watchOS 11 adds an interactive widget to log an event directly from the Smart Stack. It also includes Double Tap support to log the event that’s currently on-screen in the app.
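
As a rough illustration of how a logging widget like this is wired up (the intent below is hypothetical; Chronicling’s real implementation isn’t public), an intent-backed button lets the tap run in place instead of launching the app:

    import AppIntents
    import SwiftUI
    import WidgetKit

    // Hypothetical intent that records one occurrence of a tracked event.
    struct LogEventIntent: AppIntent {
        static var title: LocalizedStringResource = "Log Event"

        func perform() async throws -> some IntentResult {
            // Save the event here; when this returns, WidgetKit reloads
            // the widget's timeline to reflect the new entry.
            return .result()
        }
    }

    struct LogEventWidgetView: View {
        var body: some View {
            // Button(intent:) executes the intent directly from the
            // Smart Stack, without opening the app.
            Button(intent: LogEventIntent()) {
                Label("Log event", systemImage: "plus.circle.fill")
            }
        }
    }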

Conclusion

This is my first watchOS review, and to be honest, I wasn’t fully sure what to expect going in. Alex has done such a great job with previous releases that I just wanted to provide a good enough experience for readers that they wouldn’t come banging down the doors at MacStories HQ (a fictional place that should definitely exist), asking him to return to reviewing watchOS.

One aspect of the review I didn’t expect was the sheer scale of what there was to cover. I think there were two reasons for this. On the one hand, there is far more to these operating systems that we use every day than we realize; we only use the parts that we need. On the other hand, this has been a bumper year not just for watchOS but for the Apple Watch as a whole. (Just don’t tell the Apple Watch Ultra.)

The Series 10’s form factor underwent notable changes, and charging speeds and screen viewing angles were upgraded. On the software side, there was something new in almost every corner of the OS.

Training Load and Fitness app upgrades improve how we exercise. Vitals can help us sleep better (and that’s to say nothing of the just-released sleep apnea detection capabilities). The Smart Stack is fully smart now and creates an Apple Watch that is even more adaptable to our current situations. Then there’s Translate and Check In that make their way over from our iPhones and land successfully on the Apple Watch. Even Double Tap is more useful.

It’s a really expansive update. There is one outlier, however.

In his review of watchOS 10 last year, Alex praised the new Smart Stack but lamented the reassignment of Control Center to the side button without any UI changes. He suggested that Apple may not have had enough time to update it and watchOS 11 could see some refinement. Well, spoiler alert: they haven’t touched it.

This is the one significant part of watchOS that Apple has completely ignored, and it’s starting to show its age. In the same year the company introduced interactive widgets in the Smart Stack, the Control Center does exactly the same thing it has done for – checks notes – seven years. This really has to change in watchOS 12, and seeing how much work has been poured into watchOS 11, I’m hopeful that Apple has the bandwidth to take a real run at it.

Now back to the good part. Reading the fantastic conclusion to Federico’s iOS and iPadOS 18 review, I was struck by what he described as his interest in people. The Apple Watch has always been a very personal device – it tracks your heart rate, for goodness sake – but so many of the changes in watchOS 11 are designed to make fundamental differences for the people who wear an Apple Watch.

Of course, the fitness elements are very much aimed at people, but take a look at all the other new additions. Pregnancy tracking aims not only to keep track of a pregnancy, but also to help with things like heart rate monitoring and mental health. Check In is meant to help you feel safe when you’re out and about. Translation is now here to help you communicate better with people who don’t speak your native tongue. Finally, the Smart Stack adjusts to what’s happening in your day.

All of these features are focused on people and their interactions with each other and the world around them.

It’s also notable that while so much coming from Apple right now seems focused on that other thing, watchOS 11 is almost entirely plowing its own furrow. It arrives complete and ready to go. As someone who’s been using the iOS and iPadOS 18 betas over the summer, it’s been great not to have to wait for the other shoe to drop for watchOS.

There is a lot to enjoy about this year’s release, and the past three months I’ve been using it have been fun. Almost all of the features relevant to me (again, I can’t enjoy the miracle of pregnancy) have stuck after I tried using them. I feel that the Apple Watch is more relevant than ever, and judging from this release, so does Apple – which makes me happy.

Credits

This is my first year reviewing watchOS, and it is by far the biggest project I’ve ever undertaken. It could not have been done without chocolate, the music of Eels, and the following people (cats are people):

  • L&L, my girls. Thank you for the amazing love and support you always give me, as well as making me laugh more than anyone possibly could.
  • Nina and Ella, for keeping me company while I wrote and bringing me that dead mouse once.
  • My family, and in particular my parents, for the lifelong support of my nerdy tendencies. Thanks for introducing me to a Macintosh 30 years ago, Dad.
  • Federico and John, for your patience, support, and for having faith in me to do this. You constantly inspire.
  • Devon, for being an incredible editor and offering suggestions that are genuinely helping me do good words and write more pretty.
  • Niléane, we’ve been on the team for a year now, and it’s been so much fun doing it with you.
  • Paul Stockton, for walking around for ages trying to test out a feature for me.
  • Jonathan (no, the other one), for getting me that Vitals screenshot.
  • All the app developers for doing awesome work and in particular those who sent me betas to test.
  • Ulysses, Remind Me Faster, Shortcuts, Apple Notes, Reeder Classic, Omnivore, Photoshop, and Blender. Awesome apps that got this done.
  • Everyone at Apple who keeps plugging away at these great features for the Apple Watch. Can’t wait to see what’s next (sleep debt tracking and recovery scores?!)
  • Every Club MacStories member for supporting everything we do here.
  • Our amazing Discord members. As Community Manager, you make it easy; I’ve not come close to kicking any of you. You’re so nice, smart, and considerate of one another. The Internet could learn a lot from you.

And of course, you, dear reader. You made it this far and even read the credits. Kudos. You get a gold star and my eternal gratitude.


Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

Learn more here and from our Club FAQs.

Join Now]]>
iOS and iPadOS 18: The MacStories Review https://www.macstories.net/stories/ios-and-ipados-18-the-macstories-review/ Mon, 16 Sep 2024 14:30:31 +0000 https://www.macstories.net/?p=76527 There is still fun beyond AI.]]> Maybe I’m Not a Pro Anymore https://www.macstories.net/stories/maybe-im-not-a-pro-anymore/ Mon, 09 Sep 2024 22:49:14 +0000 https://www.macstories.net/?p=76432

Apple just wrapped up their September event revealing a bunch of updates to the Apple Watch, AirPods, and iPhone lineups, and I have only one major takeaway as a person who was absolutely, positively going to get the iPhone 16 Pro if it came in gold: I am seriously considering downgrading to the non-Pro iPhone 16 this time around. 

I’ve been an iPhone Pro user since it was first announced — and in some ways even earlier if we consider the iPhone X to be the first true Pro iPhone when it was released alongside the 8. I always went Pro because I take a huge amount of photos with my phone and, despite currently owning four dedicated photography cameras, none beat the ease of use, access, and versatility of the iPhone. It made sense to use the best camera system available. In a lot of ways, I credit the Pro models with keeping me interested in photography throughout all these years, and learning to utilize the different lenses for different scenes and subjects has felt like learning a skill, like the iPhone Pro was helping me become a pro. I truly believe a lot of those skills transferred directly to the photography I create today.

I read a piece by Allison Johnson over the weekend about the base iPhone experience that resonated with me. She wrote:

At the very least, it looks like Apple is about to bring a little more balance back with the iPhone 16 series. I think that’s how it should be — the Pro iPhone should feel like you’re getting something extra, not being cornered into paying more for an essential feature.

It occurred to me watching today’s presentation that this is the first year where the iPhone’s Pro models feel like they were actually made for professionals. The “extras” Allison is referring to used to be the marquee features of the newest lineup — the Dynamic Island, the Action button, etc. — but are now seriously professional features. The latest additions and the reasoning behind their inclusion all felt completely alien to me. I kept wondering: how many people will actually use camera options like the ability to dynamically shift the audio mix while recording in ProRes or capturing 4K content in slow motion to use for a music video? It became clear that the reach of what “Pro” means to Apple has outpaced my own photographic use case. I simply went with the Pro models year after year because I liked having a third lens in my kit and also got some kind of fun bonus feature to mess around with, but I don’t think that’s enough of a reason to justify the jump anymore. And that’s okay!

The base iPhone lineup seems to have pretty much everything I could ever need this time around — including the dedicated Camera Control button and the more fun colors I’ve always been so envious of — so the idea of downgrading seems almost prudent despite being a nonstarter for me as early as this very morning. I think I can be perfectly happy with one less lens and Halide’s Process Zero to keep experimenting with. And maybe the limitation of having one less lens will help me grow as a photographer in an entirely new way this time around. That would be nice!

I’m legitimately very happy for the people out there who will get the iPhone 16 Pro and make use of the latest and greatest Apple’s incredible product design teams have to offer, but I think it’s finally time I step down from the precipice of what the iPhone is capable of. In some ways, it feels simpler now, and the “Pro” name has the weight it should have all along. No artificial reasoning behind which features make it to which device, just one iPhone for the people who need it and one iPhone for the rest of us.

I’m fine being lumped into that second bucket this time around.


You can follow all of our September 2024 Apple event coverage through our September 2024 Apple event hub or subscribe to the dedicated September 2024 Apple event RSS feed.


Join Now]]>
iPadOS 18’s Smart Script: A Promising Start But Don’t Toss Out Your Keyboard Yet https://www.macstories.net/stories/ipados-18s-smart-script-a-promising-start-but-dont-toss-your-keyboard-out-yet/ Thu, 29 Aug 2024 14:00:11 +0000 https://www.macstories.net/?p=76321

Source: Apple.

The carefully controlled demos of Smart Script at WWDC reminded me of every time Apple shows off the Photos app, where each picture is a perfectly composed, beautiful image of smiling models. In contrast, my photo library is full of screenshots and random images like the blurry one I took the other day to capture my refrigerator’s model number.

Likewise, handwriting demos on the iPad always show someone with flawless, clear penmanship who can also draw. In both cases, the features demonstrated may work perfectly well, but the reality is that there’s always a gap between those sorts of perfect “lifestyle” demos and everyday use. So today, I thought I’d take iPadOS 18’s Smart Script for a spin and see how it holds up under the stark reality of my poor handwriting.

Smart Script, meet John’s handwriting (auto-refine enabled).

The notion behind Smart Script is to make taking handwritten notes as easy and flexible as typing text. As someone who can touch type with my eyes closed, that’s a tall order, but it’s also a good goal. I’ve always been drawn to taking notes on an iPad with the Apple Pencil, but it’s been the constraints that held me back. It’s always been easier to move and change typed text than handwritten notes. Add to that the general messiness of my handwriting, and eventually, I abandoned every experiment with taking digital handwritten notes out of frustration.

Smart Script tries to address all of those issues, and in some cases, it succeeds. However, there are still a few rough edges that need to be ironed out before most people’s experience will match the demos at WWDC. That said, if those problem areas get straightened out, Smart Script has the potential to transform how the iPad is used and make the Apple Pencil a much more valuable accessory.

First off, it’s worth noting that Apple is doing some very impressive real-time machine learning to process what you write. I’ve primarily been testing Smart Script on an 11” M4 iPad Pro with 1TB of storage and 16GB of memory. That’s a top-of-the-line iPad Pro, but the feature also works with most modern iPads. Here’s the full list of supported devices:

  • iPad Pro 12.9-inch (5th generation and later)
  • iPad Pro 11-inch (3rd generation and later)
  • iPad Air (M2)
  • iPad Air 10.9-inch (4th generation and later)
  • iPad (10th generation)
  • iPad mini (6th generation)

I briefly tested Smart Script on my iPad mini too, and so far, the experience has been roughly the same as on my iPad Pro.

That said, on the M4 iPad Pro, Smart Script runs smoothly, but my iPad gets noticeably hot and the battery drains more quickly than usual while filling a full screen of notes with my handwriting. This is the sort of issue that tends to settle down with the public release of an OS update, but it’s worth noting because the heat and battery drain I’ve experienced during the betas makes using Smart Script feel more like working in Final Cut Pro for iPad than Notes.

Auto-refinement is subtle. The notes on the left were auto-refined by Smart Script, while the ones on the right weren’t.

I suspect most people are probably not thrilled with their own handwriting. I know I’m certainly not; it pains me to show mine off in this story. But that’s why Smart Script’s ability to refine your handwriting is so useful. The feature isn’t replacing your handwriting with a facsimile. Instead, it’s analyzing your writing style and preserving it while also smoothing it to make it straighter and more legible, which helps take the edge off of my scratchings.

Turning on auto-refine.

To turn on text refinement, go to the three-dot ‘More’ button in the Apple Pencil’s tool menu. (With the Apple Pencil Pro, this is the array of buttons that appears when you squeeze the barrel of the Pencil.) Choose the gear icon, and you’ll see a toggle to turn on ‘Auto-refine Handwriting.’ I’ve found that I can write a line or two of text using Notes in landscape orientation before refinement kicks in, animating to magically improve my handwriting. It’s wild to see and works just as Apple says: the more you write in this mode, the closer the refinement looks to your own writing style. The one glitch I’ve run into with auto-refine is that if I’m furiously taking notes and refinement kicks in while I’m writing, Notes will stop accepting my writing for a beat, which usually means I have to rewrite one or two letters.

Smart Script recognizes misspelled words and offers suggested changes.

I don’t think this looks like my handwriting, but at least it’s spelled correctly.

Spell-checking is baked into Smart Script, too. If a word isn’t recognized, it will be underlined with a blue dotted line. Tap it, and suggested spellings will appear in the action menu popover. If you select one, the word you hand-wrote will be replaced by the correctly spelled word in your own handwriting. The replaced word doesn’t always match my handwriting well, but it’s close enough and still breaks my brain a little to see in action.

Also, if you scratch out a word without removing the Apple Pencil from the surface of the iPad when you’re finished, the word will be erased. For more than a single word, I think the eraser tool works better, but the scribble gesture is a nice alternative because it doesn’t require switching tools.

The parts of Smart Script that have the most rough edges are moving text around and pasting text in your own handwriting. If you tap and hold with the Apple Pencil, a cursor will appear that can be used to move text around the page. You can move the text a little to make room to add a word or two, or you can move whole lines and paragraphs down to make room for a drawing, larger blocks of handwriting, or pasted text. The trouble is that moving text around isn’t as precise or accurate as it should be.

Moving text around can be a challenge.

In the video above, I first moved my notes a little to make room for some additional text in a different color. That caused the notes to wrap down the page as I made space, but the left margin didn’t stay aligned the way I would have liked. Next, I added a few sloppy words in a different color, which, once refined, changed the color of one of the words I hadn’t added. I then accepted a spelling suggestion, which resulted in a jumbled mess. The problem seems to stem from having added two lines of text on one line and cramming in the last word diagonally because adding just a couple of words that match the words before and after the edit worked fine.

During earlier betas, I saw even worse results. However, with yesterday’s beta release, many of the issues seem to have been resolved. Reflowing text still isn’t perfect, but it’s come a long way in a short period of time. So, while I recommend sticking to simpler edits to avoid text reflowing problems, Apple seems to have gotten the feature on the right track going into the fall updates.

Another interesting aspect of Smart Script is the ability to insert copied text in the style of your handwriting. Digital type can be jarring when looking at a page of handwritten notes. Smart Script’s solution is to give you the option to paste text that approximates your handwriting.

Pasting works a lot like moving text. After you copy something, long-press with the Apple Pencil until the cursor appears. If you have text on your clipboard, ‘Paste’ will appear in the action menu allowing you to drop in whatever you copied. Here’s an example of a sentence I pasted from an article I wrote on MacStories.

The text at the top of the note was pasted in and doesn’t exactly match my handwriting, but it’s sort of close.

It’s pretty clearly not my handwriting. The pasted text looks a little too perfect. Perhaps that’s a good thing because it helps distinguish your words from the ones pulled from another source, but if that’s the intent, I’d think Apple would automatically allow you to paste in a source reference, too. That’s not an option, though.

One quirk of pasting text is that you can only do it within existing handwritten notes, meaning that if you start by making room for what you want to paste on a page and long-press on a blank space, you won’t get the option to paste what’s on your clipboard. Instead, you need to long-press on a line that already includes text. It’s a strange limitation that confounded me at first.

Source: Apple.

Finally, Math Notes, a feature of the Calculator app and Notes, integrates with Smart Script. As you take notes, you can add mathematical expressions and formulas, including variables, and solve them inline. I particularly like the integration of Math Notes with both Calculator and Notes. You can start in Calculator by tapping the button with a calculator icon and choosing Math Notes. That opens a UI that looks just like Notes. Write your expressions, and moments after you write an equals sign, your answer will appear in the app’s tint color to indicate that it was calculated for you. If you prefer, you can turn off automatic calculations entirely or have the app suggest calculations before actually doing them. Then, when you switch over to Notes, you’ll find the note you created in Calculator waiting for you along with your other notes.

Some simple examples of Math Notes in action. The feature can also create graphs that change dynamically as you change variables.

Math Notes is the sort of feature that I expect students will love, but it’s also going to be handy for the kinds of simple calculations everyone comes across in daily life. You might want to split the cost of a purchase among multiple people, calculate the cost of a trip you’re planning, or figure out how long it will take you to save for a big purchase. In all of these cases, it’s useful to have the math inline with surrounding notes that provide context.
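
To make that concrete, here’s the sort of thing you might scribble into a Math Note when splitting a weekend trip three ways. The numbers are made up, and everything after the final equals sign on a line is what the app fills in for you:

    hotel = 220 × 2
    fuel = 85
    food = 150
    hotel + fuel + food = 675
    675 ÷ 3 = 225

Change one of the variables, and the results update in place.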


Apple’s goal of making handwritten notes as versatile as typed text is admirable, but it’s not quite there yet. I love the auto-refinement of my handwriting and plan to use it anytime I take notes. Math Notes is another bright spot that I don’t expect I’ll use frequently, but I’m glad it will be there when I need it. I’ll use Smart Script’s text reflow feature, too, but sparingly until I’m sure the reliability issues I saw during the betas have been resolved.

Despite the rough edges, what I like most about Smart Script is that the Apple Pencil has become more than a tool for artists and a pointing device for the rest of us. It’s a far more capable accessory than ever before that I find myself slowly using more and more over time. Part of that evolution has been my move to the 11” iPad Pro, which I use far more often without the Magic Keyboard case than I did the larger iPad Pros I’ve owned. However, it also comes down to what the Apple Pencil Pro can do. It has become my highlighter and markup tool for PDFs, a way to take notes, and more. It’s not an accessory I use every day yet, but with Smart Script, I’m enjoying taking handwritten notes more than in the past – which has made me less likely to grab my keyboard and more likely to simply jot down a note or use Scribble in text fields. Smart Script is off to a promising but imperfect start, and with time, I hope it can fulfill Apple’s vision of handwriting as coequal with typing as an input method.


Join Now]]>
New Projects, New Setup https://www.macstories.net/stories/new-projects-new-setup/ Tue, 16 Jul 2024 15:02:08 +0000 https://www.macstories.net/?p=76061

The past few months have been busy at MacStories. The release of new iPads was followed by our launch of new podcasts and then WWDC. Along the way, my gear setup has changed a little, too.

Portable Setup Changes

11” iPad Pro and Accessories

Leading up to Apple’s spring event and knowing that it would feature new iPads, I was on the fence about buying one. The Pencil Pro and the Tandem OLED display tipped the balance, though, and I ended up getting the 11” Wi-Fi-only model with a Magic Keyboard Case, a Smart Folio cover, and Apple Pencil Pro.

I was tempted by the nano-texture display but ultimately passed on it as well as cellular connectivity. I expect there’s a nano-texture device of some sort in my future, but even without it, the iPad Pro’s Tandem OLED display works better in sunlight than previous displays. I don’t use an Apple Pencil often, but with the new Pro model, I plan to play around with it more to see if I can find a place for it in my day-to-day computing. The lack of meaningful iPadOS 18 updates coming this fall is a letdown, but I’m still pleased with my purchase because the smaller form factor and fantastic display have led me to use my iPad Pro more.

Desk Setup Changes

Balolo’s tablet holder accessory.

With the change in iPad sizes, the articulating arm I used for the 12.9” iPad Pro no longer worked for me. Instead, I’ve transitioned to another Balolo accessory, the Tablet Holder. It sits neatly in the center of my Desk Cockpit shelf, where I can set my iPad for use with Sidecar or Universal Control. If you’re a Club member and interested in Balolo’s Desk Cockpit setup, which I covered in detail this past February, there’s a coupon code for 10% off on the Club Discounts page.

My new video gear from Elgato.

I have been experimenting more with video in recent weeks, too. That led to the addition of an Elgato Facecam Pro and Key Light to my desk, along with an Elgato Mini Mount stand for the camera.

Portable Podcasting

The Shure MV7+ in its protective case.

This year, I decided to update my portable podcasting setup with a new microphone that can record over USB-C and an XLR cable at the same time. The Shure MV7+ is a step up in sound quality from the Shure Beta 87A I used in the past, even though it doesn’t sound quite as good to my ear as the Earthworks ETHOS mic I use at my desk. The biggest advantage, though, is that I can record directly to my laptop over USB-C and to an SD card in the Sound Devices Mix-Pre 3 II that I pulled out of storage. This offers me the simplicity of direct recording to my laptop’s internal SSD and the security of having a backup on a separate device. Along with the Shure MV7+ mic, I bought a hard case to protect it when traveling.

Headphones

The Beats Fit Pro.

As I mentioned in this week’s AppStories+ pre-show, I’ve recently been testing out the Sennheiser Momentum Sport Earbuds. The jury’s still out on those for the reasons I explained on the podcast, but a set of headphones that I now use daily is the Beats Fit Pro. I had a pair of Powerbeats Pro for about five years, but I wanted something smaller with transparency mode so that I can be more aware of my surroundings. After a couple of months of use, I’m very happy with the Beats Fit Pro. They sound good, connect easily to my Apple Watch, and have most of the other features found in AirPods Pro.

Gaming

This is how you do Windows, right?

The biggest change in my gaming setup is the addition of the Lenovo Legion Go portable Windows gaming handheld to my setup. I don’t use it for every game I play, but I also purchased a ONEXGPU eGPU that gives me a nice graphics boost when playing the most demanding games. I pair the Legion Go with Logitech’s new Keys-To-Go 2 (which I reviewed in MacStories Weekly last month) and a yellow POP Mouse whenever I need to dive into Windows for more than launching a game.

The Anbernic RG35XXSP (left) and Miyoo Mini A30 (right).

Since I started NPC: Next Portable Console with Federico and Brendon Bigley, I have been exploring retro handhelds again, including two new devices:

  • Miyoo Mini A30, a Game Boy Micro-inspired handheld that’s perfect for travel
  • Anbernic RG35XXSP, a Game Boy Advance SP-inspired handheld that folds up to protect the screen and controls

Over time, I’m sure I’ll settle on one of these devices or my Miyoo Mini+ over the others, but so far, I’ve enjoyed rotating between all three.

Save the Hero Builders cartridge reader.

To copy ROMs from my collection of game cartridges I also added a Save the Hero Builders cartridge reader to my setup. It’s custom-built in Japan and based on an open source project. I love the look of the model I got with its yellow PCB and black thumbscrews. It looks intimidating but is actually pretty simple to use.

Of course, I have new controllers.

Finally, no gear update would be complete without a couple of new controllers. On Brendon’s recommendation, I picked up 8BitDo’s SN30 Pro in transparent green. It looks a lot like a classic SNES controller, but with thumbsticks. It’s nice and compact, which makes it a good companion for playing games on my 11” iPad Pro. For iPhone gaming, I recently acquired the Razer Ultra controller, which is big but solidly built and features high-quality components.

I also purchased a CRKD Nitro Deck+ wrap-around controller for my OLED Switch. The Nitro Deck+ transforms the Switch into something about the size of a Steam Deck, and compared to the original Nitro Deck model, this one features a quieter rumble and easier removal of the Switch. Plus, it’s transparent, which sealed the deal for me. My only reservation is that I’m not sure putting the ABXY buttons under the right thumbstick was a good idea, but I haven’t had the Nitro+ long enough to decide yet.


Okay, that’s all for now. For a complete overview of the tech I use every day, check out the Setups page. If you have any questions about the gear discussed above, feel free to get in touch with me in the Club MacStories Discord community, on Mastodon, or on Threads.


Join Now]]>
iOS 18 After One Month: Without AI, It’s Mostly About Apps and Customization https://www.macstories.net/stories/ios-18-public-beta-preview/ Mon, 15 Jul 2024 20:22:37 +0000 https://www.macstories.net/?p=76034

iOS 18 launches in public beta today.

My experience with iOS 18 and iPadOS 18, launching today in public beta for everyone to try, has been characterized by smaller, yet welcome enhancements to Apple’s productivity apps, a redesign I was originally wrong about, and an emphasis on customization.

There’s a big omission looming over the rollout of these public betas, and that’s the absence of any Apple Intelligence functionalities that were showcased at WWDC. There’s no reworked Siri, no writing tools in text fields, no image generation via the dedicated Image Playground app, no redesigned Mail app. And that’s not to mention the AI features that we knew were slotted for 2025 and beyond, such as Siri eventually becoming more cognizant of app content and gaining the ability to operate more specifically inside apps.

As a result, these first public betas of iOS and iPadOS 18 may be – and rightfully so – boring for most people, unless you really happen to care about customization options or apps.

Fortunately, I do, which is why I’ve had a pleasant time with iOS and iPadOS 18 over the past month, noting improvements in my favorite system apps and customizing Control Center with new controls and pages. At the same time, however, I have to recognize that Apple’s focus this year was largely on AI; without it, it feels like the biggest part of the iOS 18 narrative is missing.

As you can imagine, I’m going to save a deeper, more detailed look at all the visual customization features and app-related changes in iOS and iPadOS 18 for my annual review later this year, where I also plan to talk about Apple’s approach to AI and what it’ll mean for our usage of iPhones and iPads.

For now, let’s take a look at the features and app updates I’ve enjoyed over the past month.

Apps

There are lots of app-related improvements in iOS 18, which is why I’m looking forward to the next few months on AppStories, where we’ll have a chance to discuss them all more in-depth. For now, here are my initial highlights.

Reminders and Calendar, Together At Last

I almost can’t believe I’m typing this in 2024, but as things stand in this public beta, I’m very excited about…the Calendar app.

In iOS and iPadOS 18, the Calendar app is getting the ability to show your scheduled reminders alongside regular calendar events. I know, I know: that’s not the most incredible innovation since apps like Fantastical have been able to display events and tasks together for over a decade now. The thing is, though, with time I’ve realized that I don’t need a tool as complex as Fantastical (which is a great app, but its new business-oriented features are something I don’t need); I’m fine with Apple’s built-in Calendar app, which does everything I need and has an icon on the Home Screen that shows me the current date.

Enable the ‘Scheduled Reminders’ toggle in the app’s Calendars section, and all your scheduled tasks from the Reminders app will appear alongside events in the calendar. You can tap a reminder inside Calendar to mark it as complete or use drag and drop to reschedule it to another day. As someone who creates only a handful of calendar events each week but gives every task a due date and time, I find it very helpful to have a complete overview of my week in one place rather than having to use two apps for the job.
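For developers, what makes a task “scheduled” is simply that the reminder carries due date components. Here’s a minimal sketch using the public EventKit framework – the title and date are made up, and this is an illustration rather than anything Apple has documented about the Calendar integration:

    import EventKit

    let store = EKEventStore()

    func addScheduledReminder() async throws {
        // Ask for Reminders access (iOS 17+ API).
        guard try await store.requestFullAccessToReminders() else { return }

        let reminder = EKReminder(eventStore: store)
        reminder.title = "Assemble MacStories Weekly"   // hypothetical task
        reminder.calendar = store.defaultCalendarForNewReminders()
        // A due date and time is what makes a reminder 'scheduled' -
        // and thus eligible to appear alongside Calendar events.
        reminder.dueDateComponents = DateComponents(
            year: 2024, month: 7, day: 20, hour: 9, minute: 0
        )
        try store.save(reminder, commit: true)
    }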

Reminders alongside events in the Calendar app.

The integration even extends to Calendar’s Home Screen widgets, including the glorious ‘XL’ one on iPad, which can now show you reminders and events for multiple days at a glance.

Reminders are also displayed in the XL Calendar widget on iPad. iOS and iPadOS 18 also include the ability to resize widgets without deleting and re-adding them from scratch every time.

Historically speaking, this is not the first time we’ve seen a direct, two-way communication between Apple apps on iOS: in iOS 12, Apple brought News articles to the Stocks app, which I covered in my review at the time as an exciting opportunity for system apps to cross-pollinate and for Apple to provide iOS users with an experience greater than the sum of its parts. This year, the company is going to much greater lengths with the same idea. Not only is Reminders data appearing inside Calendar, but tasks are interactive in the Calendar app, to the point where you can access the full-blown Reminders task creation UI from inside Calendar:

The Reminders UI embedded within Calendar.

Does this mean the Calendar app has become the one productivity app to rule them all for me? Obviously not, since there are still plenty of reasons to open the standalone Reminders app, if only to browse my various lists and smart lists or use specific features like tagging and rich links. But the idea of providing a common ground between the two apps is solid, and as a result, I find myself spending more time managing my week inside the Calendar app now.

As we’ll see later this year, these two apps aren’t the only ones becoming capable of talking to each other in iOS 18: Notes and Calculator will also get the ability to share Math Notes and allow users to edit the same document in two distinct places. This is a trend worth keeping an eye on.

Speaking of Reminders, there are a handful of improvements in the app I want to mention.

For starters, subtasks now appear in the ‘Today’ and ‘Scheduled’ lists as well as custom smart lists. Previously, reminders nested within a parent reminder would be hidden from those special views, which – again, as someone who schedules everything in his task manager – hindered the utility of subtasks in the first place. To give you a practical example, I have an automation that creates a new list for the next issue of MacStories Weekly every week (which I adapted from my Things suite of shortcuts), and one of the tasks inside that list is an ‘App Debuts’ parent reminder. When I come across an interesting app or update during the week, I save it as a subtask of that reminder. In iOS and iPadOS 18, those subtasks can appear in the ‘Today’ view on Saturday morning, when it’s time to assemble MacStories Weekly.

Subtasks for a reminder in a specific list (left) can now appear in the Today page and other smart lists.

Although the default ‘Today’ smart list still doesn’t support columns, it now lets you customize the order of its default sections: all-day, overdue, and timed reminders.

You can now customize the order of sections in the Today page.

I’m also intrigued by Apple’s promise of new Shortcuts actions for Reminders, though I suspect the work is unfinished and we’re only seeing partial results in this public beta. There is a new ‘Create Reminder’ action in Shortcuts (which I can only see on my iPhone, not on the iPad) that exposes more options for task creation than the old ‘Add New Reminder’ action.

Namely, this action now lets you enter properties for a list’s sections and assignees; strangely enough, the action doesn’t contain an option to enter a URL attachment for a task, which the previous action offered. I’m guessing that, as part of Apple Intelligence and with the ultimate goal of making Siri more integrated with apps, Apple is going to retool a lot of their existing Shortcuts actions. (It’s about time.) I wouldn’t be surprised if more apps follow Reminders in modernizing their Shortcuts integrations within the iOS 18 cycle because of Apple Intelligence.
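To give a sense of what “modernized” looks like on the developer side, Shortcuts actions these days are typically built on App Intents, where every parameter an intent declares shows up as a field in the Shortcuts editor. This is a hypothetical sketch, not Apple’s actual Reminders intent:

    import AppIntents

    struct CreateTaskIntent: AppIntent {
        static var title: LocalizedStringResource = "Create Task"

        @Parameter(title: "Title")
        var taskTitle: String

        @Parameter(title: "Due Date")
        var dueDate: Date?

        // Newer intents can surface richer properties, like a list section.
        @Parameter(title: "Section")
        var section: String?

        func perform() async throws -> some IntentResult {
            // A real app would hand these values to its task store here.
            return .result()
        }
    }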

Passwords: The “Finally” of the Year

At long last – and the finally is deserved here – Apple has made a standalone Passwords app for iOS, iPadOS, and macOS. I (and many others) have been arguing in favor of cutting password management out of Settings to let it blossom into a full-blown app for years now; I’m not kidding when I say that, on balance, the addition of the Passwords app has been the most important quality-of-life improvement in iOS 18 so far.

I moved all of my passwords out of 1Password and into iCloud Keychain just before WWDC (following Simon’s excellent guide). As soon as I updated to iOS 18, they were all transferred to the Passwords app, and I didn’t have to do anything else. iCloud Keychain already supported key password management features like verification codes; with the Passwords app, you don’t need to go hunt for them inside Settings anymore thanks to a more intuitive UI that also adds some welcome options.

The Passwords app has a design that’s reminiscent of Reminders, with pinned sections at the top for passkeys, logins that have one-time codes, Wi-Fi networks, security recommendations, and deleted items. With the exception of the Wi-Fi section, these aren’t new features, but the presentation makes them easier to find. These sections are followed by your shared groups, which aren’t new either, but are more discoverable and prominent. The design of login item pages is consistent with iOS 17’s old iCloud Keychain UI, but the app now supports multiple URLs for the same login; the main list view also includes sorting options.

The Passwords app is my favorite addition of the year.

My favorite detail of the Passwords app, however, is this: if you need to quickly copy an item’s username, password, code, or URL, you can simply right-click it in the main view and copy what you need:

This is a good menu.

The best thing I can say about Passwords is that it obliterated a particular kind of frustration from my life. With Apple’s app, I don’t need to worry about my password manager not working anymore. In fact, I’d argue that Passwords’ strongest quality is that you never think about it that much, and that means it’s doing its job.

Those who, like me, come from more than a decade of using 1Password and have witnessed the app’s slow descent into instability know what I’m talking about: having to constantly worry about the app’s Safari extension not working, search not bringing up results inside the app, or the extension auto-filling the wrong information on a page. With Passwords, all these issues have evaporated for me, and I can’t describe how happy it makes me that I just don’t have these thoughts about my password manager anymore.

Don’t get me wrong; there are features of 1Password that Apple’s Passwords app can’t match yet, and that I’m guessing will be the reason why some folks won’t be able to switch to it just yet. The biggest limitation, in my opinion, is the lack of secure attachments: if you want to store a file (like, say, a PDF document or an encryption key) associated with a login item, well, you can’t with Passwords yet.

These limitations need to be addressed, and now that Passwords is a standalone experience, I’m more confident that Apple will have the headroom to do so rather than having to cram everything into a Settings page. Moving from 1Password to the Passwords app has been one of the most useful tech-related migrations I’ve made in recent memory; if you’re on the verge and primarily use Apple devices, I highly recommend taking the time to do it.

The New Photos App: Tradition, Discovery, and Customization

Beyond Apple Intelligence, I’d argue that the most important change of iOS 18 – and something you can try right now, unlike AI – is the redesigned Photos app. As I shared on MacStories a couple of weeks ago, I was initially wrong about it. Having used it every day for the past month, not only do I think Apple is onto something with their idea of a single-page app design, but, more importantly, the new app has helped me rediscover old memories more frequently than before.

The concept behind the new Photos app is fairly straightforward to grasp, yet antithetical to decades of iOS UI conventions: rather than organizing different app sections into tabs, everything has been collapsed into a single page that you can scroll to move from your library to various kinds of collections and suggestions. And that’s not all; this fluid, single-page layout (which is powered by SwiftUI) is also highly customizable, allowing users to fine-tune which sections they want to see at the top in the main carousel and which pinned collections, albums, or trips they want to see further down the page.
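As an illustration of the structure (and not Apple’s actual implementation), a single-page layout like this maps naturally onto SwiftUI: one vertical scroll view containing the grid, with every other collection rendered as another section further down the same page. All names below are hypothetical:

    import SwiftUI

    struct SinglePageLibrary: View {
        let recentPhotos: [String]   // stand-ins for real photo assets
        let collections: [String]

        var body: some View {
            ScrollView(.vertical) {
                // The traditional grid sits at the top of the page...
                LazyVGrid(columns: Array(repeating: GridItem(.flexible()), count: 3)) {
                    ForEach(recentPhotos, id: \.self) { photo in
                        Color.gray
                            .aspectRatio(1, contentMode: .fit)
                            .overlay(Text(photo).font(.caption2))
                    }
                }
                // ...and collections are just more sections in the same scroll view.
                ScrollView(.horizontal) {
                    HStack {
                        ForEach(collections, id: \.self) { name in
                            RoundedRectangle(cornerRadius: 12)
                                .fill(.quaternary)
                                .frame(width: 160, height: 100)
                                .overlay(Text(name).font(.caption))
                        }
                    }
                }
            }
        }
    }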

The new Photos UI on iPad.

It’s easy to understand why the move from a tabbed interface to a unified single-page design may – at least in theory – bolster discovery inside the app: if people aren’t using the app’s other sections, well, let’s just move all those sections into the screen we know they’re using. Or, think about it this way: we already spend hours of our lives discovering all kinds of digital information – news, music, memes, whatever – by scrolling. Why not use the same gesture to rediscover photos in our libraries, too? (The opposite of doomscrolling, if you will.)

What I think Apple designers have achieved with the new Photos app – and what I will explore more in-depth later this year in my review – is balance between tradition, discovery, and customization. By default, the new Photos app shows your usual grid of recent photos and videos at the top, occupying roughly 60% of the screen on iPhone. Immediately below, the app will automatically compile smart collections for recent days (where only the best shots are highlighted, removing stuff like screenshots and receipts) as well as your favorite people and pets. So if you’re looking for the photo you just took, you can still find it in the grid, but there’s also a chance something else may catch your eye down the page.

Photos’ new UI.

The grid can be expanded to full-screen with a swipe, which reveals a new segmented control to enable the Years and Months views as well as a menu for sorting options and new filters to exclude media types such as screenshots and videos. The transition from half-grid to full-grid is incredibly smooth and pleasant to look at.

Expanding the grid reveals new filters.

So that’s tradition: if you want to keep using the Photos app as a grid of photos, you can, and the app supports all the features you know and love in the grid, such as long presses to show context menus and drag and drop. This is where Photos bifurcates from the past, though: if you want, at this point, you can also keep scrolling to discover more, or you can spend some time customizing the look of the app to your needs and preferences.

There are a lot of recommended sections (and more coming with AI in the future) and customization options – too many for me to cover in this preview article today. Allow me to highlight just a few. The main grid at the top of the app? That’s actually a carousel that you can swipe horizontally to move from the grid to other “pinned” sections, and you can customize which collections are displayed here. In my app, I put my favorites, videos, photos of my dogs, featured photos, and screenshots (in this order) after the grid. This way, I can move to the right if I want to discover old favorites and memories, or I can quickly move to the left to find all my screenshots. Once again: it’s all about tradition, discovery, and customization.

Customizing the carousel.

I’ve become a huge fan of Recent Days, which is a section that follows the grid, automatically groups photos by day, and sort of serves as a visual calendar of your life. Apple’s algorithm, in my experience, does a great job at picking a key photo from a particular day, and more often than not, I find myself swiping through this section to remember what I did on any given day.

I also like the ability to customize Pinned Collections, which is another section on the page and, effectively, a user-customizable space for shortcuts to your Photos library. You can pin anything in here: media types, specific albums, specific trips (which are curated by iOS 18), photos of your pets, photos of people and groups of people (also new in iOS 18), and more.

Recent Days and Pinned Collections.

I’ll save more comments and tidbits on the redesigned Photos app for my iOS and iPadOS 18 review later this year. For now, though, I’ll say this: a month ago, I thought Apple was going to revert this drastic redesign like they did with Safari three years ago; now, I think Apple has created something special, and they should be diligent enough to iterate and listen to feedback, but also stick to their ideas and see this redesign through. The new Photos app allows me to see recently-taken pictures like before; at the same time, it gives me an easier, less disorienting way to discover forgotten moments and memories from my library that are continuously surfaced throughout the app. And at any point, I can choose to customize what I see and shape the app’s experience into something that is uniquely mine.

I was skeptical about iOS 18’s Photos app at first, but now I’m a believer.

User Customization: Home Screen Icons and Control Center

Apple’s progressive embrace of user customization on its mobile platforms isn’t new. (We could trace the company’s efforts back to iOS 12’s rollout of Shortcuts and, of course, iOS 14’s Home Screen widgets.) For the first time in years, however, I feel like a part of Apple’s customization features in iOS 18 aren’t meant for people like me at all. Fortunately, there’s another aspect to this story that is very much up my alley and, in fact, the area of iOS 18 I’m having the most fun tweaking.

Let’s start with the part that’s not for me this year: Home Screen customization and icon theming. At a high level, Apple is bringing three key changes to the Home Screen in iOS 18:

  • You can now place app icons and widgets anywhere, leaving empty spaces around them.
  • You can make app icons larger, hiding their text labels in the process.
  • You can switch app icons to a special dark mode version, as well as apply any color tint you like to them.

Of these three changes, I’ve only been using the setting that makes icons larger and hides their labels. I think it makes my Home Screen look more elegant and less crowded by getting rid of app and shortcut titles underneath their icons.

Regular icons (left) compared to the new larger icon size in iOS 18.

As for the other two changes…they’re really not my kind of thing. I think the ability to freely rearrange icons on the Home Screen, going beyond the limitation of the classic grid, is long overdue and something that a lot of folks will joyfully welcome. Years ago, I would have probably spent hours making dynamic layouts for each of my Home Screen pages with a particular flow and order to their icons. These days, however, I like to use a single Home Screen page, with Spotlight and the App Library filling the gaps for everything else. And – as you’ve seen above – I like filling that Home Screen page to the brim with icons for maximum efficiency when I’m using my phone.

I don’t foresee a scenario in which I voluntarily give up free space on my Home Screen to make it more “aesthetic” – including on my iPad, where this feature is also supported, but I’d rather use the extra space there for more widgets. At the same time, I know that literally millions of other iPhone users will love this feature, so Apple is right in adding support for it. As a meme would put it, in this case, I think it’s best if people like me shut up and let other people enjoy things.

It’s a similar story with icon tinting, which takes on two distinct flavors with iOS 18. Apps can now offer a dark mode icon, a great way to make sure that folks who use their devices in dark mode all the time have matching dark icons on their Home Screens. Generally speaking, I like what Apple is doing with their system apps’ icons in dark mode, and I appreciate that there are ways for developers to fine-tune what their icons should look like in this mode. My problem is that I never use dark mode – not even at night – since I find it too low-contrast for my eyes (especially when reading), so I don’t think I’m ever going to enable dark icons on my Home Screen.

From left to right: large icons, dark mode icons (for some of my apps), and tinted icons.

The other option is icon tinting using a color picker, and…personally, I just don’t think it looks good at all. With this feature, you can effectively apply a color mask on top of every app icon and change the intensity of the color you’ve selected, thus completely disregarding the color palettes chosen by the app’s creator. To my eyes, the results look garish, to the point where even Apple’s promotional examples – historically, images that are meant to make the new version of iOS appear attractive – look awful to me. Are there going to be people out there who will manage to make a tinted layout that looks nice and that they’re going to love? I’m sure. And this is why user customization is important: we all have different tastes and needs, and I think it’s great when software doesn’t judge us for what we like (or dislike) and lets us shape the computer however we want.

I want to wrap up this story with the one customizable feature that I adore in iOS 18, and which I know is going to define my summer: the new Control Center.

In iOS 18, Control Center is becoming a multi-page, customizable affair. Controls now come in multiple sizes, and they’re powered by the same technologies that allow developers to create widgets and Shortcuts actions (WidgetKit and App Intents). This rewrite of Control Center has some deep ramifications: for the first time, third-party apps can offer native controls in Control Center, controls can be resized like widgets, and there is a Controls Gallery (similar to the Widget Gallery on the Home and Lock Screens) to pick and choose the controls you want.

The new Control Center can span multiple pages with support for resizable controls and third-party ones.

Given the breadth of options at users’ disposal now, Apple decided to eschew Control Center’s single-page approach. System controls have been split across multiple Control Center pages, which are laid out vertically (rather than horizontally, as in the iOS 10 days); plus, users can create new pages to install even more controls, just like they can create new Home Screen pages to use more widgets.

Basically, Apple has used the existing foundation of widgets and app intents to supercharge Control Center and make it a Home Screen-like experience. It’s hard for me to convey in an article how much I love this direction: app functionalities that maybe do not require opening the full app can now be exposed anywhere (including on the Lock Screen), and you get to choose where those controls should be positioned, across how many pages, and how big or small they should be.

You can customize Control Center sort of like the Home Screen by rearranging and resizing controls. There is a Controls Gallery (center) and controls can be configured like widgets, too.

If you know me, you can guess that I’m going to spend hours infinitely tweaking Control Center to accommodate my favorite shortcuts (which you can run from there!) as well as new controls from third-party apps. I’m ecstatic about the prospect of swapping the camera and flashlight controls at the bottom of the Lock Screen (which are now powered by the same tech) with new, custom ones, and I’m very keen to see what third-party developers come up with in terms of controls that perform actions in their apps without launching them in the foreground. A control that I’m testing now, for instance, starts playing a random album from my MusicBox library without launching the app at all, and it’s terrific.
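For the curious, here’s roughly what such a control looks like on the developer side – a small WidgetKit declaration wrapped around an App Intent that runs without bringing the app to the foreground. All names are hypothetical, and MusicBox’s actual implementation may differ:

    import WidgetKit
    import SwiftUI
    import AppIntents

    // Hypothetical intent that starts playback in the background.
    struct PlayRandomAlbumIntent: AppIntent {
        static var title: LocalizedStringResource = "Play Random Album"

        func perform() async throws -> some IntentResult {
            // A real app would pick an album and start playback here.
            return .result()
        }
    }

    struct PlayRandomAlbumControl: ControlWidget {
        var body: some ControlWidgetConfiguration {
            StaticControlConfiguration(kind: "com.example.playRandomAlbum") {
                // The button renders in Control Center (or on the Lock Screen)
                // and fires the intent when tapped.
                ControlWidgetButton(action: PlayRandomAlbumIntent()) {
                    Label("Random Album", systemImage: "shuffle")
                }
            }
            .displayName("Play Random Album")
        }
    }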

So far, the new Control Center feels like peak customization. Power users are going to love it, and I’m looking forward to seeing what mine will look like in September.

iOS and iPadOS 18

There’s a lot more I could say about iOS 18 and its updated apps today. I could mention the ability to create colored highlights and fold sections in Notes, which I’m using on a daily basis to organize my iOS and iPadOS research material. I could point out that Journal is receiving some terrific updates across the board, including search, mood logging based on Health, and integration with generic media sources (such as third-party podcast apps and Spotify, though this is not working for me yet). I could cover Messages’ redesigned Tapbacks, which are now colorful and finally support any emoji you want.

But I’m stopping here today, because all of those features deserve a proper, in-depth analysis after months of usage with an annual review this fall.

Should you install the iOS and iPadOS 18 public betas today? Unless you really care about new features in system apps, the redesigned Photos app, or customization, probably not. Most people will likely check out iOS 18 later this year to satisfy their curiosity regarding Apple Intelligence, and that’s not here yet.

What I’m trying to say, though, is that even without AI, there’s plenty to like in the updated apps for iOS and iPadOS 18 and the reimagined Control Center – which, given my…complicated feelings on the matter, is, quite frankly, a relief.

I’ll see you this fall, ready, as always, with my annual review of iOS and iPadOS.


You can also follow our 2024 Summer OS Preview Series through our dedicated hub, or subscribe to its RSS feed.


macOS Sequoia: The MacStories Public Beta Preview https://www.macstories.net/stories/macos-sequoia-the-macstories-public-beta-preview/ Mon, 15 Jul 2024 20:22:24 +0000 https://www.macstories.net/?p=76038

Sequoia is not your typical macOS release. In recent years, new features of all Apple OSes have been increasingly meted out over time instead of dropping all at once in the fall. That’s been true of macOS, too, but this year, the magnitude of the staged release will be more pronounced, which has trickled through to the public beta released today.

macOS Sequoia will be a phased release. That means you won’t find everything announced at WWDC in the public beta. Some features, notably large parts of Apple Intelligence, won’t be available until 2025. That’s something worth keeping in mind if you’re thinking about installing the Sequoia public beta today. The beta is generally stable, but you’re likely to run into bugs, and with many features still to come in the months ahead, the upside of running it is more limited than in past years.

Apple Intelligence promises to round out Sequoia over time, but neither I nor anyone else outside of Apple has had a chance to try those features yet. So, for now, let’s focus on what you can expect if you install the macOS Sequoia public beta today.

Is It Safe to Install the Beta?

I get this loaded question every year. Generally, yes, it’s safe. I’ve been running Sequoia since WWDC, and I’ve run into bugs and had apps crash, but I haven’t lost any data, which in my estimation makes it safe.

However, my Mac setup isn’t your Mac setup. I work with a lot of different apps, but they aren’t going to be the same mix as yours, so I always encourage anyone interested in macOS betas to think about what they would do if a critical app is broken by one of the beta releases.

My ‘Plan B’ is a MacBook Pro running Sonoma instead of Sequoia. If something goes terribly wrong with the macOS beta, I can easily switch to that laptop. So, before diving into the beta, have a ‘Plan B,’ make sure all of your data is safely backed up, then go for it.

iPhone Mirroring and Notifications

Apple Podcasts running in the macOS iPhone Mirroring app.

The biggest difference in my day-to-day workflow with Sequoia has been the addition of iPhone Mirroring (when it works) and iPhone notifications on the Mac.

iPhone Mirroring displays your locked iPhone’s UI on your Mac, where it’s fully interactive. To work, your iPhone must be nearby and locked. That makes it perfect for those times when you get a notification on your iPhone for an app that’s not on your Mac but your iPhone is across the room, in your pocket, or in a bag. Another great use case for mirroring is when your iPhone is in StandBy and you see a notification come in. Instead of picking up and unlocking your iPhone in any of these situations, you can simply open the iPhone Mirroring app in your Mac’s Dock.

If an app is landscape-only, iPhone Mirroring switches to that orientation automatically.

Your iPhone’s UI takes a few moments to load in the Mac app, but once it appears, you can use all of your apps as you would on your iPhone using your Magic Trackpad. You can navigate with a mouse, too, but I haven’t been able to scroll app interfaces with my Logitech MX Master 3S mouse. The iPhone Mirroring app also has three useful keyboard shortcuts that allow you to navigate to the Home Screen, App Switcher, and Spotlight Search on your iPhone.

iPhone Mirroring isn’t a feature I’ve found myself using daily, but it can come in handy. For instance, the app that controls my Roomba isn’t available on the Mac. When the vacuum is on another floor of my house, I like to check in on it to see if it’s gotten stuck or needs emptying. In the past, that has meant checking the app on my iPhone from time to time as the Roomba does its thing. With iPhone Mirroring, I can simply open that app in a window on my Mac and flip over to it for a quick status check now and then. It’s still an interruption of what I’m doing, but it’s less so than grabbing my iPhone.

As much as I’ve enjoyed iPhone Mirroring, it has been buggy. In fact, for most of the past week, it didn’t work at all. The problem arose after the release of iOS 18 developer beta 3. When I’d try to connect my iPhone to my Mac via the iPhone Mirroring app, it would tell me that it couldn’t connect because my iPhone’s microphone was in use. The trouble was that the mic wasn’t actually being used. Nothing I tried would fix the problem until, on a whim, I opened the microphone access section of my Mac’s System Settings and toggled microphone access off and then on again for one random app I haven’t used in months, which fixed it.

What made me try that? I don’t know. Call it reviewer’s intuition. I’ve seen a lot of bugs, and I figured it might shake something loose, which it did. Despite those troubles, I’m a fan of iPhone Mirroring. Just keep in mind that it’s still a very new feature and might cause you some trouble if you install the public beta.

It’s less flashy than the iPhone Mirroring app, but getting iPhone notifications on my Mac has really grown on me. They appear just like any Mac notification and are fully interactive.

iPhone notifications from my ecobee security system. Note the small iPhone icon in the corner of the notification indicating its source.

I get it if your first reaction to getting iPhone notifications on your Mac is a negative one. That was certainly how I felt. However, the reality is that it’s been a fantastic addition to macOS because I’m already careful about which apps can notify me on my iPhone. As a result, I’m only getting notifications I want, and being able to deal with them on my Mac is easier than pulling out my phone.

For example, I run the reminder app Due on my iPhone. The app is perfect for tasks you can’t forget to do because it will alert you continually until you check them off. However, many of those tasks are things I need to do on my Mac, and once I’m finished with them, pulling out my iPhone to mark the task as completed is a distraction. Instead, I can do that from my Mac now.

Another good example is Xero, the accounting app we use for MacStories, which wanted me to confirm my identity using its iPhone app when I tried to log in on the web the other day. Xero doesn’t have a Mac app, but with iPhone notifications, its interactive alert asking me to confirm my identity popped up immediately on my Mac, allowing me to quickly gain access to Xero’s website.

Clicking on a Gentler Streak notification opened the app in iPhone Mirroring so I could check my latest workout on my Mac.

I’ve only been using Sequoia for about a month, so the jury’s still out on whether iPhone Mirroring and notifications prove to be sticky long-term, but I think they probably will for a couple of reasons. First, in the 10 days or so that iPhone Mirroring didn’t work on my Mac, I missed it. Second, the longer I’ve used Sequoia, the more I’ve run across iPhone notifications that I’m glad to have appear on my Mac. Both features are what Continuity is great at: reducing friction and creating a more seamless experience across devices.

Window Tiling

Hovering over the green button reveals Sequoia’s window tiling options.

Why it took macOS until 2024 to include basic window tiling is beyond me, but it will finally arrive with Sequoia, and it is nicely done. There are too many third-party apps that have filled this gap in macOS to list, but as well-implemented as window tiling is in Sequoia, I don’t think the best third-party apps have anything to worry about. The new macOS feature will satisfy many users, but the OS’s built-in options are not nearly as extensive as those of apps like Lasso, Rectangle, or Magnet. Power users will still turn to third-party apps, but I’m glad to see there will be a built-in solution for anyone who doesn’t need the added bells and whistles.

Apple has added two groups of window management options with four options each, and they can be accessed in a variety of ways. The first group is ‘Move & Resize,’ which moves and resizes the current window to fill the left half, right half, top, or bottom of your Mac’s screen. The other group is called ‘Fill & Arrange’ and allows you to

  • fill the entire screen with the current window,
  • tile the current and last active window to fill the left and right halves of your screen,
  • tile the current and last two active windows with the current window on the left half of the screen and the other two windows in the right quarters, or
  • tile the four most recent windows in quarters.

Two Obsidian windows with a narrow border around each.

In the current beta, the quarter-sized windows overlap vertically on my Studio Display, which seems wrong and is probably a bug. I’m fairly certain that behavior wasn’t present in earlier betas. Other than that, though, the feature works as expected.

As I mentioned above, there are several ways to invoke window tiling. You can hover over a window’s green button, use a keyboard shortcut, drag a window near the edge of your screen (activating a preview of where your window will be tiled), or ⌥-drag a window in the direction of where you want it to be tiled, which I find is easiest.

The System Settings app also includes a few options for how window tiling works. There are toggles to turn off tiling by dragging windows to your screen’s edges or using the Option key. Also, you can specify whether tiled windows should touch or leave a narrow margin between them.

Window Tiling options are found in the Desktop & Dock section of System Settings.

I rarely do anything other than fill my screen with one window or use two apps side-by-side in equally-sized windows. For that, Raycast has done the trick for me, but over time, I’ve found myself using the new Sequoia tiling options more and more. It’s hard to beat Raycast’s keyboard-driven efficiency, but the built-in drag options have grown on me, especially when I’m already using my mouse or trackpad.

Notes

Notes continues to add powerful features year after year. What was once a simple text editor with limited formatting options now rivals many of the best third-party note-taking apps. Chief among the changes coming this fall is the ability to record audio that is transcribed live inside your notes. The transcription can be mixed with typed text and other media in the same note as well. When your transcript is visible, the related text is highlighted as the audio plays back, similar to how lyrics are highlighted in the Music app when a song is playing.

Transcription is a fantastic addition to Notes.

The Notes toolbar has a new button that looks like a sound wave and opens a panel on the right side of a note that reveals the audio recording interface. At the bottom of the panel are playback controls, a record button, and a transcription toggle that switches between a visual representation of your recorded audio and its transcript. At the top of the panel is a ‘More’ button with options to share your audio, copy the transcript, incorporate the transcript’s text into your note, and find text within the transcript. There are also options to save the audio as a separate file or delete it. Overall, the new transcription tool is a great addition to Notes and is going to be perfect for students and anyone who wants to create a record of a meeting.

Math Notes made a splashy debut on the iPad, where it works with the Apple Pencil, but it’s available on the Mac too. Just type an equation, and when you hit the ‘=’ key, a preview of the result will appear after the equal sign. Type Return or Tab, and the result will be added to your note. You can even assign numbers or equations to words and then use the words as variables in separate equations. Unfortunately, Math Notes does not work inside tables embedded in notes.
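Here’s a hypothetical example of the variables feature: type the first two lines into a note, and Notes fills in the result after the equal sign on the third (typing * for multiplication works, too):

    coffee = 4.50
    days = 22
    coffee × days = 99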

There are a few other refinements to Notes, too, including

  • the ability to highlight text in any of five built-in colors,
  • inline searching of PDFs, which has some visual glitches currently, and
  • collapsible heading sections, which will make navigating long notes much easier.

An example of a collapsed and uncollapsed section of a note.

Few system apps have received as much attention over the past few years as Notes. The result is an app that works in a variety of contexts for a wide range of users. Notes may also benefit from the writing tools coming with Apple Intelligence, but even without them, this fall’s update to the app promises to be another good one.

Passwords

Federico and I finally got one of our long-term wishes this year with the introduction of a standalone Passwords app on the iPhone, iPad, and Mac that syncs between devices securely using iCloud. I have been slowly but surely transitioning my saved logins from 1Password to Apple’s system for a couple of years in anticipation of this day, and it has paid off. When I opened the new Passwords app on my Mac, it was already pre-populated with over 1,500 passwords, passkeys, verification codes, and Wi-Fi credentials. The app also collects the apps and websites where you’ve used ‘Sign in with Apple’ or ‘Hide My Email’ and includes both a Security category alerting you to any issues with your passwords and a Deleted section where you can recover any recently deleted passwords. There is a section that collects shared passwords, and the app supports importing and exporting passwords, too.

What you won’t find in Passwords is the ability to save attachments or take notes about accounts. That’s too bad because I’ve used 1Password to securely store important legal documents and add notes to shared passwords about how to use certain web accounts in the past. However, with password-protected shared notes in the Notes app, you can partially accomplish the same result, albeit in a different app.

The Passwords app will look very familiar if you’ve used Passwords in Safari’s settings before.

If you’ve used the Passwords section of Safari before, you’ll have no trouble acclimating to the new Passwords app because it includes all the same functionality. If you forget about the app and head to Safari’s settings instead, you’ll find a button that will open the Passwords app, which I think is a great touch because I opened Safari’s settings out of habit the first week I was using Sequoia, and I’m sure other people will too.

While the Passwords app doesn’t break any new ground in terms of features this year, the fact that it is a standalone app is significant. By its very existence, it signals to users the importance of passwords, making them more central to users’ daily computing experience. What’s more, with smart sorting options and a simple interface, it makes good security habits easy to get started with and not intimidating, which I love.

Messages

This fall, you’ll be able to schedule messages.

I spend a lot of time in Messages on my Mac. It’s not as fun as the app’s iPhone and iPad versions, but at least most of the same features are available everywhere. This year, that will include the ability to schedule a message for sending later and text effects.

Scheduling messages to be sent later is a nice addition, but it’s not one I expect I’ll use often. With friends and MacStories colleagues in a wide range of time zones, it’s understood that the burden is on the recipient to silence their notifications at night or when they don’t want to be bothered. But that doesn’t work for everyone; I have family members who don’t live by what I think of as the International Messaging Treaty. For them, it will be nice to have the option to set a message so it arrives at a sane time of day. Likewise, I’m sure I’ll use the feature when I know someone is busy, pushing a message out until later whether they have silenced notifications or not, so I don’t distract them.

The text effects that can be added to messages are the same as you’ll find on the iPhone and iPad. There are eight: Big, Small, Shake, Nod, Explode, Ripple, Bloom, and Jitter. However, unlike on the iPhone and iPad, you don’t get a preview of the effect before you pick it on the Mac. Instead, you select the text you want to add an effect to and apply the effect, which is then previewed in the Messages text field before you click send. Messages will also add more standard text styling this year with bold, italics, underline, and strikethrough options.

Messages and text effects.

Finally, as in iOS and iPadOS 18, Tapbacks are now colorful and can include emoji and stickers. Once activated, the Tapback popup can be scrolled horizontally with a swipe to reveal your most frequently used Tapbacks, emoji, and stickers, too.

Calendar and Reminders

Reminders is integrating with Calendar.

Sequoia continues Apple’s trend of integrating system apps with each other by giving you the option to display Reminders tasks alongside your events. The integration is enabled from the left sidebar under the ‘Other’ section, but even then, not every task will appear in Calendar. Instead, the calendar will only show tasks that have been assigned a date, which I think makes sense for what most people would want to see on their calendars.

In addition to bringing scheduled tasks into Calendar, you can now create, complete, and delete tasks from within Calendar using an interface that’s a lot like what you’re already familiar with in the Reminders app. Apple has updated the interface of Calendar’s month view, which distinguishes between events (using a color-coded bar) and tasks (using a color-coded dot) too.

Task integration probably isn’t a feature I’ll use because I don’t like the visual clutter of mixing events with tasks. However, I know a lot of people rely on a system of strict event and task scheduling, and this will be an excellent addition for them.

Safari

The only article I’ve found that uses Safari Reader’s new summary feature takes Apple’s side regarding the DMA. 🤔

This fall, Safari Reader will add a sidebar with a table of contents for navigating within a webpage, beneath which you’ll find a summary of the article. So far, this feature seems to have been applied to vanishingly few articles. I browsed several websites and the longest stories I could find on Longreads, and I’ve only found one article that included a short summary and no table of contents. Perhaps the feature isn’t finished yet, but I expected to be able to find more examples. You’ll know a summary is available if you see a purple sparkle icon overlaid on top of the icon on the far left side of Safari’s address bar.

The same gleam will appear when you visit other websites, indicating that the Highlights feature is available. Highlights surfaces key information from web pages, like pulling the address from a hotel’s site and offering a map in a popover window. The feature also offers summaries of articles that are identical to what you’ll find in Safari Reader and information about people, music, movies, and TV shows. In my early searches, I haven’t found any Highlights featuring people or music, and movie and TV show Highlights have been limited to Wikipedia pages, but I expect this feature will become more widespread later.

Highlights is handy for locating hotels as long as they are national chains.

An example of Highlights linking to a TV show in the Apple TV app, although Amazon Prime isn’t available to stream in that app from the Mac.

Finally, Safari has an excellent new viewer interface for videos. You invoke the viewer from the same address bar button that activates Safari Reader and Highlights. On a site like YouTube, the video takes over a bigger portion of the screen, and the rest of the site fades into the background. QuickTime-like controls appear at the bottom of the video too. If you switch to another app, the video automatically continues playing in a resizable and re-positionable Picture-in-Picture window. My early experience with the new viewer has been nothing but positive. Having one set of controls for every video regardless of the website makes life easier and reduces the clutter of sites in the same way Safari Reader does for text.

Watching Comfort Zone (https://www.youtube.com/watch?v=yI382L5g6XU&list=PLDSpdP6G2mHuQfDP4-fM8XQxLYmc90xNi) in Safari’s new video viewer.

Everything Else

A hiking trail at Yosemite National Park.

There are other changes coming to macOS Sequoia this fall that I’ll cover in greater depth in my full review, but here’s a quick rundown of the highlights.

Video Conferencing: Video conferencing has been a highlight of macOS updates for the past few years, and I have a hard time getting excited about it because I don’t do many video calls. That said, there are useful new features like the ability to replace your background with one of Apple’s built-in options or your own photos and a presenter preview that lets you see what you are going to share before you share it and limit what others can see.

Maps: This year’s Maps update is all about getting outdoors. There are new topographic maps and hikes in U.S. national parks, and you can save your own hiking routes, too. The new maps and hikes are excellent, so I hope they expand to other countries and other U.S. parks in the future. A new Places Library has been incorporated into Maps’ toolbar as well. With the click of a button, search results can be added to your Places Library, where you’ll also find saved guides and hiking routes.

Freeform: This fall, you’ll be able to define areas of a board called Scenes that can be played back in sequence as a slideshow or shared individually. In addition, the app will add a new snap-to-grid feature for easily aligning board elements.

macOS Sequoia

Sequoia: A Mac and iPhone 2-in-1.

I’ll cover a few additional odds and ends along with everything else in my macOS Sequoia review this fall. By then, some (but not all) of the Apple Intelligence features should be entering beta, which I expect will help round out the feature set beyond where it is today.

That said, my first month with Sequoia has been a little disappointing. It’s not that the features that have been added aren’t welcome additions to the OS; it’s just that few will make an everyday difference to most Mac users. I know from my day-to-day work on the Mac Studio where I’m running Sequoia that most days, I forget I’m even running a beta.

On the one hand, that’s good because, other than a few bugs here and there, the experience has been smooth. On the other hand, though, there remain many apps and features like Music, Stage Manager, and Shortcuts that have barely been touched in recent years and are in need of attention. Have those things been put on the back burner in favor of Apple Intelligence? Only Apple can answer that question. However, regardless of the answer, I hope Apple Intelligence is worth the wait. My first month with macOS Sequoia has been a positive one on balance, but at the same time incomplete and a little underwhelming.


You can also follow our 2024 Summer OS Preview Series through our dedicated hub, or subscribe to its RSS feed.


watchOS 11: The MacStories Public Beta Preview https://www.macstories.net/stories/watchos-11-the-macstories-public-beta-preview/ Mon, 15 Jul 2024 20:22:11 +0000 https://www.macstories.net/?p=76042

Last year, Apple declared watchOS 10 the biggest update to the Apple Watch’s software since its introduction. I don’t think that was actually the case, but there were undoubtedly some notable changes to how we interact with our watches every day, with the introduction of the Smart Stack being key among them.

While Apple hasn’t forgotten about UI enhancements like the Smart Stack, this year sees the company turning its focus back to health and fitness tracking with some significant new features in those areas. I’ll be saving a deeper dive into the software update – including all the tiny changes and fun additions – until the fall, but with the watchOS 11 public beta going live today via the Apple Beta Software Program, now is the perfect time to go over the key features Apple has in store for Apple Watch users.

Smart Stack

The introduction of the Smart Stack last year ran hot and cold with many users. Some found it difficult to relearn years of muscle memory previously tied to interacting with the watch face (by turning the Digital Crown) or bringing up Control Center (by swiping up from the bottom of the screen). These actions were remapped to open the new Smart Stack instead. Almost a year after I installed the watchOS 10 beta, I still occasionally swipe up from the bottom of my Apple Watch screen, trying to bring up Control Center.

However, change can be good, and the new Smart Stack held a lot of promise for widgets from both system apps and third-party ones. Unfortunately, since the initial burst of activity from developers, new widgets have become quite rare. I came to feel that the Smart Stack, while certainly useful, was not quite fulfilling its potential. This could all change with watchOS 11, which introduces several compelling features to the Smart Stack, including Live Activities.

Live Activities from Crouton, Flighty, and Timery.

Live Activities were introduced in iOS 16 alongside the iPhone 14 Pro and its Dynamic Island. While the Dynamic Island has been useful, I (and I suspect many others) mainly use it to interact with media playback. Live Activities on the Home Screen, on the other hand, have gone from strength to strength. Many developers have adopted them, and now, they’re coming to the Smart Stack in watchOS 11.

Apple’s implementation of Live Activities on watchOS is impressive in terms of how easy it is to use and how customizable it is. To start with, you don’t have to have an app installed on your Apple Watch to use its Live Activity. If you have the app on your iPhone and start a Live Activity there, the same Live Activity will automatically launch on your Apple Watch. It’s incredibly simple, and it also makes the feature discoverable for all users, as the Smart Stack will auto-launch when the Live Activity starts and even stay open when you put your wrist down.

I initially thought it would be annoying to have every Live Activity automatically appear on my Apple Watch, but in the Watch app on iOS, you can customize Live Activity behavior on a per-app basis. For example, if you start a timer with Timery, you can specify that its Live Activity should appear at the top of the Smart Stack, but only when you rotate the Digital Crown; you can also turn it off completely. I instantly turned off auto-launch for all media apps since I’m constantly listening to something on my iPhone and would never see my watch face otherwise.

Fine-tuning Live Activity settings in the Watch app on iOS.

The feature feels very well thought out, even down to the dismissible mini-widget that appears if you’re doing something on your Apple Watch and a Live Activity starts on your phone. It brings a lot of new functionality to the Smart Stack without any extra effort on the user’s part.

Another addition to the Smart Stack is interactive widgets. Whereas in watchOS 10, tapping on a widget would launch its app, you can now control apps directly via buttons within a widget. Apple has shown this capability off with widgets for the Home app, but it’s a little disappointing that they are limited to only a single button, which seems to be Apple’s recommendation for all apps.

New interactive widgets on watchOS 11 take a cue from Live Activities such as the timer.

Lastly, there are suggested widgets that appear in the Smart Stack automatically based on a number of parameters. These work pretty much the way you’d expect. For example, if my Activity rings are low and it’s getting into the late afternoon, the Activity app’s Rings widget will appear in my Smart Stack. As with Live Activities, you can specify which widgets can be suggested and which should not appear.

Suggested widgets and customizing which ones can be suggested.

After trying them for a month, I really like these new additions. It feels like the Smart Stack is now a more complete feature and presumably closer to what Apple imagined it to be in the first place. However, I still feel the huge clock and date that appear at the top are a waste of space. The clock may now reflect the font of your current watch face, but surely that area could be better utilized with a couple of complications or another widget alongside a small time indicator in the corner.

Training Load

This year, Apple has introduced two new features that push what the Apple Watch can do further into ‘pro’ territory. The first is Training Load. You might be familiar with the concept of training load if you’re a serious athlete or if you’ve ever owned a Garmin smartwatch. It’s a way of measuring the cumulative amount of exercise performed over a week and its impact on the body. Typically, metrics like duration and intensity are used to calculate this figure, but whereas duration is simple to measure, intensity is much harder to quantify.

Training Load on watchOS 11.

Apple determines intensity using what the company calls an ‘effort’ rating. Every completed workout is categorized as either easy, moderate, hard, or all-out. These scores are calculated automatically for cardio workouts, though you can adjust them manually. For non-cardio workouts such as weight training, you’ll be prompted to add your own effort rating when you’re finished exercising.

The effort score is multiplied by the duration of the workout to calculate your training load. (See, I told you this was a pro feature.) Figures for the last seven days are weighted against the previous 28 days to give you a chart showing whether your training load is going up or down and to what extent.
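To make the arithmetic concrete, here’s a minimal sketch in Python of the effort-times-duration calculation and a 7-day-versus-28-day comparison. Apple hasn’t published its exact weighting, so the ratio below is an illustrative assumption rather than the real algorithm, and all of the names are mine.

```python
# Illustrative sketch only: Apple hasn't published its exact weighting.
# Effort ratings are assumed to map to 1 (easy) through 4 (all-out).
from datetime import date, timedelta

def workout_load(effort: int, minutes: float) -> float:
    """A workout's load: effort rating multiplied by duration."""
    return effort * minutes

def training_load_trend(workouts: list[tuple[date, int, float]], today: date) -> float:
    """Compare average daily load over the last 7 days to the prior 28 days.

    `workouts` is a list of (date, effort, minutes) tuples.
    Returns a ratio above 1.0 when recent load is trending up.
    """
    def avg_daily_load(start: date, end: date) -> float:
        days = (end - start).days
        total = sum(workout_load(e, m) for d, e, m in workouts if start <= d < end)
        return total / days

    recent = avg_daily_load(today - timedelta(days=7), today)
    baseline = avg_daily_load(today - timedelta(days=35), today - timedelta(days=7))
    return recent / baseline if baseline else float("inf")

# Example: a 'hard' (3) 40-minute run contributes 3 * 40 = 120 load points.
```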

Training Load data is also available in the Fitness app on iOS.

I’ve been impressed with this feature thus far. Seeing my progress in the past week compared to the previous month is great for motivation and allows me to moderate what I do to the correct level. While it’s not a full-blown pro fitness tracker like Athlytic, Apple has done a good job of bringing a pro feature to the Apple Watch and making it easy for the average user to understand.

Vitals

The Vitals app, new in watchOS 11, is the other feature pushing watchOS to pro levels this year. It allows you to monitor key overnight health metrics – average heart rate, respiratory rate, wrist temperature, blood oxygen, and sleep duration – and receive notifications if one of them is outside of your normal range. This might sound like Apple is simply taking metrics already available in the Health app and putting them together in a different way, but what the company is doing here is actually quite interesting.

The new Vitals app in watchOS 11.

Once you’ve worn your watch overnight for an initial period of seven days, the Vitals app calculates your typical range for each metric. However, it’s not quite that simple. Apple is using data from its Heart and Movement Study to generate the algorithm behind these calculations. The company hasn’t gone into more detail than that, but it suggests the Vitals app is doing far more than taking an average of each metric and drawing a standard range on either side.
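To see why that matters, here’s what the naive version would look like: take a week of readings and draw a band of two standard deviations around the mean. This is a hypothetical baseline shown for contrast, not Apple’s method, which presumably personalizes the range using its study data.

```python
# Hypothetical naive baseline for a "typical range" -- NOT Apple's algorithm,
# which reportedly draws on Heart and Movement Study data instead.
from statistics import mean, stdev

def naive_typical_range(readings: list[float]) -> tuple[float, float]:
    """Mean +/- 2 standard deviations over a week of overnight readings."""
    mu, sigma = mean(readings), stdev(readings)
    return (mu - 2 * sigma, mu + 2 * sigma)

# Seven nights of average overnight heart rate, in bpm (made-up numbers).
low, high = naive_typical_range([58, 61, 57, 60, 59, 62, 58])
print(f"Typical range: {low:.1f}-{high:.1f} bpm")
```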

If a metric is out of its typical range, the Vitals app will notify you when you wake up with information about what’s out of range and what might be causing it. I’ve been lucky enough not to have received any of these notifications yet, but the idea that your Apple Watch could suggest tangible reasons for a change in your vitals seems like a welcome addition.

Information from the Vitals app is also available in the Health app on iOS.

You can choose to display your latest vital measurements in the Smart Stack, and they also integrate with Training Load to give you additional context when planning your exercise for the day.

Fitness and Activity

Pausing your rings in watchOS 11.

watchOS 11 includes arguably the most-requested Apple Watch ‘feature’ ever, after years of users asking for it: the ability to pause your Activity rings. It’s frankly ridiculous that this hasn’t been possible until now, but I’m glad it finally is. The option has been introduced nicely, too. When you pause your Activity rings, you can choose to do so for today, a week, a month, or a custom time period up to 90 days. Additionally, the day before your Activity rings are due to resume, you will receive a notification asking if you’re “ready to get back to it,” with the option to extend the pause if you wish to.

You can also adjust each Activity ring goal based on the day of the week. For me, weekends are always filled with family activities, so it’s great to be able to reduce my Move ring goal for those days and not feel the pressure to do a workout or pause my rings.

Creating a goal schedule in the Fitness app on iOS.

Over on iOS, Apple has introduced the ability to personalize the Summary tab in the Fitness app, enabling you to rearrange tiles and prioritize the information you care about most. I enjoyed playing around with these tiles, and it’s good to be able to feature an activity I do a lot, like cycling, prominently on the screen. However, in classic Apple style, each element is very chunky, and you can’t move the huge Activity rings tile at the top, which takes up over a quarter of the screen. Even shrinking that tile down to just show the rings themselves would be welcome. Nevertheless, it’s a nice change to the app.

Customizing the Fitness app on iOS.

Finally, as is tradition each year, Apple has improved the Workout app, adding distance as an in-session metric to more workouts and route maps to outdoor rowing and cross-country skiing. Pool swims now support custom workouts, and there is a new Up Next view in the app that shows how much of the current interval you have left and what the next interval will be. I don’t personally use any of these new Workout features, but many will, and it’s good to see Apple continue to push what the Apple Watch can do with their already massive array of workout options.

Photos Watch Face

Creating a Photos face in the Watch app on iOS.

Apple is very proud of the Photos face, which the company says is the watch face used most by Apple Watch owners. In watchOS 11, the company has brought its old friend machine learning along for the ride. Now, just like when you choose photos for your Lock Screen in iOS, your Apple Watch will surface suggested photos that the system thinks will work well as a watch face. Also, as on iOS, you can adjust selected photos with tints and use depth data to position elements in front of or behind the time.

This is, well, fine. It’s what you’d expect, but it’s a bit difficult to get excited about a customizability feature like this when what we really want are third-party, custom watch faces. Nevertheless, this will be a great enhancement for the many people who already use the Photos watch face.

Other Notable Additions

The Translate app is now available on the Apple Watch. It’s essentially the same as the iOS version, except that you can’t use voice and text input at the same time; you have to choose one or the other. My wife is Spanish, so I’ve been able to test the app out quite a bit, and it’s a solid translation tool that’s even handier now that it lives on your wrist.

The Translate app on watchOS.

One addition that I will never be able to test due to my inability to experience the miracle of childbirth is new support for pregnant users. The existing Cycle Tracking feature will now show your gestational age when you are pregnant and enable you to track various symptoms. You will get a notification if your heart rate goes below the normal threshold, receive prompts to take a mental health assessment, and be offered the option to add your pregnancy status to your Medical ID. Apple’s focus on pregnancy health is impressive, and I hope the company continues to add more – such as the option to bring in blood pressure data for additional context, even though the Apple Watch doesn’t measure it (yet).

Maps in iOS 18 adds thousands of curated hikes across America’s national parks, as well as the ability to create custom routes, and you can access these routes from your Apple Watch. It’s not made explicitly clear, but the Maps app on the Apple Watch seems to sync its library of routes with the Maps app on iOS: if you’ve downloaded a route for offline use on your iPhone, the Apple Watch will do the same. You can browse routes on your Apple Watch, but you can’t save them for offline viewing directly from the watch, which is a bit frustrating, especially if you’re out walking with your cellular watch and want to download a route before moving away from cell service.

A curated route in Maps on watchOS.

In addition to support for enhanced ticketing in the Wallet app and a new Double Tap API for developers, there is one more notable new feature in watchOS 11: Check In.

Check In is available in Messages (left) and Workout (center and right).

Check In was introduced last year in iOS 17 as a safety feature for letting friends and family know when you arrive at a destination and alerting them if you are later than planned or outside of your expected destination. Just like on iOS, the feature is available in Messages on watchOS 11, but you can also use it in the Workout app. By swiping to the right during a workout and tapping the Check In button, you can choose to notify a contact when your workout ends. If your speed increases abnormally or if your heart rate decreases to near resting when it shouldn’t, you will receive a notification on your Apple Watch, and your selected contact will be notified if you don’t respond. Check In is a great feature, and it’s good to see Apple extending it to watchOS and even taking it a step further with exercise integration.

watchOS 11

Apple Watch updates often seem on the lighter side post-WWDC, but after a month with watchOS 11, I’ve come to appreciate it as a significant upgrade. While last year’s update was geared mainly toward UI changes, this year’s is firmly focused on the health and fitness aspects of the Apple Watch.

It’s also notable that Apple has expanded into two areas – the new Vitals app and Training Load – that use already-available data to enable more advanced features. As Apple continues struggling to develop new sensors for the Apple Watch, it seems that the company has been pushed to increase the device’s usefulness with more advanced health and fitness capabilities. They’re very welcome, and while Apple certainly isn’t sleeping on features like the Smart Stack, these sorts of additions emphasize how much potential the Apple Watch has as a device for serious athletes as well as everyday users.

If you want to try out these new features, you can now do so with the watchOS 11 public beta, available here.


You can also follow our 2024 Summer OS Preview Series through our dedicated hub, or subscribe to its RSS feed.


LGBT and Marginalized Voices Are Not Welcome on Threads https://www.macstories.net/stories/lgbt-and-marginalized-voices-are-not-welcome-on-threads/ Fri, 12 Jul 2024 15:00:17 +0000 https://www.macstories.net/?p=75984

As Twitter was crumbling under Elon Musk’s new leadership in 2023, various online circles found themselves flocking to alternative platforms. While some may have kept using Twitter (now known as… X), a non-negligible number of communities migrated over to Mastodon and other smaller platforms. Meanwhile, Meta shipped its own textual social media platform, Threads. The service initially launched in most parts of the world except for the European Union, but it’s been available in Europe for over six months now and has seen its usage soar.

For many, Threads understandably felt like a breath of fresh air following the chaos that engulfed Twitter. Unlike the latter, Threads is not run by someone that I and many others find to be an exceptionally despicable human. Its algorithmic timeline contrasts with Mastodon’s exclusively chronological feeds, and its integration with Instagram has attracted a number of big names and stars.

I’m an activist. In my daily life, I work and advocate for the advancement of trans people’s rights in France. As a result, my expanded online social circle mostly consists of LGBT people, and most of them are activists, too. However, in the span of a few months, almost everyone in that circle who was excited about Threads launching in Europe has now stopped using it and migrated back to Twitter, Mastodon, or elsewhere. When I ask around about why those people left Threads behind, their responses vary, but a trend persists: most felt like they were being shadow-banned by the platform.

Without hard data, it is difficult to investigate this feeling and to understand whether it is truly widespread or specific to some online bubbles. But one thing is certain: Threads hasn’t felt like a breath of fresh air for everyone who tried it. In my experience as a trans woman, at its best, it has felt like Jack Dorsey’s old Twitter: a social platform saddled with an opaque moderation system, free-roaming hate speech, and a frustrating algorithm that too often promotes harmful content.

As the months have gone by, incidents in which Threads failed to uphold its implicit promise of a better-moderated, Twitter-like platform have added up. Today, for many non-white, non-straight, non-male users, it is a repulsive social media experience, one where their voices are silenced and where the hate speech offenders who target them go unpunished.

Let’s talk about this.

This sentiment has only grown more prevalent among LGBT people over time. You need only rewind to January 2024, soon after Threads launched in Europe, to find something pretty disgusting that started happening in people’s For You feeds. Homophobic and transphobic posts kept plaguing the app, despite people’s efforts to report these posts and hide them from their timelines. This wave of hateful content also included anti-abortion posts, and reports of this happening were widely shared. Worse, these posts were often pushed into people’s feeds on Instagram as well, prompting some trans people to steer clear of Threads altogether.

This widespread incident is the only one so far that has prompted an official response from Threads, through the voice of Instagram’s head Adam Mosseri. However, that response is very telling of Meta’s approach when it comes to addressing these moderation issues. In his 23-second video, Mosseri acknowledges that there have been “low-quality recommendations” in users’ feeds but makes no explicit mention of what these recommendations were actually made of: anti-abortion comments, sexism, and violent homophobia and transphobia.

In March, this spike of violent hate speech targeted at LGBT people was followed by the release of a damning report from GLAAD, a renowned non-profit organization focused on LGBT advocacy. This report, entitled Unsafe: Meta Fails to Moderate Extreme Anti-trans Hate Across Facebook, Instagram, and Threads, paints a picture that has come as no surprise to any trans or marginalized person who has ever used any of Meta’s platforms for a significant amount of time. It puts forward a sample collection of viral harmful posts that, despite being in clear violation of Meta’s policies, have remained in circulation across Facebook, Instagram, and Threads.

For anyone doubting that there is a consistent, reproducible, and wide-ranging problem of Meta failing to moderate harmful content targeted at marginalized people on its platforms, I highly encourage you to set aside some time and read through the report.

In its introduction, GLAAD writes,

Characterized by fear-mongering, lies, conspiracy theories, dehumanizing tropes, and violent rhetoric, these posts — many by high-follower accounts — aim to boost engagement, generate revenue, and seed hateful narratives about trans, nonbinary, and gender non-conforming people. These accounts profit from such hate, and so does Meta and its shareholders. Meanwhile, LGBTQ people and other targeted groups experience an increasing number of well-documented real-world harms stemming from these long-term anti-LGBTQ propaganda campaigns, driven by the anti-LGBTQ extremists that Meta allows to flourish on its platforms.

Since the release of the GLAAD report, Meta has not offered any further acknowledgment of this long-standing issue. Instead, in February 2024 – a couple of weeks before the report was released – the company introduced a worrying and seemingly unrelated policy change: Meta would now opt all of its users out of “political content” in its platforms’ algorithmic timelines. Along with that change, a new toggle was added in Instagram’s settings panel. If you want to see political content in your feed, you need to flip the switch.

Since it was announced, the nature of what would be filtered by this new, on-by-default ‘Political Content Control’ on Instagram and Threads has never been clarified. On an Instagram support page, the company states: “Political content includes content that mentions: Governments, Elections, Social topics.”

What are “social topics”? Does that include black people speaking out against systemic racism? Does that include LGBT people speaking out against homophobia and transphobia? Does that include anyone identified as a part of a socially marginalized group of people? I am worried that it does – especially thinking back to my online circle of LGBT activists who have all left Meta’s platforms behind in favor of Twitter, Mastodon, or Bluesky, because they were all under the impression that they were being shadow-banned. I feel certain that opting everyone on the platform out of seeing political content has only contributed to this impression.

But I’m not alone. In the wake of this change, hundreds of political and news content creators, LGBT activists, and journalists have signed an open letter to Meta asking the company to reverse its decision.

They write:

…Meta’s vague definition of political content as “likely to mention governments, elections, or social topics that affect a group of people and/or society at large” endangers the reach of individuals and organizations whose identities and/or advocacy have been rendered a ‘social topic’ in this country. This undermines the reach of marginalized folks speaking to their own lived experience on Meta’s platforms and undermines the reach of advocacy work in important areas that have become ‘social topics’ including climate change, gun violence prevention, racial justice, transgender rights, and reproductive freedom to name just a few.

Meanwhile in France, during that same month of February, Le Coin des LGBT+ (@lecoindeslgbt) was inexplicably suspended from Threads and Instagram. The account is extremely popular in the country, as it focuses on relaying LGBT news and events and plays a key role in online advocacy for LGBT people’s rights. Today, the account is still up and running on Instagram, but if you try looking for it on Threads, all you will find are countless messages of outrage from people asking Meta to reinstate the account.

While Le Coin des LGBT+ was restored a few days after its suspension without any explanation, the account’s holders chose not to stick around on Threads, limiting their activity to Instagram and Twitter. I would have made the same choice. For the few weeks that it was around on Threads, every single one of the account’s posts was bombarded with insults, death threats, and literal Nazi imagery. Day after day, I spent hours trying to report every single one of them. Not only did the swastikas stay up and reappear faster than I could report them, but every single one of my reports was also met with a cold, disturbing, automated email from Instagram telling me that “no violation was found.”

The extremely worrying thing is that we don’t know how often this happens. Le Coin des LGBT+ has a huge following, and as a result, the issue was widely relayed on both Instagram and Threads. But what about all of the victims who have fewer than a few hundred followers? Is light ever shed on the threats, insults, and hate that they have received?

This is all starting to add up. If you’re an activist, a journalist reporting on issues affecting LGBT people, or an LGBT content creator, Threads is now both silencing your voice and exposing you to death threats.

Quite recently, I once more experienced this frustrating and harmful opacity around Meta’s moderation. Snap legislative elections took place in France this month. I will spare you the summary of this chaotic political situation at home, but long story short: we were at risk of an extremist far-right party rising to power. For all minorities and LGBT people in France, sharing our voices and our concerns online about this imminent threat was of crucial importance.

On the opposite end of the French political spectrum, an alliance of all the left and green parties was formed, and it became the only real hope for all marginalized people to elect a government that would further their rights instead of repressing them. This alliance is called the New Popular Front. One of its main ways of raising awareness online during the campaign was sharing its political manifesto, accessible via the following URL: nouveaufrontpopulaire.fr.

However, a mere week before the first round of the legislative elections, people started to realize that this URL was blocked by Meta. It suddenly became impossible to post a link to the New Popular Front’s website on any of Meta’s platforms: Facebook, Instagram, and Threads. I documented this finding myself on Mastodon and Threads. The URL was disallowed in posts, in Instagram stories, and even inside conversations in Messenger and Instagram DMs, with no explanation.

The New Popular Front’s website worked and could be shared on Twitter, Mastodon, and other social platforms without any issues. While many of us thought this ban was perhaps applied to all the running parties in the election, it turned out that sharing a link to the far-right party’s website on Threads, Instagram, and Facebook was still possible. To add insult to injury, Meta started removing all posts on Threads that contained a link to the New Popular Front’s website; one of mine, posted one week prior, was affected. My Instagram and Threads accounts were even suspended for a few hours after that post was removed.

There we were, seven days before a crucial election that could upend our entire lives, unable to share links to one of the few platforms that could save us. This suspension lasted for more than 24 hours, which felt like an eternity in an incredibly fast-paced campaign where every second counted. When Meta lifted the ban on links to the New Popular Front’s website, it once again didn’t issue any statement. Posts that had previously been removed were quietly restored, and moderation appeals that had been filed by hundreds of French users have seemingly been lost in Meta’s automated moderation void.

This was probably a mistake, a glitch in Meta’s automated process that perhaps erroneously flagged the URL as dangerous. But even if that was the case, why did it take more than 24 hours for the company to resolve the problem? Why would they refuse to even acknowledge that something happened? Like I said, it starts to add up. Now, as a black or LGBT person, in addition to having your voice silenced and exposing yourself to death threats on Meta’s platforms, you are also at risk of having your account suspended for sharing a link to a political manifesto just days before a crucial election.

As Threads slowly opens up to the fediverse, federation might seem like an escape hatch. It could be reassuring to think, “Perhaps if I can’t possibly stay active on Threads, I’ll be active on Mastodon, and thanks to federation, people on Threads can still read and follow me.” But I’m not hopeful. Last month, Threads published a new page that lists all of the fediverse servers that it blocks and does not federate with. Any account hosted on one of the Mastodon servers listed on this page cannot follow or interact with Threads users. And when Threads starts allowing its users to follow Mastodon accounts, they will be barred from following any account hosted on the listed servers.

Of course, this is not surprising by any means. All Mastodon servers maintain such a list on their About pages. (You can take a look at my own server’s list of moderated servers here.) These lists often contain the same set of long-known offenders: servers harboring and promoting hate speech, CSAM, and a wide range of gruesome things. Motivated by the resurgence of Meta’s moderation failings, a number of Mastodon servers have also chosen to list Threads as a blocked server.

However, if you take a look at Meta’s list of moderated servers, you are going to notice a pattern. Alongside the well-established set of known offenders, Meta is also blocking a number of well-known Mastodon servers that host LGBT people and marginalized communities. Here is just a handful of them:

  • tech.lgbt (listed reason: “Violated our Community Guidelines or Terms of Use”)
  • eldritch.cafe (listed reason: “Violated our Community Guidelines or Terms of Use”)
  • octodon.social (listed reason: “No publicly accessible feed, violated our Community Guidelines or Terms of Use”)
  • queer.party (listed reason: “Violated our Community Guidelines or Terms of Use”)
  • disabled.social (listed reason: “Violated our Community Guidelines or Terms of Use”)

There are even more with less obvious domain names. While some of these servers’ owners don’t mind being blocked by Threads, having already blocked Meta’s platform themselves, this still raises several questions. Why were these servers blocked? Was it just a reprisal for blocking Threads? Which community guidelines did they breach? Is it nudity, even when it’s only allowed behind content warnings? If they ever reach compliance with those guidelines, will the suspension be lifted? How long does that take? Meta has a form for appealing blocked server decisions, but how are those reviewed? Is the process just as automated as the majority of the moderation decisions made on the platform?

Mastodon and the fediverse are often described as a web of interconnected servers that enforce their own moderation policies toward each other, sometimes at the cost of user experience and clarity when selecting a new server to join. From what I can tell, this now applies to Threads as well, and it adds another layer to the ongoing moderation issues that make Threads and Meta’s other platforms a dangerous and repulsive environment for marginalized people. The escape hatch that the fediverse seemed to represent can now also act as a banishment system. With Meta already unilaterally blocking LGBT accounts and letting the authors of hate speech go unpunished, who’s to say it won’t arbitrarily block other LGBT servers on the fediverse in the future as well?

It adds up.

You may ask, “Why write this on MacStories now? Do you expect Meta to change anything?” I realize that I’ve asked a lot of unanswered questions here, but no, I don’t. I don’t expect Meta to ever really address the fact that, brick by brick, it has rebuilt the same sort of problematic foundation on which Twitter was built under Jack Dorsey’s leadership. I don’t expect it to move away from its dangerous automated moderation systems that silence activists trying to speak out for their lives and the lives of their peers. I don’t expect the company to acknowledge that it may have played a part in artificially boosting far-right discourse on its platforms by banning all links to its main opponent during the last stretch of the French legislative elections.

However, I do have some sincere expectations for you all who are reading this. I expect tech journalists to more systematically report on everything that fits into this pattern. I expect tech podcasters to acknowledge that Threads is only a great alternative to Twitter if you’re a straight, white male. I expect that we all start understanding why some marginalized communities are staying on Twitter despite all of its horrific flaws. I expect that awareness of Meta’s consistent pattern of silencing marginalized voices can help direct funds, donations, and efforts to Mastodon to make it a durable, more wide-reaching alternative.

More than anything, I just expect to be heard. And for Threads, that seems too much to ask for.


AI Companies Need to Be Regulated: An Open Letter to the U.S. Congress and European Parliament https://www.macstories.net/stories/ai-companies-need-to-be-regulated-an-open-letter-to-the-u-s-congress-and-european-parliament/ Tue, 02 Jul 2024 14:00:05 +0000 https://www.macstories.net/?p=75929

CC0: No rights reserved.

Federico: Historically, technology has usually advanced hand in hand with new creative opportunities for people. From word processors allowing writers to craft their next novel to digital cameras letting photographers express themselves in new ways or capture more moments, technological progress over the past few decades has sustained creators and, perhaps more importantly, spawned industries that couldn’t exist before.

Technology has enabled millions of people like myself to realize their life’s dreams and make a living out of “creating content” in a digital age.

This is all changing with the advent of Artificial Intelligence products based on large language models. If left unchecked, we believe the change may be for the worse.

Over the past two years, we’ve witnessed the arrival of AI tools and services that often use human work without consent in pursuit of faster and cheaper results. The fascination with maximizing profits above all else isn’t a surprise in a capitalist industry, but it’s highly concerning nonetheless – especially since, this time around, the majority of these AI tools have been built on a foundation of non-consensual appropriation, also known as – quite simply – digital theft.

As we’ve documented on MacStories and as other (and larger) publications have also investigated, it’s become clear that the foundation models behind different LLMs have been trained on content sourced from the open web without requesting publishers’ permission upfront. These models can then power AI interfaces that regurgitate similar content or provide answers with hidden citations that seldom prioritize driving traffic to publishers. As far as MacStories is concerned, this is limited to text scraped from our website, but we’re seeing this play out in other industries too, from design assets to photos, music, and more. And to top it all off, publishers and creators whose content was appropriated for training or crawled for generative responses (or both) can’t even ask AI companies to be transparent about which parts of their content were used. It’s a black box where original content goes in and derivative slop comes out.

We think this is all wrong.

The practices followed by the majority of AI companies are ethically unfair to publishers and brazenly walk a perilous line of copyright infringement that must be regulated. Most worryingly, if ignored, we fear that these tools may lead to a gradual erosion of the open web as we know it, diminishing individuals’ creativity and consolidating “knowledge” in the hands of a few tech companies that built their AI services on the back of web publishers and creators without their explicit consent.

In other words, we’re concerned that, this time, technology won’t open up new opportunities for creative people on the web. We fear that it’ll destroy them.

We want to do something about this. And we’re starting with an open letter, embedded below, that we’re sending on behalf of MacStories, Inc. to U.S. Senators who have sponsored AI legislation as well as Italian members of the E.U. Special Committee on Artificial Intelligence in a Digital Age.

In the letter, which we encourage other publishers to copy if they so choose, we outline our stance on AI companies taking advantage of the open web for training purposes, not compensating publishers for the content they appropriated and used, and not being transparent regarding the composition of their models’ data sets. We’re sending this letter in English today, with an Italian translation to follow in the near future.

I know that MacStories is merely a drop in the bucket of the open web. We can’t afford to sue anybody. But I’d rather hold my opinion strongly and defend my intellectual property than sit silently and accept something that I believe is fundamentally unfair for creators and dangerous for the open web. And I’m grateful to have a business partner who shares these ideals and principles with me.

With that being said, here’s a copy of the letter we’re sending to U.S. and E.U. representatives.


Hello,

We are writing to you on behalf of MacStories, Inc. in support of legislation regulating:

  • the non-consensual training of large language models by artificial intelligence companies using the intellectual property of third parties for commercial gain; and
  • the generation of AI-based content designed to replace or diminish the source material from which it was created

MacStories is a small U.S. media company that was founded in Italy by Federico Viticci in 2009. Today, MacStories operates MacStories.net and produces several podcasts covering apps, technology, videogames, and the world of media, which draw a worldwide audience centered in the EU and US.

As business owners with a long history of operating on the web, we wanted to share our perspective on Artificial Intelligence (“AI”) large language model (“LLM”) training and some of the products created using them. What’s come into sharp focus for us in the past several weeks is that, as an industry, companies training AI models don’t respect the intellectual property rights of web-based content creators. Moreover, the cavalier attitude of these companies toward decades-old norms on the Internet makes it clear that AI model training and some of the products built with them threaten the very foundations of the web as an outlet for human creativity and communication.

The danger to the Internet as a cultural institution is real and evolving as rapidly as AI technology itself. However, while the threat to the web is new and novel, what these AI companies are doing is not. Quite simply, it’s theft, which is something as old as AI is new. The thieves may be well-funded, and their misdeeds wrapped in a cloak of clever technology, but it’s still theft and must be stopped.

The source of the Internet’s strength is hyperlinks, which create value by connecting people and ideas in a way that is more valuable than the sum of their parts. But as the web grew, discovery became a problem. Google and other companies built search engines that use web crawlers to index the web. Search engines like Google’s are imperfect, but by and large, they offer a fair trade. In exchange for crawling and indexing a publisher’s website, links to that content appear in search results, sending traffic to the publisher. And, if a publisher doesn’t want their site crawled, they can opt out thanks to the Robots Exclusion Protocol by adding a simple robots.txt file to their website. It’s a social contract among the participants of the web that worked for decades before the advent of AI.

However, it turns out that feeding more raw material into an LLM produces models that perform better. As a result, the companies making these models have an insatiable appetite for text, images, and video, which has led them straight to the web, strip mining its landscape for fuel to feed their voracious models.

The trouble with the companies developing LLMs is that instead of offering a fair trade to publishers and other creators and respecting their wishes about whether their content is crawled, they just took it, and in some cases brazenly lied to everyone along the way. The breadth of offenders is staggering. This isn’t just a startup problem. In fact, a wide swath of the tech industry, including behemoths like Apple, Google, Microsoft, and Meta, has joined OpenAI, Anthropic, and Perplexity in ingesting the intellectual property of publishers without their consent and then using that property to build their own commercial products. None of them paid any regard to the Robots Exclusion Protocol. Instead, some offered a way for publishers to opt out of their crawling activities, but only after they’d already taken the entire corpus of the Internet – like a thief offering a shopkeeper a lock after emptying their storefront.

Some companies have gone even further, devising products aimed at replacing the web as we know it by substituting AI-generated web pages for source material, which in many situations amounts to plagiarism. Perplexity Pages, The Browser Company’s Arc Search app, and the incorporation of AI answers in Google Search results are all designed to step between people and the creators of web content. All profess to drive traffic to the source material with obfuscated citations, but as Wired recently reported, which we’ve also seen, the traffic these products drive is negligible.

As technology writers and podcasters, we’ve built careers on our enthusiasm and excitement for new technology and how it can help people. AI is no different. There are roles it can play in fighting disease, climate change, and other challenges, big and small, that are faced by humanity. However, in the race to advance AI and satisfy investors, the tech industry has lost sight of the value of the Internet itself. Left unchecked, this devaluation of Internet culture will undermine the ability of today’s creators to earn a fair wage from their work and prevent the next generation of creators from ever hoping to do the same.

Consequently, on behalf of ourselves and similarly situated Internet publishers and creators, we request that you support legislation regulating the artificial intelligence industry to prevent further damage from being done and to compensate creators whose work has already been misappropriated without their consent. The existing tools for protecting what is published on the web are too limited and imperfect. What’s needed is a comprehensive regulatory regime that puts content creators on an even footing with the companies that want to use what they publish to feed their models. That starts by putting publishers in control of their content, requiring their consent before it can be used to train LLMs, and mandating transparency regarding the source material used to train those models.

Federico Viticci, Editor-in-Chief and Co-Owner of MacStories
John Voorhees, Managing Editor and Co-Owner of MacStories


Wired Confirms Perplexity Is Bypassing Efforts by Websites to Block Its Web Crawler https://www.macstories.net/stories/wired-confirms-perplexity-is-bypassing-efforts-by-websites-to-block-its-web-crawler/ Thu, 20 Jun 2024 15:09:36 +0000 https://www.macstories.net/?p=75856 Last week, Federico and I asked Robb Knight to do what he could to block web crawlers deployed by artificial intelligence companies from scraping MacStories. Robb had already updated his own site’s robots.txt file months ago, so that’s the first thing he did for MacStories.

However, robots.txt only works if a company’s web crawler is set up to respect the file. As I wrote earlier this week, a better solution is to block them on your server, which Robb did on his personal site and wrote about late last week. The setup sends a 403 error if one of the bots listed in his server code requests information from his site.

Spoiler: Robb hit the nail on the head the first time.

After reading Robb’s post, Federico and I asked him to do the same for MacStories, which he did last Saturday. Once it was set up, Federico began testing the setup. OpenAI returned an error as expected, but Perplexity’s bot was still able to reach MacStories, which shouldn’t have been the case.1
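If you want to reproduce this kind of test on your own site, spoofing a user agent with curl is enough to check whether a server-level block is firing; the user agent strings and domain below are simplified placeholders, since the exact tokens each crawler sends vary.

```
# Spoofed user agent of a blocked crawler: a server-level block should return 403.
curl -s -o /dev/null -w "%{http_code}\n" -A "GPTBot" https://example.com/

# A browser-style user agent should still get a 200.
curl -s -o /dev/null -w "%{http_code}\n" -A "Mozilla/5.0" https://example.com/
```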

Yes, I took a screenshot of Perplexity’s API documentation because I bet it changes based on what we discovered.

That began a deep dive to try to figure out what was going on. Robb’s code checked out, blocking the user agent specified in Perplexity’s own API documentation. What we discovered after more testing was that Perplexity was hitting MacStories’ server without using the user agent it said it used, effectively doing an end run around Robb’s server code.

Robb wrote up his findings on his website, which promptly shot to the top slot on Hacker News and caught the eye of Dhruv Mehrotra and Tim Marchman of Wired, who were in the midst of investigating how Perplexity works. As Mehrotra and Marchman describe it:

A WIRED analysis and one carried out by developer Robb Knight suggest that Perplexity is able to achieve this partly through apparently ignoring a widely accepted web standard known as the Robots Exclusion Protocol to surreptitiously scrape areas of websites that operators do not want accessed by bots, despite claiming that it won’t. WIRED observed a machine tied to Perplexity—more specifically, one on an Amazon server and almost certainly operated by Perplexity—doing this on wired.com and across other Condé Nast publications.

Until earlier this week, Perplexity published in its documentation a link to a list of the IP addresses its crawlers use—an apparent effort to be transparent. However, in some cases, as both WIRED and Knight were able to demonstrate, it appears to be accessing and scraping websites from which coders have attempted to block its crawler, called Perplexity Bot, using at least one unpublicized IP address. The company has since removed references to its public IP pool from its documentation.

That secret IP address—44.221.181.252—has hit properties at Condé Nast, the media company that owns WIRED, at least 822 times in the last three months. One senior engineer at Condé Nast, who asked not to be named because he wants to “stay out of it,” calls this a “massive undercount” because the company only retains a fraction of its network logs.

WIRED verified that the IP address in question is almost certainly linked to Perplexity by creating a new website and monitoring its server logs. Immediately after a WIRED reporter prompted the Perplexity chatbot to summarize the website’s content, the server logged that the IP address visited the site. This same IP address was first observed by Knight during a similar test.

This sort of unethical behavior is why we took the steps we did to block the use of MacStories’ websites as training data for Perplexity and other companies.2 Incidents like this and the lack of transparency about how AI companies train their models have led to a lot of mistrust in the entire industry among creators who publish on the web. I’m glad we’ve been able to play a small part in revealing Perplexity’s egregious behavior, but more needs to be done to rein in this sort of behavior, including closer scrutiny by regulators around the world.

As a footnote to this, it’s worth noting that Wired also puts to rest the argument that websites should be okay with Perplexity’s behavior because they include citations in their plagiarism. According to Wired’s story:

WIRED’s own records show that Perplexity sent 1,265 referrals to wired.com in May, an insignificant amount in the context of the site’s overall traffic. The article to which the most traffic was referred got 17 views.

That’s next to nothing for a site with Wired’s traffic, which Similarweb and other sites peg at over 20 million page views that same month. That’s a mere 0.006% of Wired’s May traffic. Let that sink in, and then ask yourself whether it seems like a fair trade.


  1. Meanwhile, I was digging through bins of old videogames and hardware at a Retro Gaming Festival doing ‘research’ for NPC↩︎
  2. Mehrotra and Marchman correctly question whether Perplexity is even an AI company because it piggybacks on other companies’ LLMs and uses them in conjunction with scraped web data to provide summaries that effectively replace the source’s content. However, that doesn’t change the fact that Perplexity is surreptitiously scraping sites while simultaneously professing to respect sites’ robots.txt files. That’s the unethical bit. ↩︎

How We’re Trying to Protect MacStories from AI Bots and Web Crawlers – And How You Can, Too https://www.macstories.net/stories/ways-you-can-protect-your-website-from-ai-web-crawlers/ Mon, 17 Jun 2024 15:22:38 +0000 https://www.macstories.net/?p=75835 Over the past several days, we’ve made some changes at MacStories to address the ingestion of our work by web crawlers operated by artificial intelligence companies. We’ve learned a lot, so we thought we’d share what we’ve done in case anyone else would like to do something similar.

If you read MacStories regularly, or listen to our podcasts, you already know that Federico and I think that crawling the Open Web to train large language models is unethical. Industry-wide, AI companies have scraped the content of websites like ours, using it as the raw material for their chatbots and other commercial products without the consent or compensation of publishers and other creators.

Now that the horse is out of the barn, some of those companies are respecting publishers’ robots.txt files, while others seemingly aren’t. That doesn’t make up for the tens of thousands of articles and images that have already been scraped from MacStories. Nor is robots.txt a complete solution, so it’s just one of four approaches we’re taking to protect our work.

Preventing AI Crawlers Using Robots.txt

The first step, and one of the easiest to implement, is to request that the web crawlers of AI companies not crawl your site using robots.txt. The trouble with this approach is that it’s nothing more than the Internet equivalent of an “AI Bots Keep Out” sign hung on your website. It can be ignored and only works if crawlers identify themselves, which not all seem to do. That said, it’s a good first step and the first thing we did. I highly recommend Dan Moren’s article on Six Colors that I linked to last week for more information about robots.txt and details on implementing it on your site.
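For anyone who wants a starting point, a robots.txt that asks known AI crawlers to stay away looks something like this. GPTBot (OpenAI), Google-Extended (Google’s AI training token), and CCBot (Common Crawl) are documented user agent tokens, but any list like this is incomplete by nature and, again, works only on the honor system.

```
# robots.txt -- a request, not an enforcement mechanism
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```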

Blocking AI Bots at Your Server

We don’t trust AI companies to respect our robots.txt file. After all, they already took our content without our consent. So, we went a step further and blocked known AI crawlers at the server level with the help of Robb Knight. Doing so requires that you know your way around a web server, but it’s more effective than simply editing your robots.txt file. If you want to learn more about configuring your site to block AI crawlers, Robb has written about the work he did for his personal site and MacStories here.
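Robb has documented his actual setup in the posts linked here; as a rough sketch of the technique (not his exact code), an nginx configuration along these lines matches bot user agents and returns a 403. The bot list is illustrative only and would need to be kept current.

```nginx
# Sketch of a server-level block in nginx; the bot list is illustrative.
map $http_user_agent $is_ai_bot {
    default          0;
    ~*GPTBot         1;   # case-insensitive regex matches
    ~*ClaudeBot      1;
    ~*PerplexityBot  1;
    ~*CCBot          1;
}

server {
    listen 80;
    server_name example.com;

    # Refuse known AI crawlers before anything else is served.
    if ($is_ai_bot) {
        return 403;
    }
}
```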

Update Your Terms of Service

I also recommend having a Terms of Service for your website. The New York Times, which is currently litigating OpenAI’s LLM training practices, updated its terms of service late last summer, and we’ve used them as a guide to carefully define how MacStories content, whether it’s an article, image, or podcast, can be used in our own Terms of Service.

Rest assured, you have a lot of latitude for personal use of MacStories content, and we don’t have an issue with commercial uses that draw on reasonable portions of our content, as long as they are properly attributed in line with the content that is used. However, we do not consent to the use of our content for AI model training.

Support Legislation Regulating AI Training

None of the above are complete solutions, which is why we support legislation regulating how AI companies train their LLMs. Last summer, media organizations from around the world signed an open letter asking lawmakers to regulate LLM training, stating:

We, the undersigned organizations, support the responsible advancement and deployment of generative AI technology, while believing that a legal framework must be developed to protect the content that powers AI applications as well as maintain public trust in the media that promotes facts and fuels our democracies.

The letter goes to the heart of something we believe, too. We’re not against artificial intelligence as a technology. Many of the tools being built are promising. However, we don’t believe that it’s right for tech companies worth billions and even trillions of dollars to be given a pass for building those tools on the backs of others’ work, especially in an economic environment where so many online media companies are struggling to survive. It’s just not right.


The solutions above aren’t perfect or foolproof, and as a result, some people have told us that we shouldn’t bother; we should just give in. In a sign of just how strapped media companies are for cash, others have cut deals with AI companies figuring that getting something is better than nothing.

But, here’s the thing. The web is a special place. Every day, it brings people from around the world together to share their thoughts and express their creativity. That’s something nobody should take for granted, and it’s worth protecting. AI is cool and all, but it’s not worth destroying the web.

