iOS and iPadOS 18.2: Everything New Besides Apple Intelligence

Today, Apple is releasing iOS and iPadOS 18.2, the second major updates to the iPhone and iPad’s latest operating system versions. Once again, this release’s main highlight is a wave of new Apple Intelligence features that are now available to the public. And just like in October, we’re covering these new AI features separately in a special story for MacStories readers. Be sure to check out Federico’s story, which goes over the new Apple Intelligence features included in iOS and iPadOS 18.2.

But besides another batch of Apple Intelligence features, this release also includes a series of changes to the system, from updates to Safari, Find My, and Photos to the arrival of new system-wide settings for Default Apps and more. Here’s a roundup of everything new besides Apple Intelligence in iOS and iPadOS 18.2.

New Default Apps Settings

Earlier this year, Apple announced the addition of a new Settings screen in iOS and iPadOS 18.2 for choosing default apps as part of its plan to comply with the Digital Markets Act in the European Union, alongside a redesigned default browser choice screen. While it was possible to change these settings before, choosing a new default app required you to navigate to the individual app’s Settings screen (for example, Settings → Apps → Safari to make Safari the default browser). The new screen centralizes all of the default app options and is accessible at the top of the Apps section in the Settings app.

From this screen, you can set the default app for each of the following features:

  • Browsing the web
  • Sending emails
  • Sending messages
  • Dialing phone numbers
  • Filtering spam calls
  • Managing passwords
  • Using alternative keyboards
  • Performing contactless payments

If you live in the EU, you may notice an additional default app setting that allows you to set a default app marketplace instead of Apple’s App Store.

Note that you may not have any options currently available in one or more of the default app categories. To appear in the list of available defaults, developers will need to update their apps and make sure they qualify in their category.

Safari

Perhaps surprisingly, Safari ships with several new and previously unannounced features in iOS and iPadOS 18.2.

Since the release of iOS and iPadOS 15, we’ve had the ability to customize the browser’s start page with an Apple-provided wallpaper or with any image from Photos. Starting today, Safari offers six new background options, bringing the total to 16. The first two depict abstract landscapes, the next two are renders of some of Safari’s glyphs and icons, and the last two form patterns of curvy shapes and gradients. I’m a fan of the pastel color choices in these new backgrounds, but I do wish Apple would also make them available as wallpapers on the Lock and Home Screens.

These are the six new background options for Safari in iOS and iPadOS 18.2. They are also available on the Mac in Sequoia 15.2.

In addition to the new backgrounds, Safari now supports linking to text highlights on web pages. The idea here is that you can select text on any web page, tap ‘Copy Link with Highlight’, and share the copied link so that anyone opening it will immediately be shown the specific text you highlighted in yellow on that web page. It’s a small but amazing functionality for any kind of work that relies on referencing things from the web. This is what it looks like on the iPhone:

To share a highlighted portion of text on a web page in Safari, select the text, scroll the edit menu (1), and tap ‘Copy Link with Highlight’ (2). Anyone opening the link in a supported browser will automatically be directed to the highlighted text in yellow.

Linking to text highlights on the web isn’t entirely new; the feature was already supported by some web browsers, most notably Google Chrome and Chromium-based browsers. However, now that the feature is also available in Safari (on all of Apple’s platforms, including macOS), I suspect that it will slowly become hugely popular.
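
For the curious, these highlight links appear to use the same URL text fragment syntax that Chromium browsers already support: the highlighted phrase is percent-encoded after a #:~:text= directive at the end of the URL. A made-up example (the domain and phrase are placeholders, not taken from Apple’s documentation) would look like this:

https://example.com/article#:~:text=the%20sentence%20you%20highlighted

A browser that understands the directive scrolls to the first occurrence of the encoded phrase and highlights it; a browser that doesn’t simply treats everything after the # as an ordinary fragment and ignores it.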

Last but not least in Safari, the app will spawn a Live Activity when a download is in progress, and it now features HTTPS Priority, an aptly named mechanism that will automatically upgrade HTTP URLs to HTTPS when available.

Photos

In iOS 18, the Photos app received a major redesign and a series of new features, including Collections. iOS and iPadOS 18.2 include several tweaks to the new Photos app, most of which revolve around navigation.

In the Recently Viewed collection, you can now clear the full history of recently viewed media. If you long-press on a photo or video, you can also remove it individually from the collection.

Navigating collections is now easier, too. Apple has finally re-added the ability to swipe right to go back pretty much anywhere in the Photos app, including when you’re several levels deep into an album folder. It’s hard to know whether it was a bug or if Apple was being intentional about not implementing the gesture in certain places in the app before, but I’m glad to report that it is now consistently available.

Swiping right to go back now works consistently across the app.

Apple has also improved the video viewing experience in the Photos app. Unlike in previous versions of iOS 18, the app will no longer zoom in and out on a video when you tap the screen. Instead, the video will now always fit the width of the screen, and you can tap the screen to show playback controls, the video scrubber, and the gallery view along the bottom of the screen. These will appear over the video and disappear if you tap the screen again. But that’s not all: you can now scrub videos on a precise frame-by-frame basis, and there’s a new option to disable auto-looping video playback.

The improved video player in Photos now lets you scrub frame by frame, which is indicated by a decimal in the timecode above the scrubber.

Overall, these small changes and improvements have helped reduce the friction I’ve been feeling since I first started using the drastically redesigned Photos app this summer. I’m hoping Apple will keep iterating on it.

And More…

Return of the volume slider on the Lock Screen. This will make a ton of people happy. iOS 18.2 brings back the volume slider on the Lock Screen when music or media is playing, in the form of an Accessibility setting. The slider was removed in iOS 16 when Apple redesigned the Lock Screen, and since then, it has only appeared under certain circumstances, such as when controlling an AirPlay device. But now, you can bring it back permanently.

To add the volume slider back to your Lock Screen, head to Settings → Accessibility → Audio & Visual, and turn on ‘Always Show Volume Control’.

Head to Settings → Accessibility → Audio & Visual to bring back the volume slider on the Lock Screen.

New Camera Control settings for iPhone 16 models. iPhone 16 owners now have the ability to lock the camera’s exposure and focus with a light press on the Camera Control. The AE/AF Lock option can be enabled by going to Settings → Camera → Camera Control.

There is also a new option to adjust the double-click speed of Camera Control. iPhone 16 users can choose between Default, Slow, and Slower. The new adjustment options are present alongside previously available options to tweak the double light-press speed and the light-press force.

New Voice Memos features for iPhone 16 Pro models. Originally announced at Apple’s September event, iPhone 16 Pro models now have access to an upgraded version of the Voice Memos app with support for layered recording. Additionally, these multitrack projects can later be imported into Logic Pro.

Natural language search in Apple Music and Apple TV. In the Music and TV apps, Apple says you can now use natural language to search for media by specifying genres, moods, actors, decades, and more. In my experience so far, natural language search in these apps doesn’t seem to make a huge difference when you’re only using a handful of words, and it starts to break down as soon as you try to input longer phrases.

Shazam history now includes location. If you’ve ever wondered where you heard a specific song, iOS 18.2 has you covered. Now, after you ask Shazam to recognize a song, you can go back to your song history and tap on each one to reveal a minimap pinpointing where you heard it.

A map pin is now included below each of your recently recognized songs.

Favorite categories in Podcasts. In Podcasts, a new Categories section has been added to the Library tab. This essentially gives you access to Apple Podcasts’ catalog of categories. You can now also choose favorite categories, and your favorites will appear at the top of the Categories section in your Library. Additionally, the Search tab in Podcasts now dynamically reorders the category tiles depending on your podcast listening habits.

Favorite categories and the personalized Search tab in the Podcasts app

Better consistency for dark and tinted icons across Settings and the share sheet. App icons in the share sheet and Settings now reflect your light or dark mode preference and even icon tinting if you’ve enabled it on the Home Screen.

Find My now supports sharing AirTags with airlines. Find My has a new option to share an item’s location with an “airline or trusted person” who can help you locate something that you’ve misplaced. In the app, select an item you’re tracking with an AirTag (or other Find My-compatible tracker), and tap ‘Share Item Location’. This will generate a link that you can share with someone else so they can view the location of the lost item.

New ‘Get Current App’ action in Shortcuts. This is huge news for Shortcuts nerds: you can use the new ‘Get Current App’ action to detect which app is currently active on-screen and automate accordingly.

AirPods Pro Hearing Test and Hearing Aid features are available in more countries. The new Hearing Test and Hearing Aid features were recently launched in the U.S. and several other countries. However, these features were unavailable to many AirPods Pro 2 owners worldwide. This was primarily due to pending approval from the relevant authorities in each nation for these health features.

With iOS 18.2, Apple is rolling out the AirPods Pro 2’s Hearing Test feature to the following countries:

  • Cyprus
  • Czechia
  • France
  • Italy
  • Luxembourg
  • Romania
  • Spain
  • United Arab Emirates
  • United Kingdom

Sadly, the Hearing Aid functionality is only coming to one additional country in this release, the United Arab Emirates.


That’s it for iOS and iPadOS 18.2. Not unlike 18.1, this release includes a significant number of non-AI changes and additions, and I’m glad to see that Apple is still iterating on many of its native apps throughout the year.

You can update your device to iOS or iPadOS 18.2 today by navigating to Settings → General → Software Update.


Apple Intelligence in iOS 18.2: A Deep Dive into Working with Siri and ChatGPT, Together

The ChatGPT integration in iOS 18.2.

Apple is releasing iOS and iPadOS 18.2 today, and with those software updates, the company is rolling out the second wave of Apple Intelligence features as part of their previously announced roadmap that will culminate with the arrival of deeper integration between Siri and third-party apps next year.

In today’s release, users will find native integration between Siri and ChatGPT, more options in Writing Tools, a smarter Mail app with automatic message categorization, generative image creation in Image Playground, Genmoji, Visual Intelligence, and more. It’s certainly a more ambitious rollout than the somewhat disjointed debut of Apple Intelligence with iOS 18.1, and one that will garner more attention if only by virtue of Siri’s native access to OpenAI’s ChatGPT.

And yet, despite the long list of AI features in these software updates, I find myself mostly underwhelmed – if not downright annoyed – by the majority of the Apple Intelligence changes, but not for the reasons you may expect coming from me.

Some context is necessary here. As I explained in a recent episode of AppStories, I’ve embarked on a bit of a journey lately in terms of understanding the role of AI products and features in modern software. I’ve been doing a lot of research, testing, and reading about the different flavors of AI tools that we see pop up on almost a daily basis now in a rapidly changing landscape. As I discussed on the show, I’ve landed on two takeaways, at least for now:

  • I’m completely uninterested in generative products that aim to produce images, video, or text to replace human creativity and input. I find products that create fake “art” sloppy, distasteful, and objectively harmful for humankind because they aim to replace the creative process with a thoughtless approximation of what it means to be creative and express one’s feelings, culture, and craft through genuine, meaningful creative work.
  • I’m deeply interested in the idea of assistive and agentic AI as a means to remove busywork from people’s lives and, well, assist people in the creative process. In my opinion, this is where the more intriguing parts of the modern AI industry lie:
    • agents that can perform boring tasks for humans with a higher degree of precision and faster output;
    • coding assistants to put software in the hands of more people and allow programmers to tackle higher-level tasks;
    • RAG-infused assistive tools that can help academics and researchers; and
    • protocols that can map an LLM to external data sources such as Claude’s Model Context Protocol.

I see these tools as a natural evolution of automation and, as you can guess, that has inevitably caught my interest. The implications for the Accessibility community in this field are also something we should keep in mind.

To put it more simply, I think empowering LLMs to be “creative” with the goal of displacing artists is a mistake, and also a distraction – a glossy facade largely amounting to a party trick that gets boring fast and misses the bigger picture of how these AI tools may practically help us in the workplace, healthcare, biology, and other industries.

This is how I approached my tests with Apple Intelligence in iOS and iPadOS 18.2. For the past month, I’ve extensively used Claude to assist me with the making of advanced shortcuts, used ChatGPT’s search feature as a Google replacement, indexed the archive of my iOS reviews with NotebookLM, relied on Zapier’s Copilot to more quickly spin up web automations, and used both Sonnet 3.5 and GPT-4o to rethink my Obsidian templating system and note-taking workflow. I’ve used AI tools for real, meaningful work that revolved around me – the creative person – doing the actual work and letting software assist me. And at the same time, I tried to add Apple’s new AI features to the mix.

Perhaps it’s not “fair” to compare Apple’s newfangled efforts to products by companies that have been iterating on their LLMs and related services for the past five years, but when the biggest tech company in the world makes bold claims about their entrance into the AI space, we have to take them at face value.

It’s been an interesting exercise to see how far behind Apple is compared to OpenAI and Anthropic in terms of the sheer capabilities of their respective assistants; at the same time, I believe Apple has some serious advantages in the long term as the platform owner, with untapped potential for integrating AI more deeply within the OS and apps in a way that other AI companies won’t be able to. There are parts of Apple Intelligence in 18.2 that hint at much bigger things to come in the future that I find exciting, as well as features available today that I’ve found useful and, occasionally, even surprising.

With this context in mind, in this story you won’t see any coverage of Image Playground and Image Wand, which I believe are ridiculously primitive and perfect examples of why Apple may think they’re two years behind their competitors. Image Playground in particular produces “illustrations” that you’d be kind to call abominations; they remind me of the worst Midjourney creations from 2022. Instead, I will focus on the more assistive aspects of AI and share my experience with trying to get work done using Apple Intelligence on my iPhone and iPad alongside its integration with ChatGPT, which is the marquee addition of this release.

Let’s dive in.

ChatGPT Integration: Siri and Writing Tools

Apple Intelligence in iOS and iPadOS 18.2 offers direct integration with OpenAI’s ChatGPT using the GPT-4o model. This is based on a ChatGPT extension that can be enabled in Settings → Apple Intelligence & Siri → Extensions.

Setting up the ChatGPT extension.

The mere existence of an ‘Extensions’ section seems to confirm that Apple may consider offering other LLMs in the future in addition to ChatGPT, but that’s a story for another time. For now, you can only choose to activate the ChatGPT extension (it’s turned off by default), and in doing so, you have two options. You can choose to use ChatGPT as an anonymous, signed-out user. In this case, your IP address will be obscured on OpenAI’s servers, and only the contents of your request will be sent to ChatGPT. According to Apple, while in this mode, OpenAI must process your request and discard it afterwards; furthermore, the request won’t be used to improve or train OpenAI’s models.

You can also choose to log in with an existing ChatGPT account directly from the Settings app. When logged in, OpenAI’s data retention policies will apply, and your requests may be used for training of the company’s models. Furthermore, your conversations with Siri that involve ChatGPT processing will be saved in your OpenAI account, and you’ll be able to see your previous Siri requests in ChatGPT’s conversation sidebar in the ChatGPT app and website.

The onboarding flow for ChatGPT.

You have the option to use ChatGPT for free or with your paid ChatGPT Plus account. In the ChatGPT section of the Settings app, Apple shows the limits that are in place for free users and offers an option to upgrade to a Plus account directly from Settings. According to Apple, only a small number of requests that use the latest GPT-4o and DALL-E 3 models can be processed for free before having to upgrade. For this article, I used my existing ChatGPT Plus account, so I didn’t run into any limits.

The ChatGPT login flow in Settings.

But how does Siri actually determine if ChatGPT should swoop in and answer a question on its behalf? There are more interesting caveats and implementation details worth covering here.

By default, Siri tries to determine if any regular request may be best answered by ChatGPT rather than Siri itself. In my experience, this usually means that more complicated questions or those that pertain to “world knowledge” outside of Siri’s domain get handed off to ChatGPT and are subsequently displayed by Siri with its new “snippet” response style in iOS 18 that looks like a taller notification banner.

A response from ChatGPT displayed in the new Siri UI.

For instance, if I ask “What’s the capital of Italy?”, Siri can respond with a rich snippet that includes its own answer accompanied by a picture. However, if I ask “What’s the capital of Italy, and has it always been the capital of Italy?”, the additional information required causes Siri to automatically fall back to ChatGPT, which provides a textual response.

Basic questions (left) can still be answered by Siri itself; ask for more details, however, and ChatGPT comes in.

Siri knows its limits; effectively, ChatGPT has replaced the “I found this on the web” results that Siri used to bring up before it had access to OpenAI’s knowledge. In the absence of a proper Siri LLM (more on this later), I believe this is a better compromise than the older method that involved Google search results. At the very least, now you’re getting an answer instead of a bunch of links.

You can also format your request to explicitly ask Siri to query ChatGPT. Starting your request with “Ask ChatGPT…” is a foolproof technique to go directly to ChatGPT, and you should use it any time you’re sure Siri won’t be able to answer immediately.

I should also note that, by default, Siri in iOS 18.2 will always confirm with you whether you want to send a request to ChatGPT. There is, however, a way to turn off these confirmation prompts: on the ChatGPT Extension screen in Settings, turn off the ‘Confirm ChatGPT Requests’ option, and you’ll no longer be asked if you want to pass a request to ChatGPT every time. Keep in mind, though, that this preference is ignored when you’re sending files to ChatGPT for analysis, in which case you’ll always be asked to confirm your request since those files may contain sensitive information.

By default, you’ll be asked to confirm if you want to use ChatGPT to answer questions. You can turn this off.

The other area of iOS and iPadOS that is receiving ChatGPT integration today is Writing Tools, which debuted in iOS 18.1 as an Apple Intelligence-only feature. As we know, Writing Tools are now prominently featured system-wide in any text field thanks to their placement in the edit menu, and they’re also available directly in the top toolbar of the Notes app.

The updated Writing Tools in iPadOS 18.2.

In iOS 18.2, Writing Tools gain the ability to refine text by letting you describe changes you want made, and they also come with a new ‘Compose’ submenu powered by ChatGPT, which lets you ask OpenAI’s assistant to write something for you based on the content of the document you’re working on.

If the difference between the two sounds confusing, you’re not alone. Here’s how you can think about it, though: the ‘Describe your change’ text field at the top of Writing Tools defaults to asking Apple Intelligence, but may fall back to ChatGPT if Apple Intelligence doesn’t know what you mean; the Compose menu always uses ChatGPT. It’s essentially just like Siri, which tries to answer on its own, but may rely on ChatGPT and also includes a manual override to skip Apple Intelligence altogether.

The ability to describe changes is a more freeform way to rewrite text beyond the three default buttons available in Writing Tools for Friendly, Professional, and Concise tones.

With Compose, you can use the contents of a note as a jumping-off point to add any other content you want via ChatGPT.

You can also refine results in the Compose screen with follow-up questions while retaining the context of the current document. In this case, ChatGPT composed a list of more games similar to Wind Waker, which was the main topic of the note.

In testing the updated Writing Tools with ChatGPT integration, I’ve run into some limitations that I will cover below, but I also had two very positive experiences with the Notes app that I want to mention here since they should give you an idea of what’s possible.

In my first test, I was working with a note that contained a list of payments for my work at MacStories and Relay FM, plus the amount of taxes I was setting aside each month. The note originated in Obsidian, and after I pasted it into Apple Notes, it lost all its formatting.

There were no proper section headings, the formatting was inconsistent between paragraphs, and the monetary amounts had been entered with different currency symbols for EUR. I wanted to make the note look prettier with consistent formatting, so I opened the ‘Compose’ field of Writing Tools and sent ChatGPT the following request:

This is a document that describes payments I sent to myself each month from two sources: Relay FM and MacStories. The currency is always EUR. When I mention “set aside”, it means I set aside a percentage of those combined payments for tax purposes. Can you reformat this note in a way that makes more sense?

Where I started.

I hit Return, and after a few seconds, ChatGPT reworked my text with a consistent structure organized into sections with bullet points and proper currency formatting. I was immediately impressed, so I accepted the suggested result, and I ended up with the same note, elegantly formatted just like I asked.

And the formatted result, composed by ChatGPT.

This shouldn’t come as a surprise: ChatGPT – especially the GPT-4o model – is pretty good at working with numbers. Still, this is the sort of use case that makes me optimistic about this flavor of AI integration; I could have done this by carefully selecting text and making each line consistent by hand, but it would have been boring busywork that wasted a bunch of my time. And that’s time that is, frankly, best spent doing research, writing, or promoting my work on social media. Instead, Writing Tools and ChatGPT worked with my data, following a natural language query, and modified the contents of my note in seconds. Even better, after the note had been updated, I was able to ask for additional information, including averages, totals for each revenue source, and more. I could have done this in a spreadsheet, but I didn’t want to (and I never understood formulas anyway), and it was easier to do with natural language in a popup menu of the Notes app.

Fun detail: here’s how a request initiated from the Notes app gets synced to your ChatGPT account. Note the prompt and surroundingText keys of the JSON object the Notes app sends to ChatGPT.
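
Going by what shows up in the synced ChatGPT conversation, the request appears to be a small JSON object. The sketch below is only my approximation of its shape: the two key names come from the screenshot, while the values are stand-ins (the prompt is the one from my payments note, and the surroundingText value is an invented placeholder for the note’s contents):

{
  "prompt": "Can you reformat this note in a way that makes more sense?",
  "surroundingText": "…the full text of the note being edited…"
}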

The second example of ChatGPT and Writing Tools applied to regular MacStories work involves our annual MacStories Selects awards. Before getting together with the MacStories team on a Zoom call to discuss our nominees and pick winners, we created a shared note in Apple Notes where different writers entered their picks. When I opened the note, I realized that I was behind the others and had forgotten to enter the different categories of awards in my section of the document. So I invoked ChatGPT’s Compose menu under a section heading with my name and asked:

Can you add a section with the names of the same categories that John used? Just the names of those categories.

My initial request.

A few seconds later, Writing Tools pasted the requested section below my name.

This may seem like a trivial task, but I don’t think it is. ChatGPT had to evaluate a long list of sections (all formatted differently from one another), understand where the sections entered by John started and ended, and extract the names of categories, separating them from the actual picks under each category. Years ago, I would have had to do a lot of copying and pasting, type it all out manually, or write a shortcut with regular expressions to automate this process. Now, the “automation” takes place as a natural language command that has access to the contents of a note and can reformat it accordingly.

As we’ll see below, there are plenty of scenarios in which Writing Tools, despite the assistance from ChatGPT, fails at properly integrating with the Notes app and understanding some of the finer details behind my requests. But given that this is the beginning of a new way to think about working with text in any text field (third-party developers can integrate with Writing Tools), I’m excited about the prospect of abstracting app functionalities and formatting my documents in a faster, more natural way.
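
As a side note for developers, opting a text view into Writing Tools is a small change. Here’s a minimal Swift sketch, assuming the UIKit additions Apple shipped for Writing Tools in the iOS 18 SDK (the writingToolsBehavior property and the UIWritingToolsBehavior options); treat it as an illustration and check the current documentation for the exact API surface:

import UIKit

final class EditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        // Opt this text view into the full Writing Tools experience.
        // .limited keeps the panel-based UI only; .none opts out entirely.
        if #available(iOS 18.0, *) {
            textView.writingToolsBehavior = .complete
        }
        view.addSubview(textView)
    }
}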

The Limitations – and Occasional Surprises – of Siri’s Integration with ChatGPT

Having used ChatGPT extensively via its official app on my iPhone and iPad for the past month, one thing is clear to me: Apple has a long way to go if they want to match what’s possible with the standalone ChatGPT experience in their own Siri integration – not to mention with Siri itself without the help from ChatGPT.

The elephant in the room here is the lack of a single, self-contained Siri LLM experience in the form of an app that can remember all of your conversations and keep the context of an ongoing conversation across multiple sessions. Today, Apple’s efforts to infuse Siri with more “Apple Intelligence” result in a scattershot implementation made up of disposable interactions that forgo the true benefits of LLMs and lack a cohesive vision. It’s quite telling that the best part of the “new” Siri experience is the ChatGPT integration in 18.2, and even then, it’s no replacement for the full-featured ChatGPT app.

With ChatGPT on my iPhone and iPad, all my conversations and their full transcripts are saved and made accessible for later. I can revisit a conversation about any topic I’m researching with ChatGPT days after I started it and pick up exactly where I left off. Even while I’m having a conversation with ChatGPT, I can look further up in the transcript and see what was said before I continue asking anything else. The whole point of modern LLMs is to facilitate this new kind of computer-human conversation where the entire context can be referenced, expanded upon, and queried.

Siri still doesn’t have any of this – and that’s because it really isn’t based on an LLM yet.1 While Siri can hold some context of a conversation while traversing from question to question, it can’t understand longer requests written in natural language that reference a particular point of an earlier request. It doesn’t show you the earlier transcript, whether you’re talking or typing to it. By and large, conversations in Siri are still ephemeral. You ask a question, get a response, and can ask a follow-up question (but not always); as soon as Siri is dismissed, though, the entire conversation is gone.

As a result, the ChatGPT integration in iOS 18.2 doesn’t mean that Siri can now be used for production workflows where you want to hold an ongoing conversation about a topic or task and reference it later. ChatGPT is the shoulder for Siri to temporarily cry on; it’s the guardian parent that can answer basic questions in a better way than before while ultimately still exposing the disposable, inconsistent, impermanent Siri that is far removed from the modern experience of real LLMs.

Do not expect the same chatbot experience as Claude (left) or ChatGPT (right) with the new ChatGPT integration in Siri.

Or, taken to a bleeding-edge extreme, do not expect the kind of long conversations with context recall and advanced reasoning you can get with ChatGPT’s most recent models in the updated Siri for iOS 18.2.

But let’s disregard for a second the fact that Apple doesn’t have a Siri LLM experience comparable to ChatGPT or Claude yet, assume that’s going to happen at some point in 2026, and remain optimistic about Siri’s future. I still believe that Apple isn’t taking advantage of ChatGPT enough and could do so much more to make iOS 18 seem “smarter” than it actually is while relying on someone else’s intelligence.

Unlike other AI companies, Apple has a moat: they make the physical devices we use, create the operating systems, and control the app ecosystem. Thus, Apple has an opportunity to leverage deep, system-level integrations between AI and the apps billions of people use every day. This is the most exciting aspect of Apple Intelligence; it’s a bummer that, despite the help from ChatGPT, I’ve only seen a handful of instances in which AI results can be used in conjunction with apps. Let me give you some examples and comparisons between ChatGPT and Siri to show you what I mean.

In addition to text requests, ChatGPT has been integrated with image and file uploads across iOS and iPadOS. For example, if you have a long PDF document you want to summarize, you can ask Siri to give you a summary of it, and the assistant will display a file upload popup that says the item will be sent to ChatGPT for analysis.

Sending a PDF to ChatGPT for analysis and summarization.

In this popup, you can choose the type of file representation you want to send: you can upload a screenshot of a document to ChatGPT directly from Siri, or you can give it the contents of the entire document. This technique isn’t limited to documents, nor is it exclusive to the style of request I mentioned above. Any time you invoke Siri while looking at a photo, webpage, email message, or screenshot, you can make requests like…

  • “What am I looking at here?”
  • “What does this say?”
  • “Take a look at this and give me actionable items.”

…and ChatGPT will be summoned – even without explicitly saying, “Ask ChatGPT…” – with the file upload permission prompt. As of iOS and iPadOS 18.2, you can always choose between sending a copy of the full content of an item (usually as a PDF) or a screenshot of just what’s shown on-screen.

In any case, after a few seconds, ChatGPT will provide a response based on the file you gave it, and this is where things get interesting – in both surprising and disappointing ways.

You can also ask follow-up questions after the initial file upload, but you can’t scroll back to see previous responses.

By default, you’ll find a copy button in the notification with the ChatGPT response, so that’s nice. Between the Side button, Type to Siri (which also got a Control Center control in 18.2), and the copy button next to responses, the iPhone now has the fastest way to go from a spoken/typed request to a ChatGPT response copied to the clipboard.

But what if you want to do more with a response? In iOS and iPadOS 18.2, you can follow up to a ChatGPT response with, “Make a note out of this”, and the response will be saved as a new note in the Notes app with a nice UI shown in the Siri notification.

Saving a ChatGPT response in Siri as a new note.

This surprised me, and it’s the sort of integration that makes me hopeful about the future role of an LLM on Apple platforms – a system that can support complex conversations while also sending off responses into native apps.

Sadly, this is about as far as Apple’s integration between ChatGPT and apps went for this release. Everything else that I tried did not work, in the sense that Siri either didn’t understand what I was asking for or ChatGPT replied that it didn’t have enough access to my device to perform that action.

Specifically:

  • If instead of, “Make a note”, I asked to, “Append this response to my note called [Note Title]”, Siri didn’t understand me, and ChatGPT said it couldn’t do it.
  • When I asked ChatGPT to analyze the contents of my clipboard, it said it couldn’t access it.
  • When I asked to, “Use this as input for my [shortcut name] shortcut”, ChatGPT said it couldn’t run shortcuts.

Why is it that Apple is making a special exception for creating notes out of responses, but nothing else works? Is this the sort of thing that will magically get better once Apple Intelligence gets connected to App Intents? It’s hard to tell right now.

The lackluster integration between ChatGPT and native system functions goes beyond Siri responses and extends to Writing Tools. When I attempted to go even slightly beyond the guardrails of the Compose feature, things got weird:

  • Remember the Payments note I was so impressed with? When I asked ChatGPT in the Compose field to, “Make a table out of this”, it did generate a result…as a plain text list without the proper formatting for a native table in the Notes app.
  • When I asked ChatGPT to, “Turn this selected Markdown into rich text”, it performed the conversion correctly – except that Notes pasted the result as raw HTML in the body of the note.
  • ChatGPT can enter and reformat headings inside a note, but they’re in a different format than the Notes app’s native ‘Heading’ style. I have no idea where that formatting style is coming from.

When I asked Apple Intelligence to convert Markdown to rich text, it asked me to do it with ChatGPT instead.

But when I asked ChatGPT, it composed raw HTML.

Clearly, Apple has some work to do if they want to match user requests with the native styling and objects supported by the Notes app. But that’s not the only area where I’ve noticed a disparity between Siri and ChatGPT’s capabilities, resulting in a strange mix of interactions when the two are combined.

One of my favorite features of ChatGPT’s website and app is the ability to store bits of data in a personal memory that can be recalled at any time. Memories can be used to provide further context to the LLM in future requests as well as to jot down something that you want to remember later. Alas, ChatGPT accessed via Siri can’t retrieve the user’s personal memories, despite the ability to log into your ChatGPT account and save conversations you have with Siri. When asked to access my memory, ChatGPT via Siri responds as such:

I’m here to assist you by responding to your questions and requests, but I don’t have the ability to access any memory or personal data. I operate only within the context of our current conversation.

That’s too bad, and it only underscores the fact that Apple is limited to an à la carte assistant that doesn’t really behave like an LLM (because it can’t).

The most ironic part of the Siri-ChatGPT relationship, however, is that Siri is not multilingual, but ChatGPT is, so you can use OpenAI’s assistant to fill a massive hole in Siri’s functionality via some clever prompting.

My Siri is set to English, but if I ask it in Italian, “Chiedi a ChatGPT” (“Ask ChatGPT”), followed by an Italian request, “Siri” will respond in Italian since ChatGPT – in addition to different modalities – also supports hopping between languages in the same conversation. Even if I take an Italian PDF document and tell Siri in English to, “Ask ChatGPT to summarize this in its original language”, that’s going to work.

On its own, Siri is not bilingual…


…but with ChatGPT, it can be.

Speaking as a bilingual person, this is terrific – but at the same time, it underlines how deeply ChatGPT puts Siri to shame when it comes to being more accessible for international users. What’s even funnier is that Siri tries to tell me I’m wrong when I’m typing in Italian in its English text field (and that’s in spite of the new bilingual keyboard in iOS 18), but when the request is sent off to ChatGPT, it doesn’t care.

I want to wrap up this section with an example of what I mean by assistive AI with regard to productivity and why I now believe so strongly in the potential to connect LLMs with apps.

I’ve been trying Todoist again lately, and I discovered the existence of a TodoistGPT extension for ChatGPT that lets you interact with the task manager using ChatGPT’s natural language processing. So I had an idea: what if I took a screenshot of a list in the Reminders app and asked ChatGPT to identify the tasks in it and recreate them with the same properties in Todoist?

I asked:

 This is a screenshot of a work project in the Reminders app. Can you identify the two remaining tasks in it, along with their due dates and, if applicable, repeat patterns?

ChatGPT identified them correctly, parsing the necessary fields for title, due date, and repeat pattern. I then followed up by asking:

Can you add these to my Work Review project?

And, sure enough, the tasks found in the image were recreated as new tasks in my Todoist account.

In ChatGPT, I was able to use its vision capabilities to extract tasks from a screenshot, then invoke a custom GPT to recreate them with the same properties in Todoist.

The tasks in Todoist.

Right now, Siri can’t do this. Even though the ChatGPT integration can recognize the same tasks, asking Siri a follow-up question to add those tasks to Reminders in a different list will fail.

Meanwhile, ChatGPT can perform the same image analysis via Siri, but the resulting text is not actionable at all.

Think about this idea for a second: in theory, the web-based integration I just described is similar to the scenario Apple is proposing with App Intents and third-party apps in Apple Intelligence. Apple has the unique opportunity to leverage the millions of apps on the App Store – and the multiple thousands that will roll out App Intents in the short term – to quickly spin up an ecosystem of third-party integrations for Apple Intelligence via the apps people already use on their phones.

How will that work without a proper Siri LLM? How flexible will the app domains supported at launch be in practice? It’s hard to tell now, but it’s also the field of Apple Intelligence that – unlike gross and grotesque image generation features – has my attention.
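
To make that comparison concrete, the plumbing Apple is betting on already exists: App Intents. The hypothetical Swift sketch below shows how a task manager could expose an ‘add task’ action that Shortcuts and, eventually, Apple Intelligence could call; the intent and parameter names are invented for illustration and don’t come from any real app:

import AppIntents

struct AddTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Task"
    static var description = IntentDescription("Creates a new task in the chosen project.")

    @Parameter(title: "Title")
    var taskTitle: String

    @Parameter(title: "Project")
    var project: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would hand these values to its model layer here.
        return .result(dialog: "Added “\(taskTitle)” to \(project).")
    }
}

The open question is whether a future Siri LLM will be able to chain actions like this onto the end of a ChatGPT response, the way the ‘Make a note out of this’ example already does for Notes.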

Visual Intelligence

The other area of iOS that now features ChatGPT integration is Visual Intelligence. Originally announced in September, Visual Intelligence is a new Camera Control mode and, as such, exclusive to the new iPhone 16 family of devices.

The new Visual Intelligence camera mode of iOS 18.2.

With Visual Intelligence, you can point your iPhone’s camera at something and get information about what’s in frame from either ChatGPT or Google search – the first case of two search providers embedded within the same Apple Intelligence functionality of iOS. Visual Intelligence is not a real-time camera view that can overlay information on top of a live camera feed; instead, it freezes the frame and sends a picture to ChatGPT or Google, without saving that image to your photo library.

The interactions of Visual Intelligence are fascinating, and an area where I think Apple did a good job picking a series of reasonable defaults. You activate Visual Intelligence by long-pressing on Camera Control, which reveals a new animation that combines the glow effect of the new Siri with the faux depressed button state first seen with the Action and volume buttons in iOS 18. It looks really nice. After you hold down for a second, you’ll feel some haptic feedback, and the camera view of Visual Intelligence will open in the foreground.

The Visual Intelligence animation.

Once you’re in camera mode, you have two options: you either manually press the shutter button to freeze the frame then choose between ChatGPT and Google, or you press one of those search providers first, and the frame will be frozen automatically.

Google search results in Visual Intelligence.

Google is the easier integration to explain here. It’s basically reverse image search built into the iPhone’s camera and globally available via Camera Control. I can’t tell you how many times my girlfriend and I rely on Google Lens to look up outfits we see on TV, furniture we see in magazines, or bottles of wine, so having this built into iOS without having to use Google’s iPhone app is extra nice. Results appear in a popup inside Visual Intelligence, and you can pick one to open it in Safari. As far as integrating Google’s reverse image search with the operating system goes, Apple has pretty much nailed the interaction here.

ChatGPT has been equally well integrated with the Visual Intelligence experience. By default, when you press the ‘Ask’ button, ChatGPT will instantly analyze the picture and describe what you’re looking at, so you have a starting point for the conversation. The whole point of this feature, in fact, is to be able to inquire about additional details or use the picture as visual context for a request you have.

My NPC co-hosts still don’t know anything about this new handheld, and ChatGPT’s response is correct.

You can also ask follow-up questions to ChatGPT in Visual Intelligence.

I’ll give you an example. A few days ago, Silvia and I noticed that the heated towel rail in our bathroom was making a low hissing noise. There were clearly valves we were supposed to operate to let air out of the system, but I wanted to be sure because I’m not a plumber. So I invoked Visual Intelligence, took a picture, and asked ChatGPT – in Italian – how I was supposed to let the air out. Within seconds, I got the confirmation I was looking for: I needed to turn the valve in the upper left corner.

This was useful.

I can think of plenty of other scenarios in everyday life where the ability to ask questions about what I’m looking at may be useful. Whether you’re looking up instructions to operate different types of equipment, dealing with recipes, learning more about landmarks, or translating signs and menus in a different country, there are clear, tangible benefits when it comes to augmenting vision with the conversational knowledge of an LLM.

By default, ChatGPT doesn’t have access to web search in Visual Intelligence. If you want to continue a request by looking up web results, you’ll have to use the ChatGPT app.

Right now, all Apple Intelligence queries to ChatGPT are routed to the GPT-4o model; I can imagine that, with the o1 model now supporting image uploads, Apple may soon offer the option to enable slower but more accurate visual responses powered by advanced reasoning. In my tests, GPT-4o has been good enough to address the things I was showing it via Visual Intelligence. It’s a feature I plan to use often – certainly more than the other (confusing) options of Camera Control.

The Future of a Siri LLM

Sure, Siri.

Looking ahead at the next year, it seems clear that Apple will continue taking a staged approach to evolving Apple Intelligence in their bid to catch up with OpenAI, Anthropic, Google, and Meta.

Within the iOS 18 cycle, we’ll see Siri expand its on-screen vision capabilities and gain the ability to draw on users’ personal context; then, Apple Intelligence will be integrated with commands from third-party apps based on schemas and App Intents. According to rumors, this will culminate with a second-generation Siri LLM featuring a more ChatGPT-like assistant capable of holding longer conversations and perhaps storing them for future access in a standalone app. We can speculatively assume that this Siri LLM will be showcased at WWDC 2025 and released in the spring of 2026.

Taking all this into account, it’s evident that, as things stand today, Apple is two years behind their competitors in the AI chatbot space. Training large language models is a time-consuming, expensive task that is ballooning in cost and, according to some, leading to diminishing returns as a byproduct of scaling laws.

Today, Apple is stuck between the proverbial rock and a hard place. ChatGPT is the fastest-growing software product in modern history, Meta’s bet on open-source AI is resulting in an explosion of models that can be trained and integrated into hardware accessories, agents, and apps with a low barrier to entry, and Google – facing an existential threat to search at the hands of LLM-powered web search – is going all-in on AI features for Android and Pixel phones. Like it or not, the vast majority of consumers now expect AI features on their devices; whether Apple was caught flat-footed here or not, the company today simply doesn’t have the technology to offer an experience, entirely powered by Siri, that is comparable to ChatGPT, Llama-based models, Claude, or Gemini.

So, for now, Apple is following the classic “if you can’t beat them, join them” playbook. ChatGPT and other chatbots will supplement Siri with additional knowledge; meanwhile, Apple will continue to release specialized models optimized for specific iOS features, such as Image Wand in Notes, Clean Up in Photos, summarization in Writing Tools, inbox categorization in Mail, and so forth.

All this raises a couple of questions. Will Apple’s piecemeal AI strategy be effective in slowing down the narrative that they are behind other companies, showing their customers that iPhones are, in fact, powered by AI? And if Apple won’t have a Siri LLM until 2026, where will ChatGPT and the rest of the industry be by then?

Given the pace of AI tools’ evolution in 2024 alone, it’s easy to look at Apple’s position and think that, no matter their efforts and the amount of capital thrown at the problem, they’re doomed. And this is where – despite my belief that Apple is indeed at least two years behind – I disagree with this notion.

You see, there’s another question that needs to be asked: will OpenAI, Anthropic, or Meta have a mobile operating system or lineup of computers with different form factors in two years? I don’t think they will, and that buys Apple some time to catch up.

In the business and enterprise space, it’s likely that OpenAI, Microsoft, and Google will become more and more entrenched between now and 2026 as corporations begin gravitating toward agentic AI and rethink their software tooling around AI. But modern Apple has never been an enterprise-focused company. Apple is focused on personal technology and selling computers of different sizes and forms to, well, people. And I’m willing to bet that, two years from now, people will still want to go to a store and buy themselves a nice laptop or phone.

Despite their slow progress, this is Apple’s moat. The company’s real opportunity in the AI space shouldn’t be to merely match the features and performance of chatbots; their unique advantage is the ability to rethink the operating systems of the computers we use around AI.

Don’t be fooled by the gaudy, archaic, and tone-deaf distractions of Image Playground and Image Wand. Apple’s true opening is in the potential of breaking free from the chatbot UI, building an assistive AI that works alongside us and the apps we use every day to make us more productive, more connected, and, as always, more creative.

That’s the artificial intelligence I hope Apple is building. And that’s the future I’d like to cover on MacStories.


  1. Apple does have some foundation models in iOS 18, but in the company’s own words, “The foundation models built into Apple Intelligence have been fine-tuned for user experiences such as writing and refining text, prioritizing and summarizing notifications, creating playful images for conversations with family and friends, and taking in-app actions to simplify interactions across apps.” ↩︎

Apple Reveals A Partial Timeline for the Rollout of More Apple Intelligence Features

Last week, Apple released the first developer betas of iOS 18.2, iPadOS 18.2, and macOS 15.2, which the press speculated would be out by the end of the year. It turns out that was a good call because today, Apple confirmed that timing. In its press release about the Apple Intelligence features released today, Apple revealed that the next round is coming in December and will include the following:

  • Users will be able to describe changes they want made to text using Writing Tools. For example, you can have text rewritten with a certain tone or in the form of a poem.
  • ChatGPT will be available in Writing Tools and when using Siri.
  • Image Playground will allow users to create images with Apple’s generative AI model.
  • Users will be able to use prompts to create Genmoji, custom emoji-style images that can be sent to friends in iMessage and used as stickers.
  • Visual intelligence will be available via the Camera Control on the iPhone 16 and iPhone 16 Pro. The feature will allow users to point the iPhone’s camera at something and learn about it from Google or ChatGPT. Apple also mentions that visual intelligence will work with other unspecified “third-party tools.”
  • Apple Intelligence will be available in localized English in Australia, Canada, Ireland, New Zealand, South Africa, and the U.K.

Apple’s press release also explains when other languages are coming:

…in April, a software update will deliver expanded language support, with more coming throughout the year. Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, Vietnamese, and other languages will be supported.

And Apple’s Newsroom in Ireland offers information on the Apple Intelligence rollout in the EU:

Mac users in the EU can access Apple Intelligence in U.S. English with macOS Sequoia 15.1. This April, Apple Intelligence features will start to roll out to iPhone and iPad users in the EU. This will include many of the core features of Apple Intelligence, including Writing Tools, Genmoji, a redesigned Siri with richer language understanding, ChatGPT integration, and more.

It’s a shame it’s going to be another six months before EU customers can take advantage of Apple Intelligence features on their iPhones and iPads, but it’s nonetheless good to hear when it will happen.

It’s also worth noting that the timing of other pieces of Apple Intelligence is unclear. There is still no word on precisely when Siri will gain knowledge of your personal context or perform actions in apps on your behalf, for instance. Even so, today’s reveal is more than Apple usually shares, which is both nice and a sign of the importance the company places on these features.


You Can Use Clean Up with a Clear Conscience https://www.macstories.net/linked/you-can-use-clean-up-with-a-clear-conscience/ Mon, 28 Oct 2024 16:17:37 +0000 https://www.macstories.net/?p=77064

I enjoyed this take on Apple Intelligence’s Clean Up feature by Joe Rosensteel, writing for Six Colors last week:

The photographs you take are not courtroom evidence. They’re not historical documents. Well, they could be, but mostly they’re images to remember a moment or share that moment with other people. If someone rear-ended your car and you’re taking photos for the insurance company, then that is not the time to use Clean Up to get rid of people in the background, of course. Use common sense.

Clean Up is a fairly conservative photo editing tool in comparison to what other companies offer. Sometimes, people like to apply a uniform narrative that Silicon Valley companies are all destroying reality equally in the quest for AI dominance, but that just doesn’t suit this tool that lets you remove some distractions from your image.

It’s easy to get swept up in the “But what is a photo” philosophical debate (which I think raises a lot of interesting points), but I agree with Joe: we should also keep in mind that, sometimes, we’re just removing that random tourist from the background and our edit isn’t going to change the course of humankind’s history.

Also worth remembering:

For some reason, even the most literal of literal people is fine with composing a shot to not include things. To even (gasp!) crop things out of photos. You can absolutely change meaning and context just as much through framing and cropping as you can with a tool like Clean Up. No one is suggesting that the crop tool be removed or that we should only be allowed to take the widest wide-angle photographs possible to include all context at all times, like security camera footage.

→ Source: sixcolors.com

iOS and iPadOS 18.1: Everything New Besides Apple Intelligence https://www.macstories.net/reviews/ios-and-ipados-18-1-everything-new-besides-apple-intelligence/ Mon, 28 Oct 2024 15:04:07 +0000 https://www.macstories.net/?p=77016

Today, Apple released iOS and iPadOS 18.1, the first major release since the operating system versions that launched in September and were reviewed by Federico.

As you may know, the main highlight of this new release is the first wave of Apple Intelligence features available to the public. AI has arrived, and for better or for worse for Apple’s platforms, this is only the beginning. Be sure to check out John’s review of all the new Apple Intelligence features included in iOS and iPadOS 18.1 (as well as macOS Sequoia 15.1) for the details.

Fortunately, Apple Intelligence isn’t the only highlight of this release. It also includes a series of changes to the system, from Control Center and the Camera app to Shortcuts and the arrival of new health features for AirPods Pro 2 users.

Here’s a roundup of everything new besides Apple Intelligence in iOS and iPadOS 18.1.

Control Center

Control Center received a major overhaul in iOS and iPadOS 18. It is now fully customizable, can span across multiple pages, and supports custom controls provided by third-party apps installed on your device.

With iOS and iPadOS 18.1, Apple has enriched Control Center with a handful of new built-in connectivity controls that can be individually added from the gallery and placed anywhere in your custom Control Center layout. These new standalone controls exist alongside the previously available ‘Connectivity’ tile, which bundles together all of the main connectivity controls.

These are all the standalone connectivity controls available in iOS 18.1:

  • Wi-Fi
  • Cellular Data
  • Bluetooth
  • VPN
  • Personal Hotspot
  • AirDrop
  • Airplane Mode
  • Satellite

In addition to these new connectivity controls, you’ll also find two other new controls: ‘Measure’ and ‘Level’. These allow you to quickly open the Measure app directly in either of its main tabs.

Now, if you’ve always found Control Center to be hard to reach with one hand, especially on any of the taller iPhone models, you’ll be glad to hear that iOS 18.1 comes with a new Shortcuts action to open, close, or toggle Control Center. In practice, this means that it is now possible to reassign one of your two Lock Screen controls to open Control Center without having to form a claw with your hand to reach the top of the screen.

All you need to do is create a new shortcut with the new ‘Show Control Center’ action and customize one of the two controls on your Lock Screen to trigger that shortcut. Alternatively, you can assign your shortcut to the Action button or place it anywhere in the bottom half of your Home Screen for easy access.

Finally, Apple has added an easy way to reset Control Center to its default layout. This is useful whether you’re looking to start customizing your Control Center from scratch or you’re simply content with Apple’s default layout. To reset Control Center, head to Settings → Control Center and tap ‘Reset Control Center’.

Camera

The Camera app has been updated in iOS 18.1 to feature a new dedicated ‘Spatial’ mode. If you own an iPhone 15 Pro, 16, or 16 Pro, this mode will allow you to shoot spatial photos and videos thanks to the vertically-oriented camera sensors on those devices. When in this mode, to the right of the shutter button, you will find a two-way switch for toggling between capturing photos and videos. Previously, shooting spatial photos and videos was available via a toggle within the Camera app’s regular Photo and Video tabs.

Additionally, if you own an iPhone 16 or 16 Pro, you can now use the Camera Control to switch between the front- and rear-facing cameras with a couple of swipes and light presses.

Hearing Health Features for the AirPods Pro 2

Hearing Test results in the Health app in iOS 18.1. Source: Apple.

Last month at its September 2024 event, Apple unveiled a series of major new health features coming to the AirPods Pro 2. These include the ability for users to take a hearing test, use the AirPods as over-the-counter hearing aids, and benefit from a new on-by-default hearing protection mode that will automatically lower loud environmental noise across all listening modes (Transparency, Adaptive Audio, and Noise Cancellation).

These new hearing health features will start rolling out today with the release of iOS 18.1 and an accompanying firmware update for the AirPods Pro 2 (version 7B19 or later), which should automatically install on users’ earbuds over the coming days. While there is no way to manually update your AirPods, you can increase your chances by keeping your AirPods Pro 2 fully charged and in their charging case overnight. Once your AirPods Pro 2 are properly updated – and once your region becomes eligible – head to the Health app on your iPhone to start your hearing test.

Apple has detailed on a new support page the worldwide availability of each of the new hearing health features for the AirPods Pro 2.

And More…

You’ll find a handful of other changes in iOS and iPadOS 18.1 as well:

Change the primary email address associated with your Apple Account. As most trans people know, changing your name online is particularly difficult. Name changes often require that you create an entirely new account when they aren’t supported by a service provider. While Apple already lets you update your first and last name on your Apple Account pretty easily, until today it was not straightforward to change the primary email associated with your account. This was a problem because the primary email address is not only used to sign in and recover your account, but it is also visible to people you collaborate and share documents with using iCloud. So if the email address you used to create your account contained your former name, you were out of luck.

Fortunately, in iOS and iPadOS 18.1, you can now easily set any of the email addresses associated with your Apple Account as the primary email to be used by Apple. To add an email address and choose a new primary email for your Apple Account, head to Settings, tap your name at the top of the screen, and choose ‘Sign-In & Security’.

In iOS 18.1, you can set any email address associated with your Apple Account as the primary email to be used by Apple.

Two new names for the Up Next queue in the Apple TV app. Starting with iOS, iPadOS, and tvOS 18.1, what used to be known as the Up Next queue in the Apple TV app is now called ‘Continue Watching’. It is still positioned below the carousel at the top of the Home tab, and it still functions essentially in the same way, letting you jump back into an unfinished movie, start watching a newly released episode, or play a movie or TV show that you have manually saved to the list. Consequently, the option to manually add a piece of media to the list has also been renamed to ‘Add to Watchlist’.

There are now effectively two terms inside the TV app that both refer to the same area in the app’s Home tab. I haven’t spent much time with this change yet, but I do wish Apple had chosen to add a dedicated Watchlist tab to the app instead — and I’m worried that this change in terminology will only add more complexity and confusion to a rather rapidly evolving app.

In the TV app’s Home tab, the Up Next queue is now known as ‘Continue Watching’. To manually add a movie or TV show to this queue, select ‘Add to Watchlist’.

Updated design for the emoji keyboard. The emoji keyboard has been updated to pave the way for the upcoming Genmoji feature in iOS 18.2. The emoji grid is now larger, Memoji and custom stickers are included, and the navigation strip for jumping to a specific emoji category now features updated glyphs that are more in line with the design of the emoji contained in the corresponding category. If you’re interested, Emojipedia founder Jeremy Burge went into detail on Threads about these design changes to the emoji keyboard.

In iOS 18.1, the emoji keyboard now features a larger grid, sections for Memoji and custom stickers, and redesigned glyphs.

Notification badges on the Lock Screen. Grouped notification banners on the Lock Screen are now adorned with an icon badge corresponding to the number of notifications contained within the group. Unlike the existing red notification badges on the Home Screen, these new badges on the Lock Screen are translucent.

Updated Calculator history. The history panel introduced in iOS 18.0 has moved: it now resides in a floating sheet that appears above the keypad instead of in the sidebar.

Calculator History in iOS 18.0 (left) and iOS 18.1 (right)

RCS Business Messaging. Apple already baked RCS support into the Messages app in iOS 18. Now, the company is expanding it to allow businesses to communicate with their customers over RCS. Keep in mind that RCS Business Messaging will only work if your carrier supports RCS messaging in the first place.

Support for wired Xbox controllers. Since Xbox controllers use a custom fast USB protocol called GIP instead of the standard protocol used by other controllers, they previously couldn’t be used in wired mode with Apple devices. Now, in iOS and iPadOS 18.1 (as well as macOS Sequoia 15.1), Xbox controllers can be used with a wire, just like any other supported controller.
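
For developers, nothing about this should require special handling: wired or wireless, controllers surface through the same GameController framework APIs. Here’s a minimal sketch of how an app typically observes them (the print statements stand in for real game logic):

```swift
import GameController

// Minimal sketch: controllers – wired or wireless – are observed through
// the same GameController framework notifications and profiles.
final class ControllerObserver {
    private var token: NSObjectProtocol?

    func start() {
        token = NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect,
            object: nil,
            queue: .main
        ) { notification in
            guard let controller = notification.object as? GCController else { return }
            print("Connected: \(controller.vendorName ?? "Unknown controller")")

            // Hook up a button on the extended gamepad profile.
            controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
                if pressed { print("A pressed") }
            }
        }
    }
}
```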

Send Game Center invitations directly from the Contacts app. Previously, you could only invite a friend to Game Center via its dedicated section in the Settings app. In iOS and iPadOS 18.1, you can send a Game Center invitation directly from the Contacts app.

Share songs from Apple Music to TikTok. Just as you’ve been able to with Instagram, you can now share songs from the Apple Music app to TikTok via the share sheet. To share a song to TikTok, long press on any track, tap ‘Share’, then select the TikTok app from the share sheet.


That’s it for iOS and iPadOS 18.1. Overall, despite the focus on its new Apple Intelligence features, this release still includes a significant list of non-AI changes and additions. Still, it will be interesting to see how Apple continues to update its operating systems over the next year while managing to ship its upcoming waves of Apple Intelligence features.

You can update your device to iOS and iPadOS 18.1 today by navigating to Settings → General → Software Update.


New Developer Betas Released for iOS, iPadOS, and macOS with Image Playground, ChatGPT Integration, and More Apple Intelligence Features https://www.macstories.net/news/new-developer-betas-released-for-ios-ipados-and-macos-with-image-playground-chatgpt-integration-and-more-apple-intelligence-features/ Wed, 23 Oct 2024 17:11:32 +0000 https://www.macstories.net/?p=76992

iOS 18.1, iPadOS 18.1, and macOS 15.1 aren’t quite out the door, but Apple has already updated its developer betas with the next round of upcoming Apple Intelligence features. Developer betas of iOS 18.2, iPadOS 18.2, and macOS 15.2 are now available for download and include the following:

  • image generation in the form of Image Playground and Image Wand;
  • Genmoji (iOS and iPadOS only);
  • Visual Intelligence (iPhone 16 line only);
  • ChatGPT integration with Siri; and
  • new text manipulation features.
Image Playground. Source: Apple.

Image Playground is a feature that allows you to create images in two styles using in-app themes and other tools. Image Playground is available in apps like Messages, Freeform, Pages, and Keynote, but it’s also a standalone app. Regardless of where you use it, Image Playground looks like it’s designed to make it easy to create animated and sketch-style images using a variety of tools such as suggested concepts that pull from the context the image is created in, like a Messages thread. Creations can be previewed, there’s a history feature that allows you to undo changes made to images, and images are saved to an Image Playground Library that syncs across devices via iCloud.

Image Wand. Source: Apple.

Image Wand, which appears in the Apple Pencil tool palette, takes a rough hand-drawn sketch, photo, or note and turns any of them into an image similar to one created by Image Playground. Image Wand can be further refined by adding text, and if you circle a blank space, it will use surrounding text to build an image.

Also, Genmoji – which is only in the iOS and iPadOS betas for now – allows you to create emoji-style images that can be used in Messages and other apps as decorative stickers. Inputs can include a text description, people in your contacts, friends and family recognized in Photos, and characters created from whole cloth.

Visual Intelligence has been added to the Camera Control on the iPhone 16 line too. The feature lets you look up details about a place and work with text, copying, reading, summarizing, and translating it.

The next betas also integrate ChatGPT into Siri. As demoed at WWDC, you can opt to pose queries to ChatGPT without disclosing your identity or IP address and without the prompts being used to train OpenAI’s large language models. The ChatGPT integration is free and does not require an account with OpenAI either.

Writing Tools lets you describe your text changes in iOS 18.2, iPadOS 18.2, and macOS 15.2.

Finally, Apple has built a new Writing Tool that provides additional flexibility when manipulating text. From the Writing Tools UI, you’ll be able to submit a prompt to alter any text you’ve written. For instance, you could have Apple Intelligence make you sound more excited in your message or rewrite it in the form of a poem, neither of which is possible with the Writing Tools found in iOS and iPadOS 18.1 or macOS 15.1.

For developers, there are also new APIs for Writing Tools, Genmoji, and Image Playground.

As we’ve covered before, Apple’s AI models have been trained on a mix of licensed data and content from the web. If you’re a publisher or a creator who doesn’t want to be part of those models, you can opt out, but it doesn’t work retroactively. In other words, opting out won’t remove any data already ingested by Apple’s web crawlers, but it will work going forward.

I’m not a fan of generative AI tools, but I am looking forward to finally going beyond tightly controlled demos of these features. I want to see how well they work in practice and compare them to other AI tools. Apple appears to have put a lot of guardrails in place to avoid some of the disasters that have befallen other tech companies, but I’m pretty good at breaking software. It will be interesting to see how well these tools hold up under pressure.


Using Apple Journal to Track Home Screen Setups https://www.macstories.net/linked/using-apple-journal-to-track-home-screen-setups/ Sat, 28 Sep 2024 19:06:07 +0000 https://www.macstories.net/?p=76743

I love this idea by Lee Peterson: using Apple’s Journal app (which got some terrific updates in iOS 18) to track your Home Screen updates over time.

Every so often, I see screenshots from people on Threads or Mastodon showing their Home Screens from over a decade ago. I routinely delete screenshots from my Photos library, and it bums me out that I never kept a consistent, personal archive of my ever-changing Home Screens over the years. Lee’s technique, which combines Journal with the excellent Shareshot app, is a great idea that I’m going to steal. Here’s my current Home Screen on iOS 18:

My iOS 18 Home Screen.

As you can see, I’m trying large icons in dark mode and there are some new entries in my list of must-have apps. The Home Screen is similar, but a bit more complex, on iPadOS, where I’m still fine-tuning everything to my needs.

I plan to write about my Home Screens and Control Center setup in next week’s issue of MacStories Weekly. In the meantime, I’m going to follow Lee’s approach and begin archiving screenshots in Journal.

→ Source: ljpuk.net

Control Center and Lock Screen Controls for iOS 18: A Roundup of My Favorite Indie Apps https://www.macstories.net/roundups/control-center-and-lock-screen-controls-a-roundupof-my-favorites/ Wed, 18 Sep 2024 18:59:14 +0000 https://www.macstories.net/?p=76613

This week, Apple released iOS and iPadOS 18 to the world. One of the main new features this year is the ability to fully customize Control Center. And not only is Control Center customizable, but it now also supports controls from third-party applications. If you open the new Controls Gallery in iOS and iPadOS 18, you will find controls and toggles from some of your favorite indie apps that have been updated to support the new release.

In addition to being available in Control Center, every one of these third-party controls can be mapped to the Action button on the iPhone 15 Pro or newer, and they can be used to replace the two default controls at the bottom of the Lock Screen – Flashlight and Camera – which have been there since the introduction of the iPhone X in 2017.

While you may think at first that there’s only so much you can do with a simple toggle in Control Center, the range of possibilities that this enables is actually pretty wide. That is why, today, I’m taking a look at a selection of apps that have been updated to offer their own controls for Control Center and the Lock Screen. They’re all unique, and some of them are unexpectedly powerful.

Let’s jump in.

Obscura

Instead of saving it for last, I want to immediately start with my favorite new control for iOS 18 so far. Obscura by Ben Rice McCarthy is an amazing camera app that I’ve been using on and off for the past few years. The app is beautifully designed, features advanced controls for taking the perfect picture, and includes great color filters that stand out to me as some of the most pleasing I’ve ever seen in an alternative camera app.

However, Obscura has always suffered from an issue that all third-party camera apps on iOS have had until today: they were never as easy to launch as the stock Camera app. Obscura already offered a Lock Screen widget that you could place below the clock, but it was never as accessible as the default Camera control at the bottom of the screen. Now, you can swap the default Camera launcher with one for Obscura.

When its control is pressed on the Lock Screen, Obscura immediately launches into the viewfinder. And just like with Apple’s Camera app, you can start taking pictures without the need to authenticate with Face ID first. Over the past few days of testing, this Lock Screen control has made a huge difference for me. I almost exclusively use Obscura to take photos now.

Perhaps the key takeaway here is that this significant impact on my photo-taking habits only highlights the importance of EU users’ upcoming ability to set a default camera app on iOS and iPadOS. It seems clear to me now that alternative camera app developers have been unfairly disadvantaged by the way Apple has prioritized its own Camera app throughout the system for so many years — which is why I am so glad that alternatives like Obscura now have a chance to compete on equal ground.

Obscura is available for free on the App Store. All of its features can be unlocked with a $3.99/month or $19.99/year subscription.

Chronicling

Chronicling by Rebecca Owen is an everything tracker for iOS and iPadOS. While I haven’t used the app in a while, I’m happy to report that I’ve gotten back into it recently in an attempt to use it to log my water intake and some of my medication.

After setting up your own custom categories in the app, you now have the ability to add them to Control Center. There, tapping a category’s control will immediately log an event for it. If you use a medium-sized control, it will even display the date and time the category was last updated. I’ve found it extremely useful to be able to check on a category and immediately update it simply by going into Control Center.

This is a recurring theme with apps adding their own controls to Control Center: this new system feature easily turns an app’s core functionality into a quick toggle that can be accessed from wherever you are in iOS. For tracking apps like Chronicling, it’s a huge usability improvement.

Chronicling is available for free on the App Store. All of its features can be unlocked with a $0.99/month or $9.99/year subscription.

Longplay

Longplay is an amazing music player for iOS and iPadOS that John previously reviewed on MacStories. For those unfamiliar, Longplay is an alternative frontend for Apple Music that presents your music library as a beautiful album wall. The app puts an emphasis on listening to your favorite music one album at a time, from start to finish. Over the years, Longplay has become one of my favorite ways to browse through my library and rediscover music. I’m especially a big fan of its Home Screen widgets, which let me quickly start listening to a random album at any time of the day.

With iOS 18, Longplay is adding a ‘Play Random Album’ control that you can add to either Control Center or the Lock Screen. With just a tap of its control, Longplay starts playing an album from your library in the background, without even having to launch the app. Just like its Home Screen widgets, you can customize the app’s control to filter the selection that albums are chosen from. Once again, it’s a simple idea that makes the app’s core functionality accessible from anywhere on iOS. If I had an Action button (I’m currently daily driving the iPhone 14 Pro), I would definitely map it to this new Longplay control.

Longplay is available on the App Store for $5.99.

Control Mark

Control Mark by Arno Appenzeller is an exciting option if you’re looking to pin a specific website in Control Center. This new app lets you turn any URL into a custom control.

I’ve used it to create a button in Control Center that opens my Watch Later playlist on YouTube, which is useful for quickly catching up on my video queue. Unfortunately, the app doesn’t currently support custom app URL schemes, and it doesn’t seem to support Universal Links either. I’m not sure if this is due to a system limitation or not, but the result is that Control Mark will only open URLs in Safari.

Opening my YouTube Watch Later playlist from Control Center thanks to Control Mark.

While I’m hoping Arno can update Control Mark to support custom URL schemes and unlock the ability to open in-app links, it’s still a great option if you need to add a MacStories launcher to your Lock Screen.

Control Mark is available on the App Store for $0.99.

Noir

Noir has become one of my essential Safari extensions on all of Apple’s platforms. The app elegantly and automatically darkens websites when dark mode is enabled on your system. This week, Noir added a new control that lets you quickly toggle the extension on and off from Control Center.

Even though I admire the way Noir handles its styling, browser extensions that enforce a dark mode on the web always give mixed results. For this reason, Noir has an option to let you disable the extension on a per-website basis, and the app also includes a toggle for turning it off in Safari’s extension menu. However, on iOS and iPadOS, the extension menu is not as easily accessible as it is on the Mac; being able to access this toggle from Control Center is a great improvement.

Noir is available on the App Store for $2.99.

Shareshot

Shareshot is an alternative to Federico’s Apple Frames shortcut that John reviewed last month on MacStories. The app lets you add a device frame to your screenshots and customize how they look. With its iOS 18 update this week, Shareshot now features a new control called ‘Frame Last Screenshot’. As its name suggests, tapping this control will immediately open the app and frame your last saved screenshot. From there, you can start customizing your framed screenshot or save it right away.

I still use Apple Frames for most of my device-framing needs, but in my case, this new control speeds up the process so much that it has replaced Apple Frames in some circumstances. I highly recommend giving it a try.

Shareshot is available on the App Store for free. You can remove the watermark from framed screenshots and access other premium features by subscribing to Shareshot Pro for $1.99/month or $14.99/year.

Fantastical, Cardhop, and Remind Me Faster

Fantastical’s New Event sheet (left) and Remind Me Faster’s main view (center) can quickly be accessed from their new controls in Control Center (right).

Fantastical by Flexibits is my calendar app of choice. If you’re a Comfort Zone listener, you know that I’m not big on productivity apps. I struggle with task managers, and my calendar is the main way I keep track of important events in my daily life. For that to work, however, I need to discipline myself to actually update my calendar on a daily basis.

Fantastical’s new ‘Create Event’ control in iOS 18 has been a huge help in this regard. Adding it to my Lock Screen has removed a significant roadblock to actually remembering to add events to my calendar. The control is always right there whenever I pick up my phone.

Additionally, the app comes with two other new controls: ‘Open Fantastical’ and ‘Search in Fantastical’.

Cardhop is Flexibits’ companion app to Fantastical, and its main purpose is to be a superpowered alternative to Apple’s Contacts app. Cardhop has been updated to add three new controls: ‘Add Contact’, ‘Scan Card’, and ‘Search’. I can imagine them being super useful if you’re the kind of person who needs to update their address book on a daily basis.

Remind Me Faster is a companion utility to Reminders on iOS and iPadOS. The app’s premise is simple: it’s a faster way to create and schedule reminders with natural language input and custom time presets. This week, the app was updated to offer a new control for opening Remind Me Faster’s task creation view. Just like Fantastical’s ‘Create Event’ control, placing Remind Me Faster on the Lock Screen is a fantastic way to create reminders throughout the day without the need to go through your iPhone’s Home Screen.

Fantastical and Cardhop are available for free on the App Store. An optional subscription to Flexibits Premium costs $57/year and unlocks all features for both apps.

Remind Me Faster is available for free on the App Store. An optional $1.49 in-app purchase unlocks natural language input, and a $0.99 in-app purchase unlocks custom time presets.


There will be many more apps releasing their own custom controls for Control Center, the Lock Screen, and the Action button soon. While these new controls are often simple shortcuts to specific features of their respective apps, I’m very glad that we’ve finally reached the point where app developers are able to plug into these parts of the OS on iPhone and iPad.
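
For anyone curious what shipping one of these controls involves on the developer side, the building block is WidgetKit’s ControlWidget API paired with an App Intent. Here’s a minimal, hypothetical sketch – the identifier, intent, and camera example below are invented for illustration and aren’t taken from any of the apps above:

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical example of an iOS 18 Control Center/Lock Screen control:
// a WidgetKit ControlWidget that runs an App Intent when tapped.
struct OpenViewfinderControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.camera.open-viewfinder") {
            ControlWidgetButton(action: OpenViewfinderIntent()) {
                Label("Open Camera", systemImage: "camera")
            }
        }
    }
}

// The intent the control runs; openAppWhenRun launches the app,
// which is how camera-style controls jump straight into a viewfinder.
struct OpenViewfinderIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Viewfinder"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        .result()
    }
}
```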

In my mind, this raises the question: how long until we can add these controls to the Home Screen? After all, legacy widgets used to be confined to the Today View, but Apple has since allowed us to place modern widgets on the Home Screen, making it more modular than ever. What if Apple could build on the success of third-party controls just like it did with Home Screen widgets? Let me toggle Noir, log my medication, and create a calendar event all with a single tap right from the Home Screen and build the ultimate modular setup.


Chris Lawley’s iOS and iPadOS 18 Walkthrough https://www.macstories.net/linked/chris-lawleys-ios-and-ipados-18-walkthrough/ Wed, 18 Sep 2024 13:03:31 +0000 https://www.macstories.net/?p=76632

It’s been an unprecedented week for Apple’s OSes, with updates to every OS landing at the same time at the beginning of the week. Today we’ll publish our fourth and final OS review with Devon Dundee’s visionOS review, which means I’m finally getting a chance to catch my breath and enjoy what others have to say about Apple’s OSes.

If you haven’t seen it, Chris Lawley, co-host of Comfort Zone here on MacStories, has a fantastic walkthrough of iOS and iPadOS 18 that covers everything from Home and Lock Screen customization and the all-new Control Center to updates to system apps like Freeform, Shortcuts, Safari, and Messages. The video is especially good if you’ve had a busy week and want to get up to speed on iOS and iPadOS 18 quickly.

Chris has included a lot of excellent lesser-known tips in his video that will help you get the most out of the OS updates too.

→ Source: youtu.be

iOS and iPadOS 18 Review Extras: eBooks, Wallpapers, Screen Saver, and a Special Edition of MacStories Weekly https://www.macstories.net/news/ios-18-review-extras-club/ Mon, 16 Sep 2024 14:31:53 +0000 https://www.macstories.net/?p=76557

A short time ago, Federico published his annual iOS and iPadOS 18 review. As in past years, we’re releasing a wide variety of perks exclusively for Club MacStories members throughout the week, but this year, the perks are a little different – just like the review.

Here’s our friend Chris Lawley with the details:

First of all, we’re releasing this week’s episode of AppStories+ to everyone who listens to the show for free.

AppStories+ is the extended version of our flagship podcast that’s typically released a day early and ad-free in high-bitrate audio. The show is part of a Club Premier or AppStories+ subscription, but this week, everyone gets the extended version of the show. If you’re not a subscriber, you’ll still get an ad with the episode (we do have bills to pay), but you’ll also get the full extended version of the episode in high-bitrate audio, just like subscribers. It’s our way of saying thanks to everyone who has listened to AppStories through the years and of sharing why this year’s review is not just different, but part of an editorial evolution of MacStories.

As Chris explains in his video, we have an eBook version of the review for all Club members. We’ll also publish a special making-of edition of MacStories Weekly, our Club newsletter, on Saturday.

If you’re not already a member, you can join Club MacStories for $5/month or $50/year using the buttons below:


A short sample of one of the six screen savers for Club Plus and Premier members.

Also, this year, there will be even more perks than ever for Club MacStories Plus and Premier members, including:

  • More eBooks of my macOS Sequoia review, Jonathan Reed’s watchOS 11 review, and Devon Dundee’s visionOS 2 review;
  • A bonus eBook that collects tips and tricks from Federico’s iOS and iPadOS 18 review;
  • High-res wallpapers of the delightful illustrations created for Federico’s review by Scout Wilkinson; and
  • A screen saver developed by James Thomson that brings Scout’s artwork to life on your Mac.

To unlock all of these additional perks, use the buttons below to join Club MacStories Plus:

or Club Premier:


Now more than ever, an indie publication like MacStories depends on income from members who want to directly support what we do and our ability to collaborate with people like Scout and James more often. As always, though, we aim to provide as much value as we can to Club members, so we have more details after the break, for anyone who wants to learn more about this year’s perks and Club MacStories before joining.

Club-Wide Review Perks

The eBook

iOS and iPadOS 18: The MacStories Review. The eBook version of Federico’s review is fully interactive, complete with all the images and videos. The review is a great read on the web, where you can enjoy its special navigation features, but the eBook is an excellent alternative, especially for sitting back with your iPad and Apple Books, where you can take notes, highlight passages, and enjoy Scout Wilkinson’s wonderful illustrations.

Also (because we are asked this a lot), if you read the eBooks in Apple Books and want to zoom in to get a close look at any of the illustrations or screenshots, double-tap images on iOS devices (or double-click on the Mac) to open a full-sized version.

Federico’s iOS and iPadOS 18 review is available now as a free download exclusively for members of Club MacStories, who can access it from their member downloads page.

MacStories Weekly

In Saturday’s edition of MacStories Weekly, Federico will share his annual ‘Making Of’ story. This year’s review was a big departure in format and style from prior years. In his column, Federico will explore what led him to take a different approach and what it means to MacStories and the Club.

You can join Club MacStories for $5/month or $50/year using the buttons below:

Club MacStories includes year-round perks too. In addition to the special iOS and iPadOS 18 review perks, Club MacStories features:

  • MacStories Weekly, a weekly newsletter with our favorite iOS and Mac apps, tips and in-depth automation tutorials, exclusive stories, interviews, and more.
  • The Monthly Log, a monthly newsletter with behind-the-scenes stories from the MacStories team delivered at the end of each month.
  • An early, ad-free, high-bitrate audio version of MacStories Unwind, the podcast that Federico and I record weekly. MacStories Unwind is a fun exploration of the differences between American and Italian culture and recommends media we enjoy, including books, movies, TV shows, music, and videogames.

Club MacStories Plus and Premier Perks

We have exclusive review perks for Club Plus and Premier members, too.

Three More OS eBooks and an iOS and iPadOS 18 Tips and Tricks eBook

Club Plus and Premier members get eBooks of every OS.

We’ve created eBook versions of my macOS Sequoia review, Jonathan Reed’s watchOS 11 review, and Devon Dundee’s visionOS review. All three books will be released over the course of the next two days alongside the publication of the related reviews on MacStories. As with Federico’s review, the eBook versions of the other OS reviews are a fantastic way to sit back and soak in the details of everything that Apple released today for the Mac, Apple Watch, and Vision Pro.

Federico has also created a bonus eBook that compiles the best iOS and iPadOS 18 tips and tricks from his review.

Club members old and new will be able to download the books from their Club downloads page.

Wallpapers and Screen Savers

One of the six screen savers created from the illustrations in Federico’s iOS and iPadOS 18 review.

Scout Wilkinson’s artwork is so wonderful that we’ve released it to Club Plus and Premier members as high-resolution wallpapers. There are a total of six colorful wallpapers for the iPad and Mac. Federico’s been using them for a while now in full resolution on his Lock Screen and blurred on his Home Screen, and they look fantastic.

Also, Procreate automatically generates a time-lapse of an artist’s work, and when Scout showed us the time-lapses of the review’s illustrations, we were mesmerized. With help from James Thomson, we’ve brought those time-lapses to life for Club Plus and Premier members as macOS screen savers. There are six total screen savers that can be played at random, or you can pick your favorite to play every time your Mac’s screen sleeps. There are even speed controls for the playback.

These screen savers are really something special. It’s fascinating to watch Scout’s creative process play out in front of you in high-resolution as the illustrations take shape from rough sketches to the polished finished product.

By joining Club Plus or Premier, you’ll get everything: Federico’s eBook, our special edition of MacStories Weekly, three additional eBooks of our other OS reviews, full-res illustrated wallpapers, and macOS screen savers built from Scout Wilkinson’s artwork.

Join Club MacStories Plus:

Join Club Premier:

Of course, as a Club Plus or Premier member, you’ll also receive access to our vibrant Discord community, special audio events that are held in the Discord community after Apple events and released as a Club-only podcast, bonus columns from Federico and me, an advanced version of the Club website that includes advanced search and filtering controls, custom RSS feeds of Club articles, and more. And, for just $2/month or $20/year more, Premier members get AppStories+ too, the extended, ad-free, high-bitrate audio version of our flagship podcast, which is released early most weeks.


Today is the culmination of a lot of work by Federico and the entire MacStories team. Every year, we try to make our reviews extra special for readers of the site and Club members who want even more of the iOS, iPadOS, macOS, watchOS, and visionOS coverage that we’re known for. Thanks as always for reading MacStories and to our Club members who help us continue to do what we love.

Also, a special thanks to Scout Wilkinson and James Thomson for bringing Federico’s review to life this year. This review means a lot to us because it’s more than just a review of iOS and iPadOS 18. It also represents our vision of the future of the web as a place where individual creativity thrives and is celebrated. Our mission is to do everything we can to help make that a reality, and we hope you’ll consider helping us achieve our goal by signing up for the Club.

Thanks, and happy reading from the entire MacStories team.


iOS and iPadOS 18: The MacStories Review https://www.macstories.net/stories/ios-and-ipados-18-the-macstories-review/ Mon, 16 Sep 2024 14:30:31 +0000 https://www.macstories.net/?p=76527

There is still fun beyond AI.

iOS 18 After One Month: Without AI, It’s Mostly About Apps and Customization https://www.macstories.net/stories/ios-18-public-beta-preview/ Mon, 15 Jul 2024 20:22:37 +0000 https://www.macstories.net/?p=76034

iOS 18 launches in public beta today.

My experience with iOS 18 and iPadOS 18, launching today in public beta for everyone to try, has been characterized by smaller, yet welcome enhancements to Apple’s productivity apps, a redesign I was originally wrong about, and an emphasis on customization.

There’s a big omission looming over the rollout of these public betas, and that’s the absence of any Apple Intelligence functionalities that were showcased at WWDC. There’s no reworked Siri, no writing tools in text fields, no image generation via the dedicated Image Playground app, no redesigned Mail app. And that’s not to mention the AI features that we knew were slotted for 2025 and beyond, such as Siri eventually becoming more cognizant of app content and gaining the ability to operate more specifically inside apps.

As a result, these first public betas of iOS and iPadOS 18 may be – and rightfully so – boring for most people, unless you really happen to care about customization options or apps.

Fortunately, I do, which is why I’ve had a pleasant time with iOS and iPadOS 18 over the past month, noting improvements in my favorite system apps and customizing Control Center with new controls and pages. At the same time, however, I have to recognize that Apple’s focus this year was largely on AI; without it, it feels like the biggest part of the iOS 18 narrative is missing.

As you can imagine, I’m going to save a deeper, more detailed look at all the visual customization features and app-related changes in iOS and iPadOS 18 for my annual review later this year, where I also plan to talk about Apple’s approach to AI and what it’ll mean for our usage of iPhones and iPads.

For now, let’s take a look at the features and app updates I’ve enjoyed over the past month.

Apps

There are lots of app-related improvements in iOS 18, which is why I’m looking forward to the next few months on AppStories, where we’ll have a chance to discuss them all more in-depth. For now, here are my initial highlights.

Reminders and Calendar, Together At Last

I almost can’t believe I’m typing this in 2024, but as things stand in this public beta, I’m very excited about…the Calendar app.

In iOS and iPadOS 18, the Calendar app is getting the ability to show your scheduled reminders alongside regular calendar events. I know, I know: that’s not the most incredible innovation since apps like Fantastical have been able to display events and tasks together for over a decade now. The thing is, though, with time I’ve realized that I don’t need a tool as complex as Fantastical (which is a great app, but its new business-oriented features are something I don’t need); I’m fine with Apple’s built-in Calendar app, which does everything I need and has an icon on the Home Screen that shows me the current date.

Enable the ‘Scheduled Reminders’ toggle in the app’s Calendars section, and all your scheduled tasks from the Reminders app will appear alongside events in the calendar. You can tap a reminder inside Calendar to mark it as complete or use drag and drop to reschedule it to another day. As someone who creates only a handful of calendar events each week but gives every task a due date and time, I find it very helpful to have a complete overview of my week in one place rather than having to use two apps for the job.
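
This kind of merged view makes sense when you remember that, for third-party developers, events and reminders already live in the same EventKit store. Purely as an illustration – this is not necessarily how Apple implemented it – here’s roughly how an app could assemble a combined day overview, assuming calendar and reminders access has already been granted:

```swift
import EventKit

// Illustrative sketch: pull today's calendar events and incomplete
// reminders from the same EKEventStore to build a merged day view.
// (Assumes full access to events and reminders was requested earlier.)
func fetchTodayAgenda(from store: EKEventStore) async -> (events: [EKEvent], reminders: [EKReminder]) {
    let calendar = Calendar.current
    let start = calendar.startOfDay(for: .now)
    let end = calendar.date(byAdding: .day, value: 1, to: start) ?? start

    // Events come back synchronously from a date-range predicate.
    let eventPredicate = store.predicateForEvents(withStart: start, end: end, calendars: nil)
    let events = store.events(matching: eventPredicate)

    // Reminders are fetched asynchronously with a due-date predicate.
    let reminderPredicate = store.predicateForIncompleteReminders(
        withDueDateStarting: start,
        ending: end,
        calendars: nil
    )
    let reminders: [EKReminder] = await withCheckedContinuation { continuation in
        store.fetchReminders(matching: reminderPredicate) { results in
            continuation.resume(returning: results ?? [])
        }
    }

    return (events, reminders)
}
```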

Reminders alongside events in the Calendar app.

The integration even extends to Calendar’s Home Screen widgets, including the glorious ‘XL’ one on iPad, which can now show you reminders and events for multiple days at a glance.

Reminders are also displayed in the XL Calendar widget on iPad. iOS and iPadOS 18 also include the ability to resize widgets without deleting and re-adding them from scratch every time.

Historically speaking, this is not the first time we’ve seen a direct, two-way communication between Apple apps on iOS: in iOS 12, Apple brought News articles to the Stocks app, which I covered in my review at the time as an exciting opportunity for system apps to cross-pollinate and for Apple to provide iOS users with an experience greater than the sum of its parts. This year, the company is going to much greater lengths with the same idea. Not only is Reminders data appearing inside Calendar, but tasks are interactive in the Calendar app, to the point where you can access the full-blown Reminders task creation UI from inside Calendar:

The Reminders UI embedded within Calendar.

Does this mean the Calendar app has become the one productivity app to rule them all for me? Obviously not, since there are still plenty of reasons to open the standalone Reminders app, if only to browse my various lists and smart lists or use specific features like tagging and rich links. But the idea of providing a common ground between the two apps is solid, and as a result, I find myself spending more time managing my week inside the Calendar app now.

As we’ll see later this year, these two apps aren’t the only ones becoming capable of talking to each other in iOS 18: Notes and Calculator will also get the ability to share Math Notes and allow users to edit the same document in two distinct places. This is a trend worth keeping an eye on.

Speaking of Reminders, there are a handful of improvements in the app I want to mention.

For starters, subtasks now appear in the ‘Today’ and ‘Scheduled’ lists as well as custom smart lists. Previously, reminders nested within a parent reminder would be hidden from those special views, which – again, as someone who schedules everything in his task manager – hindered the utility of subtasks in the first place. To give you a practical example, I have an automation that creates a new list for the next issue of MacStories Weekly every week (which I adapted from my Things suite of shortcuts), and one of the tasks inside that list is an ‘App Debuts’ parent reminder. When I come across an interesting app or update during the week, I save it as a subtask of that reminder. In iOS and iPadOS 18, those subtasks can appear in the ‘Today’ view on Saturday morning, when it’s time to assemble MacStories Weekly.

Subtasks for a reminder in a specific list (left) can now appear in the Today page and other smart lists.

Although the ‘Today’ default smart list still doesn’t support columns, it now lets you customize the order of its default sections: all-day, overdue, and timed reminders.

You can now customize the order of sections in the Today page.

I’m also intrigued by Apple’s promise of new Shortcuts actions for Reminders, though I suspect the work is unfinished and we’re only seeing partial results in this public beta. There is a new ‘Create Reminder’ action in Shortcuts (which I can only see on my iPhone, not on the iPad) that exposes more options for task creation than the old ‘Add New Reminder’ action.

Namely, this action now lets you enter properties for a list’s sections and assignees; strangely enough, the action doesn’t contain an option to enter a URL attachment for a task, which the previous action offered. I’m guessing that, as part of Apple Intelligence and with the ultimate goal of making Siri more integrated with apps, Apple is going to retool a lot of their existing Shortcuts actions. (It’s about time.) I wouldn’t be surprised if more apps follow Reminders in modernizing their Shortcuts integrations within the iOS 18 cycle because of Apple Intelligence.
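
To make the idea of retooled actions a bit more concrete: modern Shortcuts actions are built on the App Intents framework, where every parameter an action exposes is declared in code. Here’s a hypothetical sketch for a third-party task manager – the intent, parameters, and names below are invented for illustration, not Apple’s Reminders action:

```swift
import AppIntents

// Hypothetical sketch of a modern, App Intents-based Shortcuts action for a
// third-party task manager. Each @Parameter becomes a field in the Shortcuts
// editor, and the returned value can flow into subsequent actions.
struct CreateTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Task"
    static var description = IntentDescription("Adds a new task with optional scheduling details.")

    @Parameter(title: "Title")
    var taskTitle: String

    @Parameter(title: "Due Date")
    var dueDate: Date?

    @Parameter(title: "Notes")
    var notes: String?

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // A real app would save the task to its own data store here.
        let summary: String
        if let dueDate {
            summary = "\(taskTitle), due \(dueDate.formatted(date: .abbreviated, time: .shortened))"
        } else {
            summary = taskTitle
        }
        return .result(value: summary)
    }
}
```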

Passwords: The “Finally” of the Year

At long last – and the finally is deserved here – Apple has made a standalone Passwords app for iOS, iPadOS, and macOS. I (and many others) have been arguing in favor of cutting password management out of Settings to let it blossom into a full-blown app for years now; I’m not kidding when I say that, on balance, the addition of the Passwords app has been the most important quality-of-life improvement in iOS 18 so far.

I moved all of my passwords out of 1Password and into iCloud Keychain just before WWDC (following Simon’s excellent guide). As soon as I updated to iOS 18, they were all transferred to the Passwords app, and I didn’t have to do anything else. iCloud Keychain already supported key password management features like verification codes; with the Passwords app, you don’t need to go hunt for them inside Settings anymore thanks to a more intuitive UI that also adds some welcome options.

The Passwords app has a design that’s reminiscent of Reminders, with pinned sections at the top for passkeys, logins that have one-time codes, Wi-Fi networks, security recommendations, and deleted items. With the exception of the Wi-Fi section, these aren’t new features, but the presentation makes them easier to find. These sections are followed by your shared groups, which aren’t new either, but are more discoverable and prominent. The design of login item pages is consistent with iOS 17’s old iCloud Keychain UI, but the app now supports multiple URLs for the same login; the main list view also includes sorting options.

The Passwords app is my favorite addition of the year.

My favorite detail of the Passwords app, however, is this: if you need to quickly copy an item’s username, password, code, or URL, you can simply right-click it in the main view and copy what you need:

This is a good menu.

The best thing I can say about Passwords is that it obliterated a particular kind of frustration from my life. With Apple’s app, I don’t need to worry about my password manager not working anymore. In fact, I’d argue that Passwords’ strongest quality is that you never think about it that much, and that means it’s doing its job.

Those who, like me, come from more than a decade of using 1Password and have witnessed the app’s slow descent into instability know what I’m talking about: having to constantly worry about the app’s Safari extension not working, search not bringing up results inside the app, or the extension auto-filling the wrong information on a page. With Passwords, all these issues have evaporated for me, and I can’t describe how happy it makes me that I just don’t have these thoughts about my password manager anymore.

Don’t get me wrong; there are features of 1Password that Apple’s Passwords app can’t match yet, and that I’m guessing will be the reason why some folks won’t be able to switch to it just yet. The biggest limitation, in my opinion, is the lack of secure attachments: if you want to store a file (like, say, a PDF document or an encryption key) associated with a login item, well, you can’t with Passwords yet.

These limitations need to be addressed, and now that Passwords is a standalone experience, I’m more confident that Apple will have the headroom to do so rather than having to cram everything into a Settings page. Moving from 1Password to the Passwords app has been one of the most useful tech-related migrations I’ve made in recent memory; if you’re on the verge and primarily use Apple devices, I highly recommend taking the time to do it.

The New Photos App: Tradition, Discovery, and Customization

Beyond Apple Intelligence, I’d argue that the most important change of iOS 18 – and something you can try right now, unlike AI – is the redesigned Photos app. As I shared on MacStories a couple of weeks ago, I was initially wrong about it. Having used it every day for the past month, not only do I think Apple is onto something with their idea of a single-page app design, but, more importantly, the new app has helped me rediscover old memories more frequently than before.

The concept behind the new Photos app is fairly straightforward to grasp, yet antithetical to decades of iOS UI conventions: rather than organize different app sections into tabs, everything has been collapsed into a single page that you can scroll to move from your library to various kinds of collections and suggestions. And that’s not all; this fluid, single-page layout (which is powered by SwiftUI) is also highly customizable, allowing users to fine-tune which sections they want to see at the top in the main carousel and which pinned collections, albums, or trips they want to see further down the page.

The new Photos UI on iPad.

It’s easy to understand why the move from a tabbed interface to a unified single-page design may – at least in theory – bolster discovery inside the app: if people aren’t using the app’s other sections, well, let’s just move all those sections into the screen we know they’re using. Or, think about it this way: we already spend hours of our lives discovering all kinds of digital information – news, music, memes, whatever – by scrolling. Why not use the same gesture to rediscover photos in our libraries, too? (The opposite of doomscrolling, if you will.)
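
For the developers in the audience, here’s a toy SwiftUI sketch of what a single-page, scrollable layout in this spirit can look like: a photo grid pinned at the top of one ScrollView, with horizontally scrolling collection sections underneath. It’s purely illustrative – a sketch of the layout idea, not Apple’s actual implementation:

```swift
import SwiftUI

// Toy sketch of a single-page, scrollable library in the spirit of the new
// Photos app: a photo grid at the top, collection sections underneath.
struct PhotoCollection: Identifiable {
    let id = UUID()
    let title: String
    let items: [Image]
}

struct SinglePageLibraryView: View {
    let photos: [Image]
    let collections: [PhotoCollection]

    private let gridColumns = Array(repeating: GridItem(.flexible(), spacing: 2), count: 3)

    var body: some View {
        ScrollView {
            // The traditional grid of recent photos.
            LazyVGrid(columns: gridColumns, spacing: 2) {
                ForEach(photos.indices, id: \.self) { index in
                    photos[index]
                        .resizable()
                        .scaledToFill()
                        .frame(height: 120)
                        .clipped()
                }
            }

            // Collections (Recent Days, People & Pets, trips, and so on)
            // continue on the same page as you keep scrolling.
            ForEach(collections) { collection in
                VStack(alignment: .leading) {
                    Text(collection.title)
                        .font(.title3.bold())
                    ScrollView(.horizontal, showsIndicators: false) {
                        HStack {
                            ForEach(collection.items.indices, id: \.self) { index in
                                collection.items[index]
                                    .resizable()
                                    .scaledToFill()
                                    .frame(width: 140, height: 140)
                                    .clipShape(RoundedRectangle(cornerRadius: 12))
                            }
                        }
                    }
                }
                .padding(.horizontal)
            }
        }
    }
}
```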

What I think Apple designers have achieved with the new Photos app – and what I will explore more in-depth later this year in my review – is balance between tradition, discovery, and customization. By default, the new Photos app shows your usual grid of recent photos and videos at the top, occupying roughly 60% of the screen on iPhone. Immediately below, the app will automatically compile smart collections for recent days (where only the best shots are highlighted, removing stuff like screenshots and receipts) as well as your favorite people and pets. So if you’re looking for the photo you just took, you can still find it in the grid, but there’s also a chance something else may catch your eye down the page.

Photos’ new UI.

The grid can be expanded to full-screen with a swipe, which reveals a new segmented control to enable the Years and Months views as well as a menu for sorting options and new filters to exclude media types such as screenshots and videos. The transition from half-grid to full-grid is incredibly smooth and pleasant to look at.

Expanding the grid reveals new filters.

So that’s tradition: if you want to keep using the Photos app as a grid of photos, you can, and the app supports all the features you know and love in the grid, such as long presses to show context menus and drag and drop. This is where Photos bifurcates from the past, though: if you want, at this point, you can also keep scrolling to discover more, or you can spend some time customizing the look of the app to your needs and preferences.

There are a lot of recommended sections (and more coming with AI in the future) and customization options – too many for me to cover in this preview article today. Allow me to highlight just a few. The main grid at the top of the app? That’s actually a carousel that you can swipe horizontally to move from the grid to other “pinned” sections, and you can customize which collections are displayed in here. In my app, I put my favorites, videos, photos of my dogs, featured photos, and screenshots (in this order) after the grid. This way, I can move to the right if I want to discover old favorites and memories, or I can quickly move to the left to find all my screenshots. Once again: it’s all about tradition, discovery, and customization.

Customizing the carousel.

I’ve become a huge fan of Recent Days, which is a section that follows the grid, automatically groups photos by day, and sort of serves as a visual calendar of your life. Apple’s algorithm, in my experience, does a great job at picking a key photo from a particular day, and more often than not, I find myself swiping through this section to remember what I did on any given day.

I also like the ability to customize Pinned Collections, which is another section on the page and, effectively, a user-customizable space for shortcuts to your Photos library. You can pin anything in here: media types, specific albums, specific trips (which are curated by iOS 18), photos of your pets, photos of people and groups of people (also new in iOS 18), and more.

Recent Days and Pinned Collections.

I’ll save more comments and tidbits on the redesigned Photos app for my iOS and iPadOS 18 review later this year. For now, though, I’ll say this: a month ago, I thought Apple was going to revert this drastic redesign like they did with Safari three years ago; now, I think Apple has created something special, and they should be diligent enough to iterate and listen to feedback, but also stick to their ideas and see this redesign through. The new Photos app allows me to see recently-taken pictures like before; at the same time, it gives me an easier, less disorienting way to discover forgotten moments and memories from my library that are continuously surfaced throughout the app. And at any point, I can choose to customize what I see and shape the app’s experience into something that is uniquely mine.

I was skeptical about iOS 18’s Photos app at first, but now I’m a believer.

User Customization: Home Screen Icons and Control Center

Apple’s progressive embrace of user customization on its mobile platforms isn’t new. (We could trace the company’s efforts back to iOS 12’s rollout of Shortcuts and, of course, iOS 14’s Home Screen widgets.) For the first time in years, however, I feel like some of Apple’s customization features in iOS 18 aren’t meant for people like me at all. Fortunately, there’s another aspect to this story that is very much up my alley and, in fact, the area of iOS 18 I’m having the most fun tweaking.

Let’s start with the part that’s not for me this year: Home Screen customization and icon theming. At a high level, Apple is bringing three key changes to the Home Screen in iOS 18:

  • You can now place app icons and widgets anywhere, leaving empty spaces around them.
  • You can make app icons larger, hiding their text labels in the process.
  • You can switch app icons to a special dark mode version, as well as apply any color tint you like to them.

Of these three changes, I’ve only been using the setting that makes icons larger and hides their labels. I think it makes my Home Screen look more elegant and less crowded by getting rid of app and shortcut titles underneath their icons.

Regular icons (left) compared to the new larger icon size in iOS 18.

As for the other two changes…they’re really not my kind of thing. I think the ability to freely rearrange icons on the Home Screen, going beyond the limitation of the classic grid, is long overdue and something that a lot of folks will joyfully welcome. Years ago, I would have probably spent hours making dynamic layouts for each of my Home Screen pages with a particular flow and order to their icons. These days, however, I like to use a single Home Screen page, with Spotlight and the App Library filling the gaps for everything else. And – as you’ve seen above – I like filling that Home Screen page to the brim with icons for maximum efficiency when I’m using my phone.

I don’t foresee a scenario in which I voluntarily give up free space on my Home Screen to make it more “aesthetic” – including on my iPad, where this feature is also supported, but I’d rather use the extra space there for more widgets. At the same time, I know that literally millions of other iPhone users will love this feature, so Apple is right in adding support for it. As a meme would put it, in this case, I think it’s best if people like me shut up and let other people enjoy things.

It’s a similar story with icon tinting, which takes on two distinct flavors with iOS 18. Apps can now offer a dark mode icon, a great way to make sure that folks who use their devices in dark mode all the time have matching dark icons on their Home Screens. Generally speaking, I like what Apple is doing with their system apps’ icons in dark mode, and I appreciate that there are ways for developers to fine-tune what their icons should look like in this mode. My problem is that I never use dark mode – not even at night – since I find it too low-contrast for my eyes (especially when reading), so I don’t think I’m ever going to enable dark icons on my Home Screen.

From left to right: large icons, dark mode icons (for some of my apps), and tinted icons.

The other option is icon tinting using a color picker, and…personally, I just don’t think it looks good at all. With this feature, you can effectively apply a color mask on top of every app icon and change the intensity of the color you’ve selected, thus completely disregarding the color palettes chosen by the app’s creator. To my eyes, the results look garish, to the point where even Apple’s promotional examples – historically, images that are meant to make the new version of iOS appear attractive – look awful to me. Are there going to be people out there who will manage to make a tinted layout that looks nice and that they’re going to love? I’m sure. And this is why user customization is important: we all have different tastes and needs, and I think it’s great when software doesn’t judge us for what we like (or dislike) and lets us shape the computer however we want.

I want to wrap up this story with the one customizable feature that I adore in iOS 18, and which I know is going to define my summer: the new Control Center.

In iOS 18, Control Center is becoming a multi-page, customizable affair. Controls now come in multiple sizes, and they’re powered by the same technologies that allow developers to create widgets and Shortcuts actions (WidgetKit and App Intents). This rewrite of Control Center has some deep ramifications: for the first time, third-party apps can offer native controls in Control Center, controls can be resized like widgets, and there is a Controls Gallery (similar to the Widget Gallery on the Home and Lock Screens) to pick and choose the controls you want.

The new Control Center can span multiple pages with support for resizable controls and third-party ones.

Given the breadth of options at users’ disposal now, Apple decided to eschew Control Center’s single-page approach. System controls have been split across multiple Control Center pages, which are laid out vertically (rather than horizontally, as in the iOS 10 days); plus, users can create new pages to install even more controls, just like they can create new Home Screen pages to use more widgets.

Basically, Apple has used the existing foundation of widgets and app intents to supercharge Control Center and make it a Home Screen-like experience. It’s hard for me to convey in an article how much I love this direction: app functionalities that maybe do not require opening the full app can now be exposed anywhere (including on the Lock Screen), and you get to choose where those controls should be positioned, across how many pages, and how big or small they should be.

You can customize Control Center sort of like the Home Screen by rearranging and resizing controls. There is a Controls Gallery (center) and controls can be configured like widgets, too.

If you know me, you can guess that I’m going to spend hours infinitely tweaking Control Center to accommodate my favorite shortcuts (which you can run from there!) as well as new controls from third-party apps. I’m ecstatic about the prospect of swapping the camera and flashlight controls at the bottom of the Lock Screen (which are now powered by the same tech) with new, custom ones, and I’m very keen to see what third-party developers come up with in terms of controls that perform actions in their apps without launching them in the foreground. A control that I’m testing now, for instance, starts playing a random album from my MusicBox library without launching the app at all, and it’s terrific.
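
For the curious, here’s roughly what a control like that looks like on the developer side with the new WidgetKit controls API. This is a minimal sketch with made-up names – not MusicBox’s actual code – based on the API shape Apple has shown so far:

```swift
import WidgetKit
import AppIntents
import SwiftUI

// Hypothetical App Intent that starts playback without opening the app.
struct PlayRandomAlbumIntent: AppIntent {
    static let title: LocalizedStringResource = "Play Random Album"

    func perform() async throws -> some IntentResult {
        // The app's own playback code would run here.
        return .result()
    }
}

// A button control that can be placed in Control Center, on the Lock Screen,
// or on the Action button, built on WidgetKit's controls API.
struct PlayRandomAlbumControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.play-random-album") {
            ControlWidgetButton(action: PlayRandomAlbumIntent()) {
                Label("Random Album", systemImage: "shuffle")
            }
        }
    }
}
```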

So far, the new Control Center feels like peak customization. Power users are going to love it, and I’m looking forward to seeing what mine will look like in September.

iOS and iPadOS 18

There’s a lot more I could say about iOS 18 and its updated apps today. I could mention the ability to create colored highlights and fold sections in Notes, which I’m using on a daily basis to organize my iOS and iPadOS research material. I could point out that Journal is receiving some terrific updates across the board, including search, mood logging based on Health, and integration with generic media sources (such as third-party podcast apps and Spotify, though this is not working for me yet). I could cover Messages’ redesigned Tapbacks, which are now colorful and finally support any emoji you want.

But I’m stopping here today, because all of those features deserve a proper, in-depth analysis after months of usage with an annual review this fall.

Should you install the iOS and iPadOS 18 public betas today? Unless you really care about new features in system apps, the redesigned Photos app, or customization, probably not. Most people will likely check out iOS 18 later this year to satisfy their curiosity regarding Apple Intelligence, and that’s not here yet.

What I’m trying to say, though, is that even without AI, there’s plenty to like in the updated apps for iOS and iPadOS 18 and the reimagined Control Center – which, given my…complicated feelings on the matter, is, quite frankly, a relief.

I’ll see you this fall, ready, as always, with my annual review of iOS and iPadOS.


You can also follow our 2024 Summer OS Preview Series through our dedicated hub, or subscribe to its RSS feed.


Access Extra Content and Perks

Founded in 2015, Club MacStories has delivered exclusive content every week for nearly a decade.

What started with weekly and monthly email newsletters has blossomed into a family of memberships designed for every MacStories fan.

Club MacStories: Weekly and monthly newsletters via email and the web that are brimming with apps, tips, automation workflows, longform writing, early access to the MacStories Unwind podcast, periodic giveaways, and more;

Club MacStories+: Everything that Club MacStories offers, plus an active Discord community, advanced search and custom RSS features for exploring the Club’s entire back catalog, bonus columns, and dozens of app discounts;

Club Premier: All of the above and AppStories+, an extended version of our flagship podcast that’s delivered early, ad-free, and in high-bitrate audio.

Learn more here and from our Club FAQs.

Join Now
Apple Executives on the Photos Overhaul in iOS 18 https://www.macstories.net/linked/apple-executives-on-the-photos-overhaul-in-ios-18/ Mon, 08 Jul 2024 10:41:04 +0000 https://www.macstories.net/?p=75954 Alvin Cabral, writing for The National, got a nice quote from Apple’s Billy Sorrentino on the redesigned Photos app in iOS 18:

“As our features, users and libraries have grown, so has the density of the [Photos] app. So rather than hunt and peck throughout, we’ve created a simple streamlined single view photos experience based on deep intelligence,” Billy Sorrentino, senior director at Apple’s human interface design unit, told The National.

“Ultimately, we wanted to remove friction” in how Photos is used, he added.

It’s been a few weeks since I installed iOS 18 on my primary iPhone, and I feel pretty confident in saying this: I was wrong about the new Photos app at first.

I’ll reserve more in-depth comments for the public beta and final release of iOS 18; of course, given the drastic redesign of the app, there’s also a chance Apple may scrap their plans and introduce a safer update with fewer structural changes. However, over the past few weeks, I noticed that not only do I find myself discovering more old photos in iOS 18, but the modular approach of the more customizable Photos app really works for me. I was able to fine-tune the top carousel to my liking, and I customized pinned collections with shortcuts to my favorite sections. Put simply, because of these changes, I use the Photos app a lot more and find navigating it faster than before.

Anecdotally, when I showed my girlfriend the new Photos app, she argued that the single-page design should be nicer than iOS 17’s since she never used the other tabs in the app anyway. I don’t think she’s alone in that regard, which is why I believe Apple should stick with this major redesign this time around.

→ Source: thenationalnews.com

Designing Dark Mode App Icons https://www.macstories.net/linked/designing-dark-mode-app-icons/ Thu, 13 Jun 2024 00:17:59 +0000 https://www.macstories.net/?p=75768

Apple’s announcement of “dark mode” icons has me thinking about how I would approach adapting “light mode” icons for dark mode. I grabbed 12 icons we made at Parakeet for our clients to illustrate some ways of going about it.

Before that though, let’s take some inventory. Of the 28 icons in Apple’s preview image of this feature, only nine have white backgrounds in light mode. However, all icons in dark mode have black backgrounds.

Actually, it’s worth noting that five “light mode” icons have black backgrounds, which Apple slightly adjusted to have a consistent subtle black gradient found on all of their new dark mode icons. Four of these—Stocks, Wallet, TV, and Watch—all seem to be the same in both modes. However, no other (visible) icons are.

Fantastic showcase by Louie Mantia of how designers should approach the creation of dark mode Home Screen icons in iOS 18. In all the examples, I prefer Mantia’s take to the standard black background version.

See also: Gavin Nelson’s suggestion, Apple’s Human Interface Guidelines on dark mode icons, and the updated Apple Design Resources for iOS 18.

→ Source: lmnt.me

iOS and iPadOS 18: The MacStories Overview https://www.macstories.net/stories/ios-and-ipados-18-the-macstories-overview/ Tue, 11 Jun 2024 00:24:46 +0000 https://www.macstories.net/?p=75708 Image: Apple.

At its WWDC 2024 keynote held earlier today online and with an in-person event at Apple Park in Cupertino, California, Apple officially announced the next versions of the operating systems for iPhone and iPad – iOS and iPadOS 18.

As widely speculated in the lead-up to the event, Apple’s focus for both OSes largely revolves around artificial intelligence, or as the company likes to refer to the AI acronym now, “Apple Intelligence”. The new AI features in iOS and iPadOS promise to make both operating systems, well, more intelligent than before thanks to a completely revamped Siri and proactive functionalities that learn from users’ habits and apps. Presented as a fast, private, and personal set of features that draws from the user’s context and combines it with generative models, Apple Intelligence – which will debut in U.S. English only later this year, with a beta expected later this summer – will power a variety of new system features and experiences, ranging from a revamped Siri and text analysis features to image creation, performing actions inside apps, and more.

But AI-related improvements aren’t the only new features Apple announced today. From a renewed focus on Home Screen customization and a redesigned Control Center to a new design for tab bars on iPad and expanded Tapbacks in Messages, Apple has shown that, while they can follow the rest of the tech industry in rethinking how AI can enhance how we use our devices, they can continue shipping other functionalities for iPhone and iPad, too. Or, at the very least, they certainly can for the iPhone and iOS.

We’ll have in-depth overviews for both iOS and iPadOS 18 when the public betas for each OS come out next month, and, of course, we’ll continue diving into the announcements later this week on MacStories via our WWDC 2024 hub as well as AppStories. We’ll also have a dedicated story about Apple Intelligence coming later on MacStories with the highlights of all the AI-infused features announced by Apple today.

In the meantime, here’s a recap of everything else that Apple showed today for iOS and iPadOS 18.

iOS 18

As you may expect, I want to highlight the non-AI features coming to iOS later this year, which I will then cover more in-depth in my annual review on MacStories.

Customization on the Home Screen and Control Center

Apple’s focus on user customization of the Home and Lock Screens is nothing new: starting with iOS 14 and widgets, the company has consistently found ways to deliver on the customization front with every annual iOS release. This year’s changes in iOS 18, however, promise to be some of the most substantial the Home Screen has seen since the introduction of the original iPhone in 2007.

For the first time in 17 years, users will be able to rearrange icons on the Home Screen more freely by placing them wherever they want. The Home Screen in iOS 18 will support the ability to grab icons and, say, drag them toward the bottom of the screen, creating empty rows between other icons, or even off to the side of the screen. Apple showcased this ability in the context of not letting icons hide parts of the user’s wallpaper anymore, and I’m sure people will come up with tons of creative strategies that take advantage of this. The ability to arrange content anywhere in an open space also applies to widgets.

Examples of dark mode icons (left), tinted icons (center), and custom icon layouts. Image: Apple.

However, the customization options on the Home Screen go beyond placement of icons and widgets. iOS 18 includes support for a special dark mode version of app icons, which can now take on a darker appearance to match the system’s interface. But that’s not all: through the use of a new ‘customization sheet’, users will be able to apply a specific tint to app icons and effectively recolor them. By default, iOS will suggest some color choices based on the installed wallpaper, but thanks to a built-in color picker, users can pick any color they want. There is also an option to make icons larger for those who prefer a denser layout; larger icons won’t display text labels underneath them at all. Right now, it appears that icon re-coloring is enforced system-wide and there’s no way for developers to opt out of it.

More tinted icons (left) and larger app icons without text labels. Image: Apple.

This year, customization has extended to Control Center too, which is undergoing its first redesign since iOS 11. In a move that is reminiscent of iOS 10, Control Center is going back to a multi-page design, with the important distinction that, unlike iOS 10, you don’t have to use multiple pages and the primary screen of Control Center is also fully customizable.

Control Center’s main page (left) and the HomeKit control group. Image: Apple.

By default, controls in the new Control Center look more rounded than before and there’s a new look for the Now Playing widget. There are two important differences in this screen compared to iOS 17, though: the plus button in the top left corner, and the scrollable list of pages on the right.

On the right side of the screen, you’ll be able to move across new groups of controls dedicated to music and audio, HomeKit, and wireless radios. Apple seems to have learned the lesson from iOS 10 and built in a fast method to instantly get to a specific page: keep swiping when invoking Control Center, and you’ll be able to directly land on a specific sub-page.

Music Controls (left) and the new Controls gallery. Image: Apple.

Furthermore, the aforementioned ‘+’ button points to another major addition to Control Center: customization. For the first time, you can customize controls in Control Center from Control Center itself by dragging them around, like icons on the Home Screen. You can even resize certain controls and make them bigger directly from Control Center. And there’s more: at long last, there’s a Controls API that will allow third-party apps to add their own controls to Control Center. Much like widgets, these controls will be available in a ‘Controls Gallery’, which will allow you to choose which controls you’d like to add from your favorite apps.

We know how Apple likes to reuse and extend its previous technologies, so it’s no surprise that, in developing the new Control Center and Controls API, they were able to tap into other areas of the OS. The same Controls API will allow you to finally swap the Camera and Flashlight toggles on the Lock Screen for something else, such as a button to launch another camera app. And, all the controls marked as available via the Controls API will also be assignable to the Action button in iOS 18.

As a big fan of user customization, I’m thrilled to see the expansion of Home Screen icon and widget customization options, and the renewed capabilities of Control Center are long overdue but welcome nonetheless. From what I’ve seen so far, I’m liking the consistency between the Home/Lock Screen widget gallery and the Controls gallery, and I’m pleased that Apple was able to integrate the same controls with the Lock Screen and Action button, too.

Privacy Enhancements: Lock and Hide Your Apps

Privacy enhancements are a staple of Apple’s annual OS cycle, and this year’s updates are no different.

First and foremost, iOS 18 now lets you lock any app you want with Face ID, Touch ID, or your passcode. This means that every time you want to open an app that you manually locked, you’ll have to authenticate. In addition, if an app is locked, its content won’t show up in places like Spotlight search or notifications.

The hidden folder in the App Library (left), the new prompt to hide apps (center), and the ability to lock apps. Image: Apple.

Besides locking apps, you can also hide them from view. If you have a certain type of app on your device that you’d rather not show, you can hide it from the Home Screen. Hidden apps go into a hidden, locked folder in the App Library, similar to how hidden photos go into a locked ‘Hidden’ album in the Photos app. I’m sure a lot of people will be taking advantage of this option for, well, you know what kind of apps.

The new accessory setup screen (left) and contacts picker. Image: Apple.

Lastly in the realm of privacy-related changes, Apple showed off a redesigned contact picker to grant apps access to specific contacts only, as well as a new privacy screen for apps requesting network access and a new prompt for setting up third-party accessories.

Apps

Traditionally, the Apps chapter of my annual iOS reviews is always the largest one. This year will be no different thanks to various improvements to the plethora of system apps built by Apple and the arrival of Apple Intelligence-based features in many of them. Let’s take a look at what Apple announced today.

Mail

Later this year (read: maybe not in September?), the Mail app will receive support for Gmail-like categorization of your inbox into different sections. Using on-device intelligence, Mail will analyze your messages and automatically sort them into Transactions, Updates (social media and newsletters), and Promotions. Emails can be re-categorized and archived all at once; there will also be a ‘digest’ view that shows you all emails from a specific business, like all updates from an airline, for example.

Mail categories and the digest view (right). Image: Apple.

With Apple Intelligence (which we’ll cover in a standalone story), Mail will receive support for email summarization as well as generative features in the compose field to rewrite and shorten messages.

Messages

In addition to the previously-announced (and briefly shown in slides) support for RCS, the Messages app is getting some notable additions in iOS 18.

For starters, Tapback reactions have been redesigned to be colorful and expand beyond the original set of six reactions. In iOS 18 you’ll be able to add any emoji Tapback you want – and those are real Tapbacks, not the odd sticker-based ones Apple rolled out last year. You’ll be able to choose any emoji you want from a picker and send it as a Tapback reaction. Finally.

Following in Mail’s footsteps, Messages will gain the ability to schedule texts to send later. This will be done through a specific ‘Send Later’ app in the iMessage app picker, which lets you choose a date and time when you want a message to be sent to somebody. While a message is scheduled, you can edit it, cancel it, or send it immediately.

Scheduled messages (left), redesigned Tapbacks (center), and text formatting. Image: Apple.

The Messages app will also support formatting for selected text with options for bold, italic, underline, and strikethrough. Additionally, a selection of eight text effects will let you add a little flair to words or even emoji with a different style than full-screen effects, which were available before. For example, you can make your text jitter, nod, or explode. Because why not.

Apple also announced that iOS’ Satellite capabilities will grow in iOS 18 to accommodate sending iMessages as well. So even if you’re off the grid, you’ll be able to send messages with the Messages app with end-to-end encryption, SMS fallback, and support for text, emoji, and Tapbacks.

Passwords

Our long national nightmare is over: Apple is launching a dedicated Passwords app later this year. I’ve long argued that passwords deserved to be moved out of Settings to blossom into a standalone, pre-installed app, and I know I wasn’t alone.

The new Passwords app. Image: Apple.

I’m glad Apple listened and decided to turn their excellent, built-in iCloud Keychain into a full-blown Passwords app that will be available across iOS, iPadOS, macOS, and Windows. The Passwords app will be able to store logins, passkeys, and even Wi-Fi credentials.

From a first look, it appears that Passwords now supports multiple website URLs for the same login, but not secure file attachments, which is one of the reasons some people still prefer using a third-party password manager like 1Password or Bitwarden.

Photos

The Photos app is getting a big redesign in iOS 18 that is surely going to take some getting used to. The new design revolves around a single-page UI that eschews a tab bar in favor of a split-screen approach, with your grid of photos shown at the top, followed by a series of collections that encompass traditional albums, previous categories such as ‘People and Pets’ and Memories, and new sections such as Trips and Recent Days.

The best way to think about this redesign – which I’m sure will be debated a lot this summer – is that everything can now be considered a “collection” that you can pin for quick access to the top of the Photos UI. The top of the interface is still taken up by the regular photo grid (which you can more easily filter for content now), but that part can also be scrolled horizontally to swipe between the grid and other collections. For example, you can swipe from the grid of recents to, say, featured photos, your favorites, or any other collections you want to pin there.

The redesigned Photos app. Image: Apple.

Below the grid, you’ll see a new ‘Recent Days’ section, which organizes photos by day with clutter such as receipts and scanned documents filtered out by default. There’s a new Trips view that collects all of your past trips, and everything can be reordered or pinned (in two separate sections?) for quick access.

It’s a lot to take in at once, and this new design can be quite daunting at first. I understand that Apple wants to try a unified design for the Photos app to put a stronger emphasis on rediscovering memories, but I wonder if maybe packing too much information all at once on-screen could be disorienting for less proficient users. The new Photos design almost feels like an exercise in showing off what Apple can build with SwiftUI just because they can; time will tell if users will also appreciate that.

Reminders

Apple’s task manager is getting some fascinating improvements in iOS 18.

For the first time, you’ll be able to integrate Reminders with the Calendar app and see your due tasks appear in the calendar too, where you can mark them as complete or even edit them with a Reminders UI embedded within the Calendar app. If you’re a hyper-scheduler, you’re going to love this feature.

According to Apple, you’ll also be able to see your subtasks for a reminder while in the Today and Scheduled views (and other smart lists); there will be some unspecified new Shortcuts actions for Reminders; more languages will be supported for automatic categorization in grocery lists; and you’ll have the option to order sections in the Today list.

And More

Here’s a quick rundown of other changes coming to iOS 18. Obviously, we’ll cover even more changes when iOS 18 launches in public beta next month.

Game Mode

Previously seen on macOS, Game Mode will become available on iOS too. This mode minimizes background activity while playing videogames, which helps boost a game’s performance and frame rate. It also improves latency with connected AirPods and Bluetooth-based game controllers.

Maps

The Maps app will see the addition of topographic maps with the ability to create custom walking and hiking routes.

Journal

While it’s still not coming to iPad (which is a bummer), the Journal app is being updated in iOS 18 with the ability to log your state of mind (previously seen in Health and watchOS) and a new stats view that shows you patterns and streaks based on your journaling habits. Logging your state of mind will be integrated directly into the keyboard with a specific mode for the Journal app, while stats – officially dubbed ‘Insights’ – will be available both as a summary at the top of the main Journal page as well as in a standalone section.

In addition, Journal will feature a new search mode to find past entries, audio transcription, and interactive widgets to start a new entry with prompt suggestions you can cycle through directly from the Home Screen.

iPadOS 18

As I feared, iPadOS 18 is not a meaningful update for iPad users who hoped Apple would fill some of the longstanding platform gaps between the Mac and iPad. With no Stage Manager improvements, no changes to audio routing, and seemingly very little happening in Shortcuts in terms of new actions (for now), it’s hard to be excited about iPadOS 18. Sadly, everything I wrote last month in my article about iPadOS still stands today.

iPadOS 18 can be boiled down to four things:

  • Feature parity with iOS’ Home Screen and Control Center customization features (at least we don’t have to wait until next year for these).
  • A new look for tab bars.
  • Calculator app for iPad.
  • New Apple Pencil features.
Home Screen customization on iPad. Image: Apple.

I’ve already covered the customization features, so let’s focus on the iPad-specific stuff.

Apple showed off a redesigned look for tab bars on iPad, which instead of being docked at the bottom of the screen are now floating at the top. Supposedly, these tab bars should help apps’ content better extend from edge to edge and thus help them take advantage of the iPad’s form factor more. Tab bars can elegantly morph into vertical sidebars, and they also support customization of items displayed in them, which is a nice comeback for a feature that was last seen many years ago on iOS.

Then there’s the Calculator app. After several years of waiting, at the very least Apple found a way to “productize” Calculator for iPadOS and make it, well, more than a basic calculator. In addition to the iPhone-inspired design, Calculator supports a history view, unit conversions, and – the big feature – Math Notes. This is a feature that’s based on drawing and the Apple Pencil, and it’s best considered as Freeform meets AI meets Calculator. As you write your expressions, Math Notes will learn your handwriting style, and as soon as you write an equal sign, it’ll insert the result for you in a style that resembles your own handwriting. You can save Math Notes to revisit them later; Math Notes even support declaring variables inline and turning equations into inline graphs.

Even press assets for iPadOS 18 are too few compared to iOS 18. This is the only Math Notes image I could find. Source: Apple.

Math Notes are part of a larger effort to extend the Apple Pencil’s productivity features, which are, in my opinion, the most exciting part of iPadOS 18. When screen-sharing with someone over SharePlay, you can now remotely draw on-screen to show instructions (and even request remote access to control someone’s device and help them with something). In Notes, a new Smart Script feature uses an on-device model to recreate your handwriting style and refine it as you go; in theory, this will smooth out your handwriting, straighten it, and make it more legible. This even supports spell-checking and applying your handwriting style to text you paste from other apps.

And…yeah, that’s about it for the improvements in iPadOS 18. If this section is any indication, the iPadOS 18 chapter of my review will be a very short one this year.

iOS and iPadOS 18

It’s only been a few hours since Apple wrapped up its WWDC keynote, and I would sum up my thoughts and first impressions on iOS and iPadOS 18 thusly:

  • Unsurprisingly, pro features for iPadOS users are nowhere to be seen, adding to my concerns regarding who’s in charge of this platform and what their vision for it actually is. It’s quite telling that the marquee additions to iPad this year are…a Calculator app and a redesigned tab bar.
  • Customization features in iOS and iPadOS look fantastic, and they’re a clever way to get people to upgrade. Both the Home Screen and Control Center get an immediate thumbs up from me.
  • It’s clear that Apple went from “Let’s make sure we can ship visionOS” at WWDC 2023 to “Let’s make sure we can ship AI” at WWDC 2024.

So many of the features shown by Apple today are predicated upon the arrival of Apple Intelligence, which will be available later this summer for U.S. English users only. The more intriguing parts of iOS and iPadOS are right there, with a rebuilt Siri, text analysis and summarization features, categorization for priority notifications and messages, and other generative capabilities for text, emoji, and images. But even then, other interesting abilities demonstrated today, such as Siri taking action on your behalf inside apps, won’t be available until next year.

On the surface, both iOS and iPadOS 18 look like relatively minor updates, mostly comprised of app-related features in search of an ~~author~~ AI model that is not available yet.

The vibe check for now? Disappointed about iPadOS, optimistic about Apple Intelligence and iOS 18, but mildly creeped out by generative image creation features.

Let’s circle back in July.


You can follow all of our WWDC coverage through our WWDC 2024 hub or subscribe to the dedicated WWDC 2024 RSS feed.

